Comment by th0ma5
It is unethical to me to provide an accessibility tool that lies.
>That implies agency and intentionality that they do not have.
No, but the companies have agency. LLMs lie, and they only get fixed when companies are sued. Close enough.
Sure https://www.nbcnews.com/tech/tech-news/man-asked-chatgpt-cut...
Not going to go back and forth on this, as you'll inevitably try to nitpick "oh, but the chatbot didn't say to do that."
LLMs do not lie. That implies agency and intentionality that they do not have.
LLMs are approximately right. That means they're sometimes wrong, which sucks. But they can do things for which no 100% accurate tool exists, and maybe could not possibly exist. So take it or leave it.