amelius | 3 days ago | 1 reply

We should demand money back if an LLM hallucinates. And they should be liable.
jbsimpson | 3 days ago

It's a fundamental limitation of LLMs - don't use them if you're worried about hallucinations.