jbsimpson 3 days ago

It's a fundamental limitation of LLMs: don't use them if you're worried about hallucinations.