Comment by jbsimpson 3 days ago 0 replies
It's a fundamental limitation of LLMs - don't use them if you're worried about hallucinations.