Comment by devmor a day ago

> I wonder if perhaps a part of LLM hallucinations can be explained by them being provided such reporting and having it (mistakenly) tagged as high-quality training data.

It's probably (haha) far more a function of temperature than of training data. If the corpus is large enough relative to your prompt and you turn the temperature all the way down, you get almost no hallucinations. What you have then is essentially a search engine.
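
A minimal sketch of what "turning the temperature all the way down" means mechanically, assuming the usual temperature-scaled softmax sampling (the logits and values here are illustrative, not from any specific model):

```python
import numpy as np

def sample_token(logits, temperature=1.0):
    """Sample a token id from logits with temperature scaling.

    As temperature -> 0 this collapses to greedy decoding (argmax),
    so the model always emits its single most-probable token.
    """
    if temperature <= 1e-6:
        return int(np.argmax(logits))         # "temperature all the way down"
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())     # numerically stable softmax
    probs /= probs.sum()
    rng = np.random.default_rng()
    return int(rng.choice(len(logits), p=probs))

# Toy next-token distribution: token 2 is the model's best guess.
logits = np.array([1.0, 2.0, 4.0, 0.5])
print(sample_token(logits, temperature=1.0))  # may pick a lower-probability token
print(sample_token(logits, temperature=0.0))  # always 2: deterministic retrieval
```

At temperature 0 the model deterministically returns its highest-probability continuation, which is the "search engine" behavior described above; higher temperatures flatten the distribution and let lower-probability tokens through.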