Comment by yunwal
> It doesn't matter to the end user if hallucinations are an unchangeable limitation
Of course it does. I don’t go around complaining that my stove burns me when I touch it. Anyone who knows anything about LLMs at this point knows not to do anything mission-critical with them, and that’s a good thing.