Comment by WesleyJohnson 3 months ago
LLMs hallucinate and often provide incorrect answers. They're a fabulous tool if you're not necessarily looking for a specific, correct answer. But I'm not sure I would want my kids to use them as a tutor without someone to vet the output.
That's a very good concern to have. Grounding[0] helps a lot with this and will continue to improve. I'll also add that I've had human teachers who were confidently wrong about things.
[0] https://deepmind.google/discover/blog/facts-grounding-a-new-...