Comment by chankstein38 · 3 days ago
Does anyone have more info on this? They thank Azure at the top, so I'm assuming it's a flavor of GPT? How do they prevent hallucinations? I'm always cautious about asking an LLM for facts, because half the time it feels like it just makes things up. So I'm curious whether they addressed that here, or whether this is just poorly thought out.
Here's the paper: https://arxiv.org/abs/2402.14207