Comment by COAGULOPATH 4 days ago
I think this works, not because LLMs have a "hallucination" dial they can turn down, but because it serves as a cue for the model to be extra-careful with its output.
Sort of like how offering to pay the LLM $5 improves its output. The LLM takes your prompt seriously, but not literally.
It could also mean the model has some internal weight corresponding to "hallucination" that also drives more diverse stories. Ask an LLM what hallucination is, ask it to write a story involving one, and so on. Without zeroing things out entirely, everything has, and can have, some impact.