Comment by traverseda 11 hours ago

LLMs are deterministic. So far every vendor adds random noise on top of your prompt, though. They don't have free will or a soul or anything; feed them exactly the same tokens and exactly the same tokens will come out.
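
A minimal sketch of the "same tokens in, same tokens out" claim, assuming a local Hugging Face model (the "gpt2" checkpoint is just an illustrative choice): with greedy decoding and no sampling, the output is a pure function of the input token IDs.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    inputs = tokenizer("The quick brown fox", return_tensors="pt")

    # Greedy decoding: always pick the argmax token, so no sampling noise is involved.
    out1 = model.generate(**inputs, do_sample=False, max_new_tokens=20)
    out2 = model.generate(**inputs, do_sample=False, max_new_tokens=20)

    # Same input tokens, same output tokens, run after run (on the same hardware and stack).
    assert out1.tolist() == out2.tolist()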

mmoskal 4 hours ago

If you change one letter in the prompt, however insignificant you may think it is, it will change the results in unpredictable ways, even at temperature 0. The same is not true of renaming a variable in a programming language, most refactorings, and so on.
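
One way to see why even a trivial edit reaches the model as a genuinely different input: the token sequence changes, so every downstream logit can change too. A small sketch using tiktoken's cl100k_base encoding; the prompts are just examples.

    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")

    a = enc.encode("Please summarize the report.")
    b = enc.encode("Please summarise the report.")  # one-letter spelling change

    print(a)
    print(b)
    print(a == b)  # False: the model sees two distinct token sequences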

codr7 30 minutes ago

That's not how they are being used though, is it?

jnwatson 9 hours ago

Only if you set temperature to 0 or have some way to set the random seed.
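
For a hosted API, pinning those knobs looks roughly like the sketch below, using OpenAI's Python client as one example; the model name is illustrative, and `seed` is documented as best-effort, so this reduces variance rather than guaranteeing bit-identical output.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": "Name three prime numbers."}],
        temperature=0,        # no sampling temperature
        seed=42,              # best-effort reproducibility across requests
    )

    print(resp.choices[0].message.content)
    print(resp.system_fingerprint)  # if this changes, the backend changed and output may too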

  • vlovich123 8 hours ago

    Locally that’s possible, but for multi-tenant services I think there are other challenges related to batch processing (not necessarily the random seed, but other sources of non-determinism).
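
    A tiny illustration of one such source: float32 addition is not associative, so when a serving stack batches your request with other tenants' requests (changing shapes and reduction order), logits can shift slightly and occasionally flip an argmax. The numbers below are arbitrary.

        import numpy as np

        x = np.float32(1e8)
        y = np.float32(-1e8)
        z = np.float32(0.1)

        # Same values, different grouping, different float32 results.
        print((x + y) + z)  # 0.1
        print(x + (y + z))  # 0.0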