Comment by adleyjulian 2 days ago
> LLMs get over-analyzed. They’re predictive text models trained to match patterns in their data, statistical algorithms, not brains, not systems with “psychology” in any human sense.
Per the predictive processing theory of mind, human brains are similarly predictive machines. "Psychology" is an emergent property.
I think it's overly dismissive to point out that the fundamentals are simple, i.e. that it's a token prediction algorithm, when it's clearly the unexpected emergent properties of LLMs that everyone is interested in.
The fact that a theory exists does not mean that it is not garbage.