Comment by boogrpants 3 days ago
> I think they excel at outputting echoes of their training data that best fit (rhyme with, contextually) the prompt they were given.
Just like people who get degrees in economics or engineering and engage in such role-play for decades. They're often pretty bad at anything they are not trained on.
Similarly, if you put a single American English speaker on a team of native German speakers, you will notice that information transfer falls apart.
Very normal physical-reality things occurring in two substrates, two mediums. As if there were a shared limitation, called the rest of the universe, attempting to erode our efforts via entropy.
An LLM is a distribution over human-generated data sets. Since humans have the same incompleteness problems in society, that affords enough statistical wiggle room for LLMs to make shit up; humans do it! Look in their data!
We're massively underestimating reality's indifference to human existence.
There is no doing any better until we effectively break physics; by that I really mean come upon a game-changing discovery that informs us we had physics all wrong to begin with.
The fact that there are a lot of people around who don't think (including me at times!) does not mean LLMs doing the same are thinking.
Much like LLMs writing text like mindless middle managers, it doesn't mean they're intelligent, more that mindless middle managers aren't.