Comment by anal_reactor 2 days ago
> The anthropomorphization of LLMs is getting off the charts.
What's wrong with that? If it quacks like a duck... By that logic a duck is just a complex pile of organic chemistry, so ducks aren't real either, because the concept of "a duck" is wrong.
I honestly believe there is a degree of sentience in LLMs. Sure, they're not sentient in the human sense, but if you define sentience as whatever humans have, then of course no other entity can be sentient.
>What's wrong with that? If it quacks like a duck... By that logic a duck is just a complex pile of organic chemistry, so ducks aren't real either, because the concept of "a duck" is wrong.
To simulate a single biological neuron, you need a neural network with roughly 1M parameters.
The largest SOTA models with publicly known sizes are ~650B parameters.
At ~1M parameters per neuron, that works out to ~650,000 neuron-equivalents: well past a roundworm (302 neurons), roughly a fruit fly (~140,000), and nowhere near a duck.
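A back-of-envelope check of that arithmetic (a minimal sketch; every figure here is a rough, order-of-magnitude assumption, not a measurement):

```python
# Rough neuron-equivalents of a large LLM. All constants are
# order-of-magnitude assumptions, not measured values.
PARAMS_PER_NEURON = 1_000_000        # ~1M-parameter DNN to mimic one biological neuron
SOTA_MODEL_PARAMS = 650_000_000_000  # ~650B, the largest publicly sized models

neuron_equivalents = SOTA_MODEL_PARAMS / PARAMS_PER_NEURON
print(f"neuron-equivalents: {neuron_equivalents:,.0f}")  # 650,000

# Approximate neuron counts for scale:
ROUNDWORM = 302       # C. elegans
FRUIT_FLY = 140_000   # Drosophila melanogaster
DUCK = 200_000_000    # rough guess for a bird brain

print(f"vs. roundworm: {neuron_equivalents / ROUNDWORM:,.0f}x")  # ~2,152x
print(f"vs. fruit fly: {neuron_equivalents / FRUIT_FLY:.1f}x")   # ~4.6x
print(f"vs. duck:      {neuron_equivalents / DUCK:.4f}x")        # ~0.003x
```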
So if it quacks like a duck, has the brain power of a fruit fly, and can't walk, then it's probably not a duck.