Comment by anal_reactor 2 days ago

> The anthropomorphization of LLMs is getting off the charts.

What's wrong with that? If it quacks like a duck... By that logic, a duck is just a complex pile of organic chemistry, so ducks aren't real because the concept of "a duck" is wrong.

I honestly believe there is a degree of sentience in LLMs. Sure, they're not sentient in the human sense, but if you define sentience as whatever humans have, then of course no other entity can be sentient.

noosphr 2 days ago

> What's wrong with that? If it quacks like a duck... By that logic, a duck is just a complex pile of organic chemistry, so ducks aren't real because the concept of "a duck" is wrong.

To simulate a biological neuron you need a ~1M-parameter neural network.

The SOTA models that we know the size of are ~650B parameters.

That's the equivalent of a roundworm.

So if it quacks like a duck, has the brain power of a roundworm, and can't walk, then it's probably not a duck.
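
A rough sketch of that arithmetic, taking the two figures above at face value (both are assumptions, and the neuron counts in the code comments are rough published estimates):

    # Neuron-equivalents of a large model, using the figures above.
    PARAMS_PER_NEURON = 1_000_000    # assumed: ~1M parameters to emulate one neuron
    MODEL_PARAMS = 650_000_000_000   # assumed: ~650B-parameter frontier model

    neuron_equivalents = MODEL_PARAMS // PARAMS_PER_NEURON
    print(f"{neuron_equivalents:,} neuron-equivalents")  # -> 650,000

    # Rough published neuron counts, for scale:
    #   C. elegans (roundworm): ~302
    #   fruit fly:              ~140,000
    #   human:                  ~86,000,000,000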

  • ffsm8 2 days ago

    You just convinced me that AGI is a lot closer than I previously thought, considering the bulk of our brain's job is controlling our bodies and responding to the stimuli from our senses - not thinking, talking, planning, coding, etc.

    • noosphr 2 days ago

      A stegosaurus managed to live using a brain the size of a walnut on top of a body the size of a large boat. The majority of our brain is doing something else.

  • anal_reactor 2 days ago

    Ok, so you're saying the technology to make AI truly sentient is there - we just need a bit more computational power or some optimization tricks. Like how raytracing wasn't possible in 1970 but is now. Neat.

    • noosphr 2 days ago

      Yes, in the same way that a human is an optimization of a roundworm.