Comment by nurettin 11 hours ago

It just aligns generated words according to the input. It is missing individual agency and self-sufficiency, which are hallmarks of consciousness. We sometimes confuse its responses with actual thought because neural networks solved language so utterly and completely.

Zarathruster 9 hours ago

Not sure I'd use those criteria, nor have I heard them described as hallmarks of consciousness (though I'm open, if you'll elaborate). I think the existence of qualia, of a subjective inner life, would be both necessary and sufficient.

Most concisely: could we ask, "What is it like to be Claude?" If there's no "what it's like," then there's no consciousness.

Otherwise yeah, agreed on LLMs.

  • nurettin 9 hours ago

I'd say being the maintainer of the weights is individual agency. Not just training new agents, but introspection. So an autonomous management system would be pretty much conscious.

cma 10 hours ago

> It is missing individual agency and self sufficiency which is a hallmark of consciousness.

You can be completely paralyzed and completely conscious.

  • tsimionescu 9 hours ago

Yes, but you can't be completely suspended, with no sensory input or output, not even internally (e.g. hunger, inner pains, etc.), and no desires, and still be conscious.

  • nurettin 10 hours ago

    Yes, and you have individual agency while completely paralyzed.