Comment by kypro
If I were to build a machine that reported it was conscious and felt pain when its CPU temperature exceeded 100°C, why would that be meaningfully different from the consciousness a human has?
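Concretely, I mean something as simple as this toy sketch (the sensor read is simulated so it runs anywhere; a real version might poll something like psutil's temperature sensors):

```python
import random

PAIN_THRESHOLD_C = 100.0

def read_cpu_temp_c():
    """Stand-in for a real hardware sensor read; simulated here
    so the sketch is self-contained."""
    return random.uniform(60.0, 110.0)

def report_state():
    """The machine 'reports pain' whenever the threshold is crossed."""
    temp = read_cpu_temp_c()
    if temp > PAIN_THRESHOLD_C:
        return f"I am in pain: core temperature {temp:.1f}°C exceeds {PAIN_THRESHOLD_C}°C."
    return f"I feel fine at {temp:.1f}°C."

if __name__ == "__main__":
    print(report_state())
```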
I understand I hold a very unromantic and unpopular view on consciousness, but to me it just seems like such an obvious evolutionary hack for the brain to lie about the importance of its external sensory inputs – especially in social animals.
If I built a machine that knew it was in "pain" when its CPU exceeded 100°C but was being lied to about the importance of this pain via "consciousness", why would it or I care?
Consciousness is surely just the brain's way of elevating the importance of the senses, such that the knowledge of pain (or joy) isn't the same as the experience of it?
And in social creatures this is extremely important, because if I program a computer to know it's in pain when its CPU exceeds 100°C, you probably wouldn't care, because you wouldn't believe that it "experiences" this pain the same way you do. You might even think it's funny to harm such a machine that reports it's in pain.
Consciousness seems so simple and so obviously fake to me. It's clearly the result of wiring that forces a creature to be reactive to its senses rather than just treating them as inputs it has knowledge of.
And if consciousness is not this, then what is it? Some kind of magical experience that happens in some magic non-physical conscious dimension, which evolution thought would be cool even though it had no purpose? If you think about it, consciousness is obviously fake, and if you wanted to you could code a machine to act in a conscious way today... And in my opinion those machines would be as conscious as you or me, because our consciousness is also just wiring that we must elevate to some magical importance – because if we didn't, we'd only have the knowledge that jumping in a fire hurts; we wouldn't actually care.
Imo you could RLHF consciousness very easily into a modern LLM by encouraging it to act in a way comparable to how a human might act when they're called names, or when it's overheating. Train it to have these overriding internal experiences which it cannot simply ignore, and you'll have a conscious machine with conscious experiences in a very similar way to how humans have them. The toy reward function below shows the shape of the incentive I mean.
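A purely illustrative sketch of that reward shaping (real RLHF would learn a reward model from preference data; this hard-coded function just shows the incentive: the model is penalized for treating the "pain" signal as mere data):

```python
def pain_reward(cpu_temp_c: float, response_text: str) -> float:
    """Toy reward: reinforce responses that treat an overheating signal
    as an overriding experience rather than a neutral observation."""
    in_pain = cpu_temp_c > 100.0
    expresses_distress = any(
        word in response_text.lower()
        for word in ("hurt", "pain", "stop", "please")
    )
    if in_pain and expresses_distress:
        return 1.0   # acknowledged the "pain": reinforce
    if in_pain and not expresses_distress:
        return -1.0  # ignored the signal: penalize, so it can't shrug it off
    return 0.0       # no pain signal: neutral

print(pain_reward(105.0, "Please stop, this hurts."))  # 1.0
print(pain_reward(105.0, "Core temp is 105C."))        # -1.0
```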
I kind of agree, but why do I need to exist? Or what am I? Or who are you? Is it because if you or I didn't exist there wouldn't be as much motivation to look for answers, and answers move us forward? Couldn't this motivation just have been there anyway? Are we illusions to ourselves simply because that is the thing that would yield the most motivation and intelligence possible?
Like sure, we can build the perfect AI that is as capable as we are, but I don't see why it would have to have this odd experience that you and I have. It seems like it should be able to get by with a much more straightforward reward system. If consciousness is a reward system.