Comment by cardanome 2 days ago

My point was we can't prove that LLMs have consciousness. Yes, the reverse is also true. It is possible that we wouldn't really be able to tell if an AI gained consciousness, as that might look very different than we expect.

An important standard for any scientific theory or hypothesis is to be falsifiable. Good old Russell's teapot: we can't disprove that a teapot too small to be seen by telescopes orbits the Sun somewhere in space between the Earth and Mars. So should we assume it is true? No, the burden of proof lies on those who make the claim.

So yes, I can't disprove with 100 percent certainty that certain LLMs show signs of consciousness, but demanding that I do reverses the burden of proof. Those who claim that LLMs are capable of suffering, that they show signs of consciousness, need to deliver. If they can't, it is reasonable to assume they are full of shit.

People here accuse me of being scholastic and too philosophical, but the reverse is true. Yes, we barely know how human brains work and how consciousness evolved, but whoever doesn't see the qualitative difference between a human being and an LLM really needs to touch grass.

542354234235 a day ago

I am not saying that LLMs are conscious. What I am saying is that since we don’t really understand what gives rise to our subjective feeling of consciousness, evaluating a non-organic mind is difficult.

For instance, say we had a Westworld-type robot that perceived pain, pleasure, happiness, and sadness, and reacted accordingly. If we understood its underlying program, would we say it wasn't conscious? If we understood our own underlying programming, would we not be “really conscious”?

We say LLMs “fake” empathy or feelings, but at some point digital minds will be faking it in a complex way that involves inner “thoughts”, motivations that they “perceive” internally as positive and negative, and various other subjective experiences. It gets very squishy trying to argue that our consciousness isn't just a fake abstraction on top of an unconscious program, while a digital mind's abstractions are fake.

Tadpole9181 2 days ago

In one breath: scientific rigor is required of your opposition.

In the next breath: "anyone who disagrees with me is a loser."

> Those who claim that LLMs are capable of suffering, that they show signs of consciousness, need to deliver. If they can't, it is reasonable to assume they are full of shit.

Replace LLM with any marginalized group. Black people, Jews, etc. I can easily just use this to excuse any heinous crime I want - because you cannot prove that you aren't a philosophical zombie to me.

Defaulting to cruelty in the face of unfalsifiability is absurd.

  • suddenlybananas 2 days ago

    >Replace LLM with any marginalized group. Black people, Jews, etc. I can easily just use this to excuse any heinous crime I want - because you cannot prove that you aren't a philosophical zombie to me.

    This is such a flatly ridiculous analogy that it becomes racist itself. Maybe the bread I eat is conscious and feels pain (the ancient Manichaeans thought so!). Are you now going to refrain from eating bread in case it causes suffering? You can't prove bread doesn't feel pain; you might be "defaulting to cruelty"!

    • Tadpole9181 a day ago

      I'm sure you feel quite clever, but this isn't an actual argument; it's a blatant reductio ad absurdum.

      Does bread talk? Can it pass the Turing test? Can it produce code, fill out Excel sheets, make art, and write novel stories and poetry?

      People boiled lobsters and crabs alive because "they can't feel pain", using the GP's argument. They did - not hypothetically, did - use it to justify slavery. They still use it to cut the fins off of sharks and throw their useless bodies back into the water.

      You wouldn't say something so unconscionable about a cat or a dog or a damn duck. Yet here we are.

      • suddenlybananas a day ago

        >Does bread talk? Can it pass the Turing test? Can it produce code and fill out Excel sheets and make art and write novel stories and poetry?

        None of the animals you described can do any of those things, and yet we attribute suffering to them. A simulacrum of language is not a prerequisite for being a conscious, feeling thing.