Comment by sfn42 a day ago

> we do not have a way to scientifically rule out the possibility that LLMs aren't potentially self-aware/conscious entities of their own

That may be. We also don't have a way to scientifically rule out the possibility that a teapot is orbiting Pluto.

Just because you can't disprove something doesn't make it plausible.

intull 20 hours ago

Is this what we are reduced to now, snapping back with a wannabe-witty remark just because you don't like how an idea sounds? Have we completely forgotten and given up on good-faith scientific discourse? Even on HN?

  • sfn42 17 hours ago

    I'm happy to participate in good-faith discourse, but honestly, the idea that LLMs are conscious is ridiculous.

    We are talking about a computer program. It does nothing until it is invoked with an input, and then it produces a deterministic output unless a random component is deliberately injected to break that determinism (a sketch of this follows below).

    That's all it does. It does not live a life of its own between invocations. It does not have a will of its own. Of course it isn't conscious, lol. How could anyone possibly believe it is? It's an illusion. Don't be fooled.
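
A minimal sketch of the determinism point above, assuming a hypothetical toy scoring function in place of a real LLM forward pass (VOCAB, toy_logits, greedy, and sample are illustrative names, not any real library's API): greedy decoding maps the same input to the same output every time, and variation appears only when a random sampler is deliberately supplied.

```python
import math
import random

VOCAB = ["the", "cat", "sat", "mat"]

def toy_logits(prompt: str) -> list[float]:
    """Stand-in for a model forward pass: the same prompt always yields the same scores."""
    return [float((sum(map(ord, prompt)) * (i + 3)) % 11) for i in range(len(VOCAB))]

def greedy(prompt: str) -> str:
    """Deterministic decoding: always pick the highest-scoring token."""
    scores = toy_logits(prompt)
    return VOCAB[scores.index(max(scores))]

def sample(prompt: str, temperature: float, rng: random.Random) -> str:
    """Stochastic decoding: the injected RNG is the only source of non-determinism."""
    scores = [s / temperature for s in toy_logits(prompt)]
    peak = max(scores)
    weights = [math.exp(s - peak) for s in scores]  # softmax numerators; choices() normalizes
    return rng.choices(VOCAB, weights=weights, k=1)[0]

if __name__ == "__main__":
    # Same input, same output, every run: no hidden state between invocations.
    print(greedy("the cat"), greedy("the cat"))
    # Outputs vary only because we deliberately feed in randomness.
    rng = random.Random()
    print(sample("the cat", 1.0, rng), sample("the cat", 1.0, rng))
```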