Comment by Zarathruster 3 hours ago
Ah ok, gotcha.
> When you said, "consciousness can't be instantiated purely in language", I took you to mean human language
No, I definitely meant the statement to apply to any kind of language, but it seems clear that I sacrificed clarity for the sake of brevity. You're not the only one who read it that way, but yeah, we're in agreement on the substance.
I think I'm still a bit confused... so, among the languages that cannot produce understanding and consciousness, do you mean to include "machine language"? (And thus any computer language that can be compiled to machine language?)
On your interpretation, are there any sorts of computation that Searle believes would potentially allow consciousness?
ETA: The other issue I have is with this whole idea that "understanding requires semantics, and semantics requires consciousness". If you want to say that LLMs don't "understand" in that sense, because they're not conscious, I'm fine with that as long as you limit it to technical philosophical jargon. In plain English, in a practical sense, it's obvious to me that LLMs understand quite a lot -- at least, I haven't found a better word to describe LLMs' relationship with concepts.