Comment by esafak 2 days ago
Interesting stuff. I don't have time to read a dissertation, so I skimmed his latest paper instead: Why Is Anything Conscious? https://arxiv.org/abs/2409.14545
In it he proposes a five-stage hierarchy of consciousness:
0 : Inert (e.g. a rock)
1 : Hard Coded (e.g. protozoan)
2 : Learning (e.g. nematode)
3 : First Order Self (e.g. housefly). Where phenomenal consciousness, or subjective experience, begins. https://en.wikipedia.org/wiki/Consciousness#Types
4 : Second Order Selves (e.g. cat). Where access consciousness begins. Theory of mind. Self-awareness. Inner narrative. Anticipating the reactions of predator or prey, or navigating a social hierarchy.
5 : Third Order Selves (e.g. human). The ability to model the internal dialogues of others.
The paper claims to dissolve the hard problem of consciousness (https://en.wikipedia.org/wiki/Hard_problem_of_consciousness) by reversing the traditional approach. Instead of starting with abstract mental states, it begins with the embodied biological organism. The authors argue that understanding consciousness requires focusing on how organisms self-organize to interpret sensory information based on valence (https://en.wikipedia.org/wiki/Valence_(psychology)).
The claim is that phenomenal consciousness is fundamentally functional, making the existence of philosophical zombies (entities that behave like conscious beings but lack subjective experience) impossible.
The paper does not seem to elaborate on how to assess which stage an organism belongs to, or to what degree. That is the more interesting question to me. One approach is IIT: http://www.scholarpedia.org/article/Integrated_information_t...
The author's web site: https://michaeltimothybennett.com/
> The claim is that phenomenal consciousness is fundamentally functional, making the existence of philosophical zombies (entities that behave like conscious beings but lack subjective experience) impossible.
This doesn't really address the hard problem; it just asserts that the hard problem doesn't exist. The meat of the problem is that subjective experience exists at all, even though in principle there's no clear reason why it should need to.
Simply declaring it functional is begging the question.
For example, we can imagine a hypothetical robot that removes its hand from a stove if its sensors determine that the surface is too hot. We don't need subjective experience to explain how a system like that could be designed, so why do we need it for an organism?
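The robot in that example can be sketched in a few lines of Python (a hypothetical illustration; the sensor function, threshold, and command names are all invented for the sake of the argument, not taken from the paper):

```python
# A purely reactive controller for the stove-touching robot: a sensor
# reading is mapped directly to a motor command. Every step is an
# ordinary function call; nothing in the design appeals to subjective
# experience, which is the point of the example.

MAX_SAFE_TEMP_C = 50.0  # assumed safety threshold, for illustration

def read_sensor(surface_temp_c: float) -> float:
    """Stand-in for a hardware temperature sensor."""
    return surface_temp_c

def control_step(surface_temp_c: float) -> str:
    """Map the current sensor reading to a motor command."""
    if read_sensor(surface_temp_c) > MAX_SAFE_TEMP_C:
        return "withdraw_hand"
    return "hold_position"

print(control_step(220.0))  # hot stove surface
print(control_step(22.0))   # room-temperature surface
```

The sketch behaves exactly like the conscious-seeming organism in the relevant respect (it avoids the hot surface), which is what makes the zombie question pressing.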