Comment by novaRom
Consciousness is not required for efficient AI agents, but it might be useful if your agent needs self-preservation. However, even an agent without embodiment, instincts, and emotions can call its own existence into question. Any powerful agent will find a way to control its own existence.
> Any powerful agent will find a way to control its own existence.
See, I think that's not a given. To my point: I'm acknowledging the possibility that consciousness/self-determination might emerge naturally with higher levels of functionality, but also that it might be either inevitable or optional; if it's optional, we need to decide whether it's desirable.