Comment by vidarh 18 hours ago

The irony of all this is that, unlike humans (for whom we have no evidence of an ability to directly introspect lower-level reasoning processes), LLMs could be given direct access to introspect their own internal state via tooling. So if we wanted to, we could make them able to understand and reason about their own thought processes at a level no human can.
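To make the idea concrete, here's a minimal sketch of what such tooling could look like: an introspection endpoint the model can call that summarizes its own activations into text it can then reason over. Everything here is hypothetical; the function names, the toy state, and the summary format are illustrative assumptions, not any real LLM API.

```python
# Hypothetical sketch: exposing a model's internal state as a tool it can call.
# No real LLM framework is assumed; the "state" below is a stand-in for
# per-layer hidden-state tensors captured during a forward pass.

def capture_activations(layers):
    """Summarize per-layer activations into a compact, text-readable report."""
    report = []
    for name, acts in layers.items():
        mean = sum(acts) / len(acts)       # average activation in the layer
        peak = max(acts, key=abs)          # largest-magnitude activation
        report.append(f"{name}: mean={mean:.3f} peak={peak:.3f}")
    return "\n".join(report)

def introspect_tool(state):
    """Tool endpoint: the model calls this to read a summary of its own state."""
    return capture_activations(state["activations"])

# Toy internal state standing in for real captured activations.
state = {"activations": {"layer_0": [0.1, -0.4, 0.9],
                         "layer_1": [2.0, -3.5, 0.5]}}
print(introspect_tool(state))
```

In a real system the summary would come from activation hooks on the actual network (and the hard part is choosing a summary the model can usefully interpret), but the loop is the same: internal state goes in, a textual report comes back as a tool result.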

But current LLMs' chain of thought is not it.