Comment by BanditDefender 3 days ago

LLMs aren't actually able to do that, though, are they? They're simply incapable of keeping track of consistent behaviors and beliefs. I recognize that certain prompts require an LLM to do it, but as long as we're using transformers, it'll never actually work.

ACCount37 2 days ago

People just keep underestimating transformers. Big mistake. The architecture is incredibly capable.

LLMs are capable of keeping track of consistent behaviors and beliefs, and they sure try. Are they perfect at it? Certainly not. They're pretty good at it, though.