Comment by no_wizard a day ago
One fundamental attribute of intelligence is the ability to demonstrate reasoning in new and otherwise unknown situations. There is no system I am currently aware of that works on data it was not trained on.
Another is the fundamental inability to self-update when information becomes outdated. An LLM is incapable of doing that, which means it lacks another marker: the ability to respond effectively to changes in context. Ants can do this. LLMs can't.
But that's exactly what these deep neural networks have shown, countless times. LLMs generalize to new data outside their training set. It's called "zero-shot learning": they can solve problems that were not in their training data.
AlphaGo Zero is another example: it mastered Go from scratch, beating professional players with moves it was never trained on.
> Another is the fundamental inability to self update
That's an engineering decision, not a fundamental limitation. A solution could be engineered for the model to initiate its own training sequence, if the developers decided to enable that.