Comment by intalentive 9 hours ago
Generative world models seem to be doing ok. Dreamer V4 looks promising. I’m not 100% sold on the necessity of EBMs.
Also I’m skeptical that self-supervised learning is sufficient for human level learning. Some of our ability is innate. I don’t believe it’s possible for statistical methods to learn language from raw audiovisual data the way children can.
Human DNA holds under 1 GB of information, most of which isn't even expressed in the brain. And the brain has no mechanism for reading data out of DNA efficiently.
This puts a severe limit on how much "innate knowledge" a human can possibly have.
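As a sanity check on the "under 1 GB" figure, here is a rough back-of-envelope calculation, assuming a haploid human genome of about 3.1 billion base pairs and at most 2 bits per base (four possible bases); the real information content is lower still, since genomes are highly compressible:

```python
# Back-of-envelope estimate of the raw information content of human DNA.
# Assumptions: ~3.1 billion base pairs (haploid genome), 2 bits per base
# (A/C/G/T), no compression. Actual functional content is far smaller.
BASE_PAIRS = 3.1e9
BITS_PER_BASE = 2

total_bits = BASE_PAIRS * BITS_PER_BASE
total_gb = total_bits / 8 / 1e9  # bits -> bytes -> gigabytes

print(f"~{total_gb:.2f} GB")  # well under 1 GB, before any compression
```

Even this uncompressed upper bound sits below 1 GB, which is the point of the argument: whatever innate structure the brain has must fit inside that budget.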
Sure, the human brain has a strong inductive bias. It also has a developmental plan, and it follows that plan: it guides its own learning and ends up better at self-supervised learning than even the very best of our AIs. But that guidance, that sequencing, and that bias must all be created by rules encoded in the DNA, and there's only so much data in the DNA.
It's quite possible that the human brain has a bunch of simple, clever learning tricks that, if we pried them out and applied them to our AIs, would give us 100× the learning rate and 1000× the sample efficiency. Or it could be that a single neuron in the human brain is worth 10,000 neurons in an artificial neural network, and thus the biggest part of the human brain's "secret" is just that it's hilariously overparameterized.