Comment by fluoridation 3 days ago

>I haven't heard anything about biological systems doing something comparable to backpropagation

The brain isn't organized into layers like ANNs are. It's a general graph of neurons, and cycles are probably common.

HarHarVeryFunny 3 days ago

Actually that's not true. Our neocortex - the "crumpled up" outer layer of our brain, which is basically responsible for cognition/intelligence - has a highly regular architecture. If you uncrumpled it, it'd be a thin sheet of neurons about the size of a tea towel, consisting of six layers of different types of neurons with a specific inter-layer and intra-layer pattern of connections. It's not a general graph at all, but rather a specific processing architecture.

  • fluoridation 2 days ago

    None of what you've said contradicts that it's a general graph instead of, say, a DAG. It doesn't rule out cycles, either within a single layer or across multiple layers. And even if it did, the brain is not just the neocortex, and the neocortex isn't isolated from the rest of the topology.

    • HarHarVeryFunny 2 days ago

      It's a specific architecture. Of course there are (massive amounts of) feedback paths, since that's how we learn - top-down prediction and bottom-up sensory input. There is of course looping too - e.g. the thalamo-cortical loop - we are not just a pass-through reactionary LLM!

      Yes, there is a lot more structure to the brain than just the neocortex - there are all the other major components (thalamus, hippocampus, etc.), each with their own internal architecture, and then specific patterns of interconnect between them...

      This all reinforces what I am saying - the brain is not just some random graph - it is a highly specific architecture.

      • fluoridation 2 days ago

        Did I say "random graph", or did I say "general graph"?

        >There is of course looping too - e.g. the thalamo-cortical loop - we are not just a pass-through reactionary LLM!

        Uh-huh. But I was responding to a comment about how the brain doesn't do something analogous to back-propagation. It's starting to sound like you've contradicted me to agree with me.

        • HarHarVeryFunny 2 days ago

          I didn't say anything about backpropagation, but if you want to talk about that, then it depends on how "analogous" you want to consider ...

          It seems very widely accepted that the neocortex is a prediction machine that learns by updating itself when top-down predictions fail to match bottom-up sensory input, and with multiple layers (cortical patches) of pattern learning and prediction, there necessarily has to be some "propagation" of prediction-error feedback from one layer to another, so that all layers can learn.
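
          As a toy sketch of what I mean (hypothetical Python - all sizes and names invented for illustration, in the style of predictive coding, not a claim about actual cortical wiring): each level predicts the activity of the level below, and only the mismatch travels upward.

          ```python
          import numpy as np

          # Hypothetical two-level predictive hierarchy: each level
          # predicts the level below; only the prediction error
          # (expected vs detected) is propagated upward.
          rng = np.random.default_rng(0)
          sensory = rng.normal(size=16)         # bottom-up input
          r1, r2 = np.zeros(8), np.zeros(4)     # per-level beliefs
          W1 = 0.1 * rng.normal(size=(8, 16))   # level 1 -> input predictions
          W2 = 0.1 * rng.normal(size=(4, 8))    # level 2 -> level-1 predictions
          lr = 0.05

          for _ in range(200):
              e0 = sensory - W1.T @ r1          # error at the sensory level
              e1 = r1 - W2.T @ r2               # error at level 1
              # Updates use only local error signals: no global loss,
              # no exact end-to-end gradient.
              r1 += lr * (W1 @ e0 - e1)
              r2 += lr * (W2 @ e1)
              W1 += lr * np.outer(r1, e0)
              W2 += lr * np.outer(r2, e1)

          print(np.linalg.norm(sensory - W1.T @ r1))  # residual error shrinks
          ```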

          Now, does the brain learn in a way directly equivalent to backprop, in terms of using exact error gradients or a single error function? No - presumably not. It more likely works in a layered fashion, with each higher level providing error feedback to the layer below, and with that feedback likely just being what was expected vs what was detected (i.e. not a gradient - essentially just a difference). Of course gradients are more efficient in terms of selecting varying update step sizes, but directional feedback would work fine too. It would also not be surprising if evolution has stumbled upon something similar to Bayesian updates in terms of how to optimally and incrementally update beliefs (predictions) based on conflicting evidence.
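
          To make the gradient-vs-directional distinction concrete, here's a minimal one-weight toy (hypothetical, not a model of anything biological): the backprop-style rule scales each step by the exact error gradient, while the directional rule feeds back only the sign of expected vs detected.

          ```python
          import numpy as np

          # One weight, one input, one target: compare exact-gradient
          # updates with purely directional (sign-only) updates.
          target, x, lr = 2.0, 1.5, 0.1
          w_grad = w_sign = 0.0

          for _ in range(100):
              # Backprop-style: step size scales with the exact gradient
              err = w_grad * x - target
              w_grad -= lr * err * x            # d(0.5*err^2)/dw = err * x

              # Directional: fixed-size step in the right direction
              diff = w_sign * x - target
              w_sign -= lr * np.sign(diff)

          print(w_grad * x, w_sign * x)         # both end up near the target
          ```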

          So, that's an informed guess of how our brain is learning - up to you whether you want to regard that as analogous to backprop or not.