Comment by gavinray
Two close friends of mine, math prodigies who went into ML very early (mid-2010s), were always talking to me about an algorithm that sounds similar to this:
"NEAT/HyperNEAT" (Neuroevolution of Augmented Topologies) [0]
I'm no ML practitioner, but as I understood it, the primary difference is that NEAT evolves the topology of the network, while what is described in this paper evolves the weights.
Seems like two approaches to the same problem -- one evolving the network structure, the other the weights. A toy sketch of the weight-evolution idea is below.
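
To make the distinction concrete -- this is my own toy sketch, not the paper's actual method -- "evolving the weights" means the architecture stays fixed and only its parameters get mutated and selected, something like a simple (1+lambda) evolution strategy. NEAT would additionally mutate the graph itself, adding nodes and connections over generations.

    import numpy as np

    # Toy illustration: evolve the *weights* of a fixed-topology network.
    # NEAT, by contrast, would also mutate the topology (add nodes/edges).
    rng = np.random.default_rng(0)

    def forward(weights, x):
        """Fixed 2-layer network: the topology never changes, only the weights."""
        w1, w2 = weights
        return np.tanh(x @ w1) @ w2

    def fitness(weights, xs, ys):
        """Negative mean squared error on a toy regression task."""
        return -np.mean((forward(weights, xs) - ys) ** 2)

    # Toy task: fit y = sin(x)
    xs = np.linspace(-3, 3, 64).reshape(-1, 1)
    ys = np.sin(xs)

    pop_size, sigma = 50, 0.1
    parent = [rng.normal(0, 0.5, (1, 16)), rng.normal(0, 0.5, (16, 1))]

    for gen in range(200):
        # Mutate: Gaussian perturbations of the parent's weight matrices
        population = [
            [w + sigma * rng.normal(size=w.shape) for w in parent]
            for _ in range(pop_size)
        ]
        # Select: keep the best child if it beats the parent (1+lambda style)
        best = max(population, key=lambda w: fitness(w, xs, ys))
        if fitness(best, xs, ys) > fitness(parent, xs, ys):
            parent = best

    print("final fitness:", fitness(parent, xs, ys))

No gradients anywhere -- selection pressure alone pushes the weights toward a fit. That's the family of methods my friends were excited about.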
Those two friends are quite possibly the most intelligent people I've ever met, and they were firmly convinced that RL and evolutionary algorithms were the path forward in ML.
[0] https://en.wikipedia.org/wiki/Neuroevolution_of_augmenting_t...
Humans are amazing: we built a hypothetical computing system while trying to understand neurons, found out that's not really how they work, but went ahead and built a paradigm-shifting technology around it anyway. And we're still enhancing it with ideas from that imaginary system.