Comment by analogears 3 days ago

One thing missing from this framing: the feedback loop speed. Animal evolution operates on generational timescales, but LLM "commercial evolution" happens in months. The optimisation pressure might be weaker per-iteration but the iteration rate is orders of magnitude faster.

Curious whether this means LLMs will converge toward something more general (as the A/B testing covers more edge cases) or stay jagged forever because no single failure mode means "death".

BanditDefender 3 days ago

> Animal evolution operates on generational timescales, but LLM "commercial evolution" happens in months.

But LLMs have all been variations on transformer neural networks, and that is simply not true of animals. A nematode brain has about 300 neurons; a bee's has nearly a million. And the bee's individual neurons are much more sophisticated than the nematode's. Likewise between insects and fish, between fish and birds, between rodents and primates....

Animal evolution also includes "architectural breakthroughs," and that's not happening with LLMs right now. "Attention Is All You Need" was from 2017, and we've been fine-tuning that architecture ever since. What we need is a new historically important paper.