Comment by barbarr a day ago

It also ignores the possibility of plateau... maybe there's a maximum amount of intelligence that matter can support, and it doesn't scale up with copies or speed.

AlexandrB a day ago

Or scales sub-linearly with hardware. When you're in the rising portion of an S-curve[1] you can't tell how much longer it will go on before plateauing.

A lot of this resembles post-war futurism, which assumed we would all be flying around in spaceships and personal flying cars within a decade. Unfortunately, the rapid pace of transportation innovation slowed due to physical and cost constraints, and we've made little progress (beyond cost optimization) since.

[1] https://en.wikipedia.org/wiki/Sigmoid_function
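A toy numerical sketch of that point (the growth rate and carrying capacity below are made up, purely illustrative): while you're on the rising portion, a logistic S-curve is almost indistinguishable from pure exponential growth, so the curve alone can't tell you where, or whether, the plateau comes.

```python
import math

def exponential(t):
    """Pure exponential growth starting at 1."""
    return math.exp(t)

def logistic(t, cap=1000.0):
    """Logistic (sigmoid) growth starting at 1 with a plateau at `cap`."""
    return cap / (1.0 + (cap - 1.0) * math.exp(-t))

# Early on, the two curves differ by only a few percent, so the rising
# portion alone can't reveal where (or whether) the plateau is.
for t in (1.0, 2.0, 3.0):
    rel_diff = abs(logistic(t) - exponential(t)) / exponential(t)
    assert rel_diff < 0.05

# Much later, the logistic has flattened out near its cap while the
# exponential has kept going.
assert abs(logistic(20.0) - 1000.0) < 0.01
assert exponential(20.0) > 1e8
```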

  • Tossrock a day ago

    The fact that it scales sub-linearly with hardware is well known, and in fact foundational to the scaling laws on which modern LLMs are built, i.e. performance tracks log(compute + data) remarkably closely, over many orders of magnitude.
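A toy sketch of what that sub-linear scaling means in practice, assuming a Chinchilla-style power law (the constants E, A, and alpha below are invented for illustration, not fitted values): each 10x of compute shrinks the reducible loss by the same constant factor, so raw hardware buys roughly constant improvement per order of magnitude.

```python
# Hypothetical power-law loss curve: loss(C) = E + A / C**alpha.
# E is the irreducible loss; all constants here are made up.
E, A, ALPHA = 1.7, 100.0, 0.1

def loss(compute):
    """Illustrative loss as a function of training compute."""
    return E + A / compute ** ALPHA

# Each 10x of compute cuts the reducible loss (loss - E) by the constant
# factor 10**-alpha: a straight line on a log-log plot, i.e. heavily
# sub-linear returns on hardware.
for exp10 in range(18, 24):
    c = 10.0 ** exp10
    ratio = (loss(10 * c) - E) / (loss(c) - E)
    assert abs(ratio - 10 ** -ALPHA) < 1e-9
```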

pixl97 a day ago

Eh, the math still doesn't work out in humans' favor...

Let's say intelligence caps out at the level of the smartest person who has ever lived. Well, the first thing we'd attempt to do is build machines up to that limit, one that 99.99999 percent of us will never get close to. What's more, the thinking part of a human is only around 2 pounds of mush inside our heads. On top of that, you don't have to grow machines for 18 years before they start outputting something useful. They won't need sleep. You can feed them with solar panels. And they won't get distracted by that super sleek server rack across the aisle.

We do know that 'hive' or societal intelligence does scale over time, especially with integration with tooling. The amount of knowledge we have, and the means by which we can apply it, simply dwarf those of previous generations.