Comment by wizzwizz4 4 days ago

Are we talking about "an AGI", or are we talking about overfitting large transformer models with human-written corpora and scaling up the result?

"An AGI"? I have no idea what that algorithm might look like. I do know that we can cover the majority of cases with not too much effort, so it all depends on the characteristics of that long tail.

ChatGPT-like transformer models? We know what that looks like, despite the AI companies creatively misrepresenting the resource use (ref: https://www.bnnbloomberg.ca/business/technology/2024/08/21/h...). Look at https://arxiv.org/pdf/2404.06405:

> Combining Wu’s method with the classic synthetic methods of deductive databases and angle, ratio, and distance chasing solves 21 out of 30 [problems] by just using a CPU-only laptop with a time limit of 5 minutes per problem.
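
For a sense of what that GOFAI pipeline involves, here is a minimal forward-chaining sketch in the spirit of the "deductive database plus angle chasing" component. The predicate, rules, and angle names are my own illustrative inventions, not the paper's actual rule set:

    from itertools import permutations

    def closure(facts):
        # Saturate a fact database under two toy inference rules.
        # A fact ("eq_angle", x, y) asserts that angle x equals angle y.
        facts = set(facts)
        while True:
            new = set()
            for pred, a, b in facts:
                # Rule 1, symmetry: eq_angle(a, b) => eq_angle(b, a)
                new.add((pred, b, a))
            for f, g in permutations(facts, 2):
                # Rule 2, transitivity:
                # eq_angle(a, b) and eq_angle(b, c) => eq_angle(a, c)
                if f[0] == g[0] == "eq_angle" and f[2] == g[1]:
                    new.add(("eq_angle", f[1], g[2]))
            if new <= facts:
                return facts  # fixed point: nothing new is derivable
            facts |= new

    # Two given facts let the engine "chase" a third:
    db = {("eq_angle", "ACB", "DEF"), ("eq_angle", "DEF", "XZY")}
    print(("eq_angle", "ACB", "XZY") in closure(db))  # True

The real solvers saturate far larger fact sets under real geometric rules, but the control flow is the same cheap fixed-point loop; nothing in it needs a GPU.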

AlphaGeometry had an entire supercomputer cluster and dozens of hours. GOFAI approaches have a laptop and five minutes per problem. Scale that inconceivable inefficiency up to AGI, and the total power output of the sun may not be enough.
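
To put rough numbers on that gap (every figure below is an assumption of mine for illustration; neither paper reports power draw this way):

    # Back-of-envelope energy comparison; ALL numbers are assumed.
    LAPTOP_WATTS = 30              # assumed CPU-only laptop draw
    LAPTOP_HOURS = 30 * (5 / 60)   # 30 problems, 5-minute limit each

    CLUSTER_WATTS = 500_000        # assumed cluster draw
    CLUSTER_HOURS = 24             # "dozens of hours", taken at the low end

    laptop_kwh = LAPTOP_WATTS * LAPTOP_HOURS / 1000
    cluster_kwh = CLUSTER_WATTS * CLUSTER_HOURS / 1000
    print(f"laptop:  {laptop_kwh:.3f} kWh")              # 0.075 kWh
    print(f"cluster: {cluster_kwh:.0f} kWh")             # 12000 kWh
    print(f"ratio:   {cluster_kwh / laptop_kwh:,.0f}x")  # 160,000x

Even if you cut the cluster assumptions by an order of magnitude, the ratio stays in the tens of thousands.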

aurareturn 4 days ago

When computers first became useful, they were the size of rooms. In 2024, my earphones have more compute.

  • krige 3 days ago

    It's always a hindsight declaration, though. Currently we can only say that Intel has reused the same architecture several times already, cranking up the voltage until it breaks, because they have yet to find the next design leap, while AMD has been toying with 3D stacking but its latest design is woefully unimpressive. We won't know when the next compute leap will happen until it happens.