Comment by wizzwizz4
Are we talking about "an AGI", or are we talking about overfitting large transformer models on human-written corpora and scaling up the result?
"An AGI"? I have no idea what that algorithm might look like. I do know that we can cover the majority of cases without too much effort, so it all depends on the characteristics of that long tail.
ChatGPT-like transformer models? We know what that looks like, despite the AI companies creatively misrepresenting their resource use (ref: https://www.bnnbloomberg.ca/business/technology/2024/08/21/h...). Look at https://arxiv.org/pdf/2404.06405:
> Combining Wu’s method with the classic synthetic methods of deductive databases and angle, ratio, and distance chasing solves 21 out of 30 [problems] by just using a CPU-only laptop with a time limit of 5 minutes per problem.
AlphaGeometry had an entire supercomputer cluster and dozens of hours; the GOFAI approach had a laptop and five minutes. Scale that inconceivable inefficiency up to AGI, and the total power output of the Sun may not be enough.
When computers first became useful, you needed a machine the size of a room to compute anything. In 2024, my earphones have more compute.