Comment by wizzwizz4
> Isn’t this a good thing since compute can be scaled so that the LLM can do generations of human thinking in a much shorter amount of time?
But it can't. There isn't enough planet.
> The major assumption here is that transformers can indeed solve every problem humans can.
No, the major assumptions are (a) that ChatGPT can, and (b) that we can reduce the resource requirements by many orders of magnitude. The former assumption is highly dubious, and the latter is plainly false.
Transformers are capable of representing any algorithm, if they're allowed to be large enough and run for long enough. That doesn't give them any special algorithm-finding ability, and finding the correct algorithms is the hard part of the problem!
> But it can't. There isn't enough planet.
How many resources are you assuming an AGI would consume?