Comment by rpcope1
I'm sure in the near future the AIs will be smart enough to do literally everything for us, so we can just enjoy fully automated luxury space communism without needing to know anything. /s
>As for the tech, I can't tell if we're on the first half or the second half of the S-curve for the current wave of AI. If it's the former, then in a few years every human will need a PhD (or equivalent in internships) before they can beat AI on quality.
Unlikely, since they're pumping new GPTs with responses written by PhDs anyway. It's becoming more and more of a "Wizard of Oz" situation.
1. There's a lot of superhuman AI out there already, just in narrow domains like protein folding, chess, and so on.
2. Doctorate-level (and postdoc, and professorial) responses were already in the training sets.
3. The aim of current AI research is to build a model of the underlying reality that produces the observed signals: what must a PhD candidate have observed in order to write that particular paper, and so on.
I have no idea if these AI efforts will succeed or not, hence no idea where we are on the S-curve. But that's the goal.
1. Protein folding is not AGI, and it's a technology (much like, say, telecommunications) that no human could perform on their own, nor would be expected to.
3. How can an AI (in a general sense) create a model of underlying reality if the humans who create them do not have access to underlying reality but only the forms of its appearance?
1. I didn't say AGI and neither did the person I replied to.
They said AI, I said AI.
But protein folding is also very much a thing humans did by playing games. Literally: https://en.wikipedia.org/wiki/Foldit
And Transformers are able to learn to use tools, so even mediocre ones can write and then invoke specialist AI.
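The tool-use pattern mentioned here can be sketched in a few lines: the general model only has to emit a structured request, and a dispatcher routes it to a specialist system. This is a minimal illustration, not any particular framework's API; the tool name `fold_protein` and its stand-in implementation are hypothetical.

```python
# Minimal sketch of tool use: a general model emits a structured
# "tool call", and a dispatcher routes it to a registered specialist.
# The tool name and its body below are illustrative, not a real API.

TOOLS = {}

def tool(name):
    """Register a specialist function under a tool name."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("fold_protein")
def fold_protein(sequence: str) -> str:
    # Stand-in for invoking a specialist model (e.g. a folding predictor).
    return f"predicted structure for {len(sequence)}-residue sequence"

def dispatch(tool_call: dict) -> str:
    """Route a model-emitted tool call to the matching specialist."""
    fn = TOOLS.get(tool_call["name"])
    if fn is None:
        return f"unknown tool: {tool_call['name']}"
    return fn(**tool_call["arguments"])

# A mediocre general model only needs to produce this structured request:
result = dispatch({"name": "fold_protein",
                   "arguments": {"sequence": "MKTAYIAK"}})
print(result)  # predicted structure for 8-residue sequence
```

The point being illustrated: the hard, superhuman part lives inside the specialist, while the general model only has to learn to produce the right structured request.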
3. To reach the quality level I specified, that of "You need a PhD to compete with this", it is only necessary to program the AI to be as capable of learning as a PhD student.
This is the standard AI researchers are aiming for, and what I was writing of.
Can they do that? Dunno, and I hope not. But I've bought some tech shares just in case they actually can, because I doubt I'll be able to keep up if that's the near future.
/s noted, how near is "near"?
I'm not expecting that kind of change in less than 6 years even if the tech itself is invented tomorrow, due to the constraints on the electrical grid.
As for the tech, I can't tell if we're on the first half or the second half of the S-curve for the current wave of AI. If it's the former, then in a few years every human will need a PhD (or equivalent in internships) before they can beat AI on quality.