Comment by alexjray 4 days ago
Even if they automate all our current jobs, uniquely human experiences will always be valuable to us and will always have demand.
Considering the lengths many people go to to help preserve nature and natural areas, yes, I would say many people care about the uniquely ant experience.
Despite the false advertising in the Tears for Fears song, everybody does _not_ want to rule the world. Omohundro drives are a great philosophical thought experiment and it is certainly plausible to consider that they might apply to AI, but claiming as is common on LessWrong that unlimited power seeking is an inevitable consequence of a sufficiently intelligent system seems to be missing a few proof steps, and is opposed by the example of 99% of human beings.
> Instrumental convergence is the hypothetical tendency of most sufficiently intelligent, goal-directed beings (human and nonhuman) to pursue similar sub-goals (such as survival or resource acquisition), even if their ultimate goals are quite different. More precisely, beings with agency may pursue similar instrumental goals—goals which are made in pursuit of some particular end, but are not the end goals themselves—because it helps accomplish end goals.
'Running the planet' does not derive from instrumental convergence as defined here. Very few humans would wish to 'run the planet' as an instrumental goal in the pursuit of their own ultimate goals. Why would it be different for AGIs?
This is honestly a fantastic question. AGI has no emotions, no drive, anything. Maybe, just maybe, it would want to:
* Conserve power as much as possible, to "stay alive".
* Optimize for power retention
Why would it be further interested in generating capital or governing others, though?
> AGI has no emotions, no drive, anything.

> * Conserve power as much as possible, to "stay alive"
Having no drive means there's no drive to "stay alive".
> * Optimize for power retention
Another drive that magically appeared where there are "no drives".
You're being inconsistent: you anthropomorphize AI even though you seem to understand that you shouldn't.
> AGI has no emotions, no drive, anything
why do you say that? ever asked chatgpt about anything?
Tech billionaires is probably the first thing an AGI is gonna get rid of.
Minimize threats, don't rock the boat. We'll finally have our UBI utopia.
Probably more likely. There's at least some evidence that aliens and the Tuatha Dé Danann actually exist.
> Do you care about uniquely ant experience? Bacteria?
Ethology? Biology? We have entire fields of science devoted to these things, so obviously we care to some extent.
> Even if they automate all our current jobs, uniquely human experiences will always be valuable to us and will always have demand.
I call this the Quark principle. On DS9, there are matter replicators that can perfectly recreate any possible drink imaginable instantly. And yet, the people of the station still gather at Quark's and pay him money to pour and mix their drinks from physical bottles. As long as we are human, some things will never go away no matter how advanced the technology becomes.
In Star Trek lore replicated food/drink is always one down on taste/texture from the real thing.
"The economy" is entirely driven by human needs.
If you "unwind" all the complexities in modern supply chains, there are always human people paying for something they want at the leaf nodes.
Take the food and clothing industries as obvious examples. In some AI singularity scenario where all humans are unemployed and dirt poor, does all the food and clothing produced by the automated factories just end up in big piles because we naked and starving people can't afford to buy them?
There's nothing definitional about the economy being driven by human need. In a future scenario where there are superintelligent AIs, there's no reason why they wouldn't run their own economy for their own needs, collecting and processing materials to service each other's goals, such as space exploration.
That's an interesting argument. I don't like it, but I can't prove it wrong, so maybe we're approaching a new era where this is true.
But we're clearly not there now, so I stand by my prediction for the medium future!
For economic purposes, "the economy" also includes corporations and governments.
Corporations and governments have counted amongst their property entities that they did not grant equal rights to, sometimes whom they did not even consider to be people. Humans have been treated in the past much as livestock and guide dogs still are.
No doubt the top influencer is doing better than the top plumber, but I'd say the median plumber is streets ahead of the median influencer.
For AI, yes.
For AGI? Do you care about uniquely ant experience? Bacteria?
Why would an AGI, which now runs the planet, care?