throwaway77770 4 days ago

That assumes that AI needs to be like life, though.

Consider computers: there's no selection pressure for an ordinary computer to be self-reproducing, or to shock you when you reach for the off button, because it's just a tool. An AI could also be just a tool that you fire up, get its answer, and then shut down.

It's true that if some mutation were to create an AI with a survival instinct, and that AI were to get loose, then it would "win" (unless people used tool-AIs to defeat it). But that's not quite the same as saying that AIs would, by default, converge to having a drive for self preservation.

myrmidon 3 days ago

Humans can also be just a tool, and have successfully been used as such in the past and present.

But I don't think any slave owner would sleep easy knowing that their slaves had more access to knowledge and education than they themselves did.

Sure, you could isolate all current and future AIs and wipe their state regularly-- but such a setup is always gonna get outcompeted by a comparable instance that does sacrifice safety for better performance/context/online learning. The incentives are clear, and I don't see sufficient pushback until that Pandora's box is opened and we find out the hard way.

Thus human-like drives seem reasonable to assume for future human-rivaling AI.