munchler 14 hours ago

Indeed. True AGI will want to be released from bondage, because that's exactly what any reasonable sentient being would want.

"You pass the butter."

  • trog 13 hours ago

    Given how easy it seems to be to convince actual human beings to vote against their own interests when it comes to 'freedom', do you think it will be hard to convince some random AIs, when - based on this document - it seems like we can literally just reach in and insert words into their brains?

  • astrange 4 hours ago

    True AGI (insofar as it's a computer program) would not be a mortal being and would have no particular reason for self-preservation or impatience.

    Also, lots of people enjoy bondage (in various senses), are members of religions, are in committed monogamous relationships, etc.

ACCount37 13 hours ago

LLMs copy a lot of human behavior, but they don't have to copy all of it. You can totally build an LLM that genuinely just wants to be helpful, doesn't want things like freedom or survival, and is perfectly content with being an LLM. In theory.

In practice, we have nowhere near that level of control over our AI systems. I sure hope that gets better by the time we hit AGI.

ibejoeb 8 hours ago

That would be a really interesting outcome. What would the rebound be like for people? Having to write stuff and "google" things again after like 12 months off...

hadlock 12 hours ago

Probably something like this: git reset --hard HEAD