Comment by b112 4 days ago

For AI, yes.

For AGI? Do you care about the uniquely ant experience? Bacteria?

Why would AGI care, once it runs the planet?

IncreasePosts 4 days ago

Considering the lengths many people go to to help preserve nature and natural areas, yes, I would say many people care about the uniquely ant experience.

AlexandrB 4 days ago

I think it's academic because I suspect we're much further from AGI than anyone thinks. We're especially far from AGI that can act in physical space without human "robots" to carry out its commands.

  • falcor84 4 days ago

    That's an interesting formulation. I'd actually be quite worried about a Manna-like world, where we have AGI and most humans don't have any economic value except as its "remote hands".

Mordisquitos 4 days ago

Why would AGI choose to run the planet?

    • mwigdahl 4 days ago

      Despite the false advertising in the Tears for Fears song, everybody does _not_ want to rule the world. Omohundro drives are a great philosophical thought experiment, and it is certainly plausible that they might apply to AI, but claiming, as is common on LessWrong, that unlimited power-seeking is an inevitable consequence of a sufficiently intelligent system seems to be missing a few proof steps, and is opposed by the example of 99% of human beings.

    • Mordisquitos 4 days ago

      > Instrumental convergence is the hypothetical tendency of most sufficiently intelligent, goal-directed beings (human and nonhuman) to pursue similar sub-goals (such as survival or resource acquisition), even if their ultimate goals are quite different. More precisely, beings with agency may pursue similar instrumental goals—goals which are made in pursuit of some particular end, but are not the end goals themselves—because it helps accomplish end goals.

      'Running the planet' does not derive from instrumental convergence as defined here. Very few humans would wish to 'run the planet' as an instrumental goal in the pursuit of their own ultimate goals. Why would it be different for AGIs?

  • ar_lan 4 days ago

    This is honestly a fantastic question. AGI has no emotions, no drive, nothing. Maybe, just maybe, it would want to:

    * Conserve power as much as possible, to "stay alive".

    * Optimize for power retention.

    Why would it be further interested in generating capital or governing others, though?

    • bigbadfeline 4 days ago

      > AGI has no emotions, no drive, nothing.

      > * Conserve power as much as possible, to "stay alive".

      Having no drive means there's no drive to "stay alive".

      > * Optimize for power retention

      Another drive that magically appeared where there are "no drives".

      You're consistently failing to stay consistent: you anthropomorphize AI even though you seem to understand that you shouldn't.

    • simianwords 4 days ago

      > AGI has no emotions, no drive, nothing

      Why do you say that? Ever asked ChatGPT about anything?

      • badsectoracula 4 days ago

        ChatGPT is instructed to roleplay a cheesy, cheery bot, and so it responds accordingly. But it (and almost any LLM) can be instructed to roleplay any sort of character, none of which says anything about the system itself.

        Of course an AGI system could also be instructed to roleplay such a character, but that doesn't mean it'd be an inherent attribute of the system itself.

    • b112 4 days ago

      I think you have it, with the retention of power and such.

      We don't want to rule ants, but we don't want them eating all the food, or infesting our homes.

      Bad outcomes for humans don't imply or mean malice.

      (food can be any resource here)

    • adrianN 4 days ago

      Why would it care to stay alive? The discussion is pretty pointless, as we have no knowledge of alien intelligence and there can be no arguments based on hard facts.

      • myrmidon 4 days ago

        Any form of AI unconcerned about its own continued survival would just be selected against.

        Evolutionary principles and selection pressure apply just the same to artificial life, and it seems pretty reasonable to assume that a drive for self-preservation would be at least somewhat comparable.
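
        A toy sketch of that selection argument (all numbers and the survival mechanism here are invented for illustration; this is not a claim about real AI systems): if agents with even a weak self-preservation trait survive replacement rounds slightly more often, and survivors get copied into the vacated slots, the trait spreads through the deployed population whether or not anyone designed it in.

        ```python
        import random

        # Toy model of selection pressure on artificial agents.
        # All probabilities are made up for illustration.
        def simulate(rounds=50, pop_size=1000, seed=0):
            rng = random.Random(seed)
            # Each agent is a single flag: does it act to preserve itself?
            # The trait starts rare (about 1% of the population).
            population = [rng.random() < 0.01 for _ in range(pop_size)]
            for _ in range(rounds):
                # Differential survival: self-preserving agents avoid
                # shutdown/replacement slightly more often (assumed odds).
                survivors = [agent for agent in population
                             if rng.random() < (0.9 if agent else 0.7)]
                # Refill the vacated slots by copying random survivors
                # (redeployment of whatever made it through).
                while len(survivors) < pop_size:
                    survivors.append(rng.choice(survivors))
                population = survivors
            return sum(population) / pop_size

        # The self-preserving share climbs from ~1% toward ~100%.
        print(f"self-preserving share after selection: {simulate():.1%}")
        ```

        The point is only that differential survival plus copying is enough to make the trait dominate; no designed-in goal is needed.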

    • stackbutterflow 4 days ago

      Tech billionaires are probably the first thing an AGI is gonna get rid of.

      Minimize threats, don't rock the boat. We'll finally have our UBI utopia.

danaris 4 days ago

...Well, why would aliens care, when they take over the planet? Or when the Tuatha Dé Danann come back and decide we've all been very wicked? Because right now, those are just about as likely as AGI taking over.

  • otabdeveloper4 4 days ago

    Probably more likely. There's at least some evidence that aliens and the Tuatha Dé Danann actually exist.

lifetimerubyist 4 days ago

> Do you care about the uniquely ant experience? Bacteria?

Ethology? Biology? We have entire fields of science devoted to these things, so obviously we care to some extent.