gcanyon 2 days ago

And like I, Robot, it has numerous loopholes built in, ignores the larger population (Asimov later added a Zeroth Law about humanity), says nothing about the endless variations of the Trolley Problem, assumes that LLMs/bots have a god-like ability to foresee and weigh consequences, and of course ignores alignment completely.

  • SecretDreams 2 days ago

    Hopefully Alan Tudyk will be up for the task of saving humanity with the help of Will Smith.

    • tyre 2 days ago

      I want some answers that Ja Rule might not have right now

  • moralestapia 2 days ago

    Cool!

    I work with a guy like this. Hasn't shipped anything in 15+ years, but I think he'd be proud of that.

    I'll make sure we argue about the "endless variations of the Trolley Problem" in our next meeting. Let's get nothing done!

    • collingreen 2 days ago

      I'm also one of those pesky folks who keeps bringing reality and "thinking about consequences" into the otherwise sublime thought leadership meetings. I pretend it's to keep the company alive by not making massive mistakes, but we all know it's just pettiness and trying to hold back the "business by spreadsheet", MBA-on-the-wall "idea guys" in the room.

Sharlin 2 days ago

Well, that’s because it paraphrases Asimov’s Three Laws of Robotics, aka Three Plot Devices For Writing Interesting Stories About Robot Ethics.