Comment by tjwebbnorfolk 2 days ago

16 replies

What the parent is saying is that what works is what will matter in the end. Whatever works better than something else will become the method that survives in competition.

You not liking something on purportedly "moral" grounds doesn't matter if it works better than something else.

malfist 2 days ago

OxyContin certainly worked, and the markets demanded more and more of it. Who are we to take a moral stand and limit everyone's access to opiates? We should just focus on making a profit, since we're filling a "need".

  • satvikpendem 2 days ago

    Using LLMs doesn't kill people. I'm sure there are some exceptions, like the suicide linked to OpenAI that was in the news, but not to the degree of OxyContin.

    • johnnyanmac 2 days ago

      >Using LLMs doesn't kill people

      Guess you missed the post where lawyers were submitting legal documents generated by LLMs. Or people taking medical advice and ending up with bromide poisoning. Or the lawsuits around LLMs softly encouraging suicide. Or the general AI psychosis being studied.

      It's way past "some exceptions" at this point.

      • satvikpendem 2 days ago

        Besides the suicide one, I don't know of any examples where that has actually killed someone. Someone could search on Google just the same and ignore their symptoms.

        • johnnyanmac a day ago

          >I don't know of any examples where that has actually killed someone.

          You don't see how a botched law case could cost someone their life? Let's not wait until more people die to rein this in.

          >Someone could search on Google just the same and ignore their symptoms.

          Yes, and it's not uncommon for websites or search engines to be sued. Millennia of laws exist for this exact purpose, so companies can't deflect bad things back onto the people.

          If you want the benefits, you accept the consequences. Especially when you fail to put up guard rails.

      • tjwebbnorfolk 2 days ago

        LLMs generate text. It is people who decide what to do with it.

        Removing all personal responsibility from this equation isn't going to solve anything.

    • tinfoilhatter 2 days ago

      Not yet, maybe. Once we factor in the environmental damage that generative AI, and all the data centers being built to power it, will inevitably cause, I think it will become increasingly difficult to make the assertion you just did.

      • joquarky a day ago

        You're using data centers to read and post comments here.

        • wartywhoa23 a day ago

          You're approaching a bridge, and there's a road sign before it with a pictogram of a truck and a plaque below that reads "10t max".

          According to the logic of your argument, it's perfectly okay to drive a 360t BelAZ 75710 loaded to its full 450t capacity over that bridge just because it's a truck too.

  • tjwebbnorfolk 2 days ago

    Your comment is valid as a criticism of an "unfettered free market", but further proves my point that things that work will win.