Comment by codyvoda

Comment by codyvoda a day ago

37 replies

^I like email as an analogy

if I send a death threat over gmail, I am responsible, not google

if you use LLMs to make bombs or spam hate speech, you’re responsible. it’s not a terribly hard concept

and yeah “AI safety” tends to be a joke in the industry

OJFord a day ago

What if I ask it for something fun to make because I'm bored, and the response is bomb-building instructions? There isn't a (sending) email analogue to that.

  • BriggyDwiggs42 21 hours ago

    In what world would it respond with bomb building instructions?

    • __MatrixMan__ 21 hours ago

      If I were to make a list of fun things, I think that blowing stuff up would feature in the top ten. It's not unreasonable that an LLM might agree.

    • QuadmasterXLII 19 hours ago

      if it used search and ingested a malicious website, for example.

      • BriggyDwiggs42 15 hours ago

        Fair, but if it happens upon that in the top search results of an innocuous search, maybe the LLM isn’t the problem.

    • OJFord 21 hours ago

      Why might that happen is not really the point is it? If I ask for a photorealistic image of a man sitting at a computer, a priori I might think 'in what world would I expect seven fingers and no thumbs per hand', alas...

      • BriggyDwiggs42 15 hours ago

        I’ll take the example as an example of an LLM initiating harmful behavior in general and admit that such a thing is perfectly possible. I think the issue is down to the degree to which preventing such initiation impinges on the agency of the user, and I don’t think that requests for information should be refused because it’s lots of imposition for very little gain. I’m perfectly alright with conditioning/prompting the model not to readily jump into serious, potentially harmful targets without the direct request of the user.

kelseyfrog a day ago

There's more than one way to view it. Determining who has responsibility is one. Simply wanting there to be fewer causal factors which result in death threats and bombs being made is another.

If I want there to be fewer[1] bombs, examining the causal factors and affecting change there is a reasonable position to hold.

1. Simply fewer; don't pigeon hole this into zero.

BobaFloutist a day ago

> if you use LLMs to make bombs or spam hate speech, you’re responsible.

What if it's easier enough to make bombs or spam hate speech with LLMs that it DDoSes law enforcement and other mechanisms that otherwise prevent bombings and harassment? Is there any place for regulation limiting the availability or capabilities of tools that make crimes vastly easier and more accessible than they would be otherwise?

  • 3np a day ago

    The same argument could be made about computers. Do you prefer a society where CPUs are regulated like guns and you can't buy anything freer than an iPhone off the shelf?

  • BriggyDwiggs42 21 hours ago

    I mean this stuff is so easy to do though. An extremist doesn’t even need to make a bomb, he/she already drives a car that can kill many people. In the US it’s easy to get a firearm that could do the same. If capacity + randomness were a sufficient model for human behavior, we’d never gather in crowds, since a solid minority would be rammed, shot up, bombed etc. People don’t want to do that stuff; that’s our security. We can prevent some of the most egregious examples with censorship and banning, but what actually works is the fuzzy shit, give people opportunities, social connections, etc. so they don’t fall into extremism.

Angostura a day ago

or alternatively, if I cook myself a cake and poison myself, i am responsible.

If you sell me a cake and it poisons me, you are responsible.

  • kennywinker a day ago

    So if you sell me a service that comes up with recipes for cakes, and one is poisonous?

    I made it. You sold me the tool that “wrote” the recipe. Who’s responsible?

    • Sleaker a day ago

      The seller of the tool is responsible. If they say it can produce recipes, they're responsible for ensuring the recipes it gives someone won't cause harm. This can fall under different categories if it doesn't depending on the laws of the country/state. Willful Negligence, false advertisement, etc.

      Ianal, but I think this is similar to the red bull wings, monster energy death cases, etc.

  • actsasbuffoon 19 hours ago

    Sure, I may be responsible, but you’d still be dead.

    I’d prefer to live in a world where people just didn’t go around making poison cakes.

SpicyLemonZest a day ago

It's a hard concept in all kinds of scenarios. If a pharmacist sells you large amounts of pseudoephedrine, which you're secretly using to manufacture meth, which of you is responsible? It's not an either/or, and we've decided as a society that the pharmacist needs to shoulder a lot of the responsibility by putting restrictions on when and how they'll sell it.

  • codyvoda a day ago

    sure but we’re talking about literal text, not physical drugs or bomb making materials. censorship is silly for LLMs and “jailbreaking” as a concept for LLMs is silly. this entire line of discussion is silly

    • kennywinker a day ago

      Except it’s not, because people are using LLMs for things, thinking they can put guardrails on them that will hold.

      As an example, I’m thinking of the car dealership chatbot that gave away $1 cars: https://futurism.com/the-byte/car-dealership-ai

      If these things are being sold as things that can be locked down, it’s fair game to find holes in those lockdowns.

      • codyvoda a day ago

        …and? people do stupid things and face consequences? so what?

        I’d also advocate you don’t expose your unsecured database to the public internet

  • [removed] a day ago
    [deleted]
loremium a day ago

This is assuming people are responsible and with good will. But how many of the gun victims each year would be dead if there were no guns? How many radiation victims would there be without the invention of nuclear bombs? safety is indeed a property of knowledge.

  • miroljub a day ago

    Just imagine how many people would not die in traffic incidents if the knowledge of the wheel had been successfully hidden?

    • handfuloflight a day ago

      Nice try but the causal chain isn't as simple as wheels turning → dead people.

  • 0x457 a day ago

    If someone wants to make a bomb, chatgpt saying "sorry I can't help with that" won't prevent that someone from finding out how to make one.

    • BobaFloutist a day ago

      Sure, but if ten-thousand people might sorta want to make a bomb for like five minutes, chatgpt saying "nope" might prevent nine-thousand nine-hundred and ninety nine of those, at which point we might have a hundred fewer bombings.

      • BriggyDwiggs42 21 hours ago

        They’d need to sustain interest through the buying process, not get caught for super suspicious purchases, then successfully build a bomb without blowing themselves up. Not a five minute job.

      • 0x457 a day ago

        If ChatGPT provided instructions on how make a bomb, most people would probably blow themsevles up before they finish.

    • HeatrayEnjoyer a day ago

      That's really not true, by that logic LLMs provide no value which is obviously false.

      It's one thing to spend years studying chemistry, it's another to receive a tailored instruction guide in thirty seconds. It will even instruct you how to dodge detection by law enforcement, which a chemistry degree will not.

      • 0x457 a day ago

        > That's really not true, by that logic LLMs provide no value which is obviously false.

        Way to leep to a (wrong) conclusion. I can lookup a word in a Dictionary.app, I can google it or I can pick up a phisical dictionary book and look it up.

        You don't even need to look to far: Fight Club (the book) describes how to make a bomb pretty accurately.

        If you're worrying that "well you need to know which books to pick up at the library"...you can probably ask chatgpt. Yeah it's not as fast, but if you think this is what stops everyone from making a bomb, then well...sucks to be you and live in such fear?

  • [removed] a day ago
    [deleted]