Comment by 9rx 3 months ago

> Legally if it's asserted that these images are criminal because they are the result of being the product of an LLM trained on sources that contained CSAM then the requirement would be to prove that assertion.

Legally, possession of CSAM is against the law because there is an assumption that possession proves contribution to market demand, with an understanding that demand incentivizes production of supply, meaning that, where demand exists, children will be harmed again to produce more content to satisfy it. In other words, the intent is to stop future harm. This is why people have been prosecuted for things like suggestive cartoons that have no real-life events behind them. It is not illegal on the grounds of past events; the actual abuse is illegal on its own standing.

The provenance of the imagery is irrelevant. What you would need to prove is that your desire to have such imagery won't stimulate you or others to create new content with real people. If you could somehow prove that LLM content will satisfy all future thirst, problem solved! That would be world-changing.

harshreality 3 months ago

I'm somewhat sympathetic to that argument. However, it doesn't stop there.

Violent video games prove contribution to market demand for FPS-style videos of mass shootings or carjackings, so can/should we ban Call of Duty and Grand Theft Auto now?

(Note that the "market demand" argument is subtly different from the argument that the games directly cause people to become more violent, either in general or by encouraging specific copycat violence. Studies on [lack of] direct violence causation are weak and disputed.)

  • 9rx 3 months ago

    Tell us more about the market that is carrying out mass shootings/carjackings with intent to sell the imagery of it. This is the first I've heard of it. In fact, when mass shootings do occur they are usually explained away for some other reason (e.g. a troubled teen who went off the rails), not that it is the day-to-day operation of a videography business.

    • harshreality 3 months ago

      They don't have to sell imagery, or intend for it to be publicized, for it to have the effect of driving demand for such imagery.

      The link between consumption (without purchase) of CSAM and increased production of CSAM is assumed in the same way, isn't it?

      The motives and situational dynamics of pedophiles filming child pornography, and gangs robbing or carjacking people on video in broad daylight, may be different in some ways, but in other ways they're not: both get direct benefit from the crime, regardless of the recordings; and in both cases some (small) subset of people interested in watching the recordings may become more inclined to act out what they see, if it aligns with an already existing inclination toward pedophilia or violent crime. That's independent of whether they can make money or become famous for the crime, although money or infamy as an added motive is a further problem.

      • 9rx 3 months ago

        > They don't have to sell imagery, or intend for it to be publicized, for it to have the effect of driving demand for such imagery.

        As stated in the first comment, the law here is concerned with future production of imagery to satisfy demand, with its incentive to harm even more children. The business of creating CSAM is well understood. It is, after all, an extension of the adult porn industry. The law didn't emerge from someone sitting around dreaming up hypotheticals; it was a reaction to actual situations that the public didn't want to see continue. What is done is done, and there are other laws to deal with what is already done, but this law also seeks to prevent future harm. Outlawing the product so that there is no market for producers to sell into is the only way it knows how. It's overly blunt and far from perfect, but it tries.

        The business of committing mass shootings/carjackings to produce salable imagery is not well understood. Frankly, I've never heard of it before, and since you can't speak to it I take it that you haven't either. It is unclear why you decided to make it up. As before, I cannot think of any mass shooting that was considered "producing a product to sell" rather than "a deranged person acting out". Most importantly, even if it is a thing, since nobody is aware that it is happening, there is no public will to see that line of business come to an end. Your attempt to find a parallel is poorly considered.

        The closest analogy that might be found in your ramblings, which still isn't great but is at least in the same universe, is people wanting to own stolen car parts, incentivizing others to carry out carjackings to get parts to sell. In reaction, you might make it illegal to possess stolen car parts, with the intent of creating a situation where someone trying to steal and sell car parts ultimately has no one to sell to, diminishing the incentive to carjack in the future. Of course, the law already does exactly that, so...

Ferret7446 2 months ago

Such an assumption is wrong in a world with AI-generated CSAM. Why would suppliers take on the risk and cost of producing "actual" CSAM if they could generate it with AI? Especially if the demand is for AI-generated CSAM (someone who has AI-generated CSAM is, by definition, stimulating demand for AI-generated CSAM).

Even for regular porn, which carries far lower risk and cost, AI generation is becoming preferable (as with most technologies, the leading use case for AI is porn).