Comment by dragonwriter a day ago

Or treat AI training as within the coverage of the current fair use regime (which is certainly defensible within the current copyright regime), while prosecuting the use of AI models to create infringing copies and derivative works that do not themselves have permission or a reasonable claim to fall within the scope of fair use (and prosecuting hosted AI firms for contributory infringement where their actions with regard to such created infringements fit the existing law on that).

Wowfunhappy a day ago

^ I feel like I almost never see this take, and I don't understand why, because frankly, it strikes me as patently obvious! Of course the tool isn't responsible; the person who uses it is.

  • srveale a day ago

    I think the tricky bit is that AI companies make money off the collected works of artists, regardless of user behaviour. Suppose I pay for an image generator because I like making funny pictures in Ghibli style, then the AI company makes money because of Ghibli's work. Is that ethical? I can see how an artist would get upset about it.

    On the other hand, suppose I also like playing guitar covers of songs. Does that mean artists should get upset at the guitar company? Does it matter if I do it at home or at a paid gig? If I record it, do I have to give credit to the original creator? What if I write a song with a similar style to an existing song? These are all questions that have (mostly) well defined laws and ethical norms, which usually lean towards what you said - the tool isn't responsible.

    Maybe not a perfect analogy. It takes more skill to play guitar than to type "Funny meme Ghibli style pls". Me playing a cover doesn't reduce demand for actual bands. And guitar companies aren't trying to... take over the world?

    At the end of the day, the cat is out of the bag, generative AI is here to stay, and I think I agree that we're better off regulating use rather than imposing prohibition. But considering the broader societal impacts, I think AI is a more complicated "tool" than other kinds of tools for making art.

    • codedokode a day ago

      > I think the tricky bit is that AI companies make money off the collected works of artists,

      There is also a chance that AI companies didn't obtain the training data legally; in that case it would be at least immoral to build a business on stolen content.

prawn a day ago

I see AI training on public material the same way I see upcoming artists being inspired by the artists before them. Obviously the scale is very different. I don't mind your scenario because an AI firm, if they couldn't stay on top of what their model was creating, could voluntarily reduce the material used to train it.

  • codedokode a day ago

    You imply that the AI model is creating new works, rather than merely rearranging pieces from other works you never saw and therefore might consider novel. The AI model is not currently a model of a creative human: a human doesn't need to listen to a million songs to create their own.