Comment by hrimfaxi 3 hours ago

You had originally said the user should be liable instead of OpenAI being liable.

> However, the lyrics are shown because the user requested them, shouldn't be the user be liable instead?

I would imagine the sociological rationale for allowing sex work would not map to a multi-billion-dollar company.

And to add, the social network example doesn't map because the user is producing the content and sharing it with the network. In OpenAI's case, OpenAI itself is creating and distributing copyrighted works.

estebarb 2 hours ago

No, the edited wording still conveys the same meaning. My edit was to fix another grammar typo.

The social networks are distributing such content AND benefiting from selling ads against it. Adding ads on top makes it a derivative work.

Personally I'm on the side of penalizing the side that provides the input, not the output:

- OpenAI training on copyrighted works.
- Users requesting custom works based on copyrighted IP.

That is my opinion on how it should be layered, that's it. I'm happy to discuss why it should or shouldn't be that way. As I put in another comment, my concern is that mandating copyright filtering on each generative tool would end up propagating to every single digital tool, which, as a society, we don't really want.

  • hrimfaxi 2 hours ago

    I am curious why you are of the opinion that the user should be in trouble for requesting the copyrighted material rather than the provider of the material. I feel like there is a distinction between something local-first and a SaaS. A local AI model that reproduces copyrighted works for your own use might not be problematic, whereas a remote model reproduces a copyrighted work and distributes it over the internet to you. Most jurisdictions treat remote access across jurisdictional boundaries differently than completely local acts.