enopod_ 14 hours ago

Looks to me like OpenAI drew their guardrails somewhere along a financial line. Generate a Mickey Mouse or a Pikachu? Disney and Pokémon will sue the sh*t out of you. Ghibli? Probably not powerful enough to risk a multimillion-dollar, years-long court battle.

  • nticompass 11 hours ago

    I thought Disney had the rights to publish Ghibli movies in the US.

    • davidhaymond 11 hours ago

      They did, but the rights expired. GKIDS now has the theatrical and home video rights to Studio Ghibli films in the US (except for Grave of the Fireflies).

  • bufferoverflow 11 hours ago

    Mickey Mouse (the original one) is out of copyright, as of last year, AFAIR.

  • briandear 11 hours ago

    Ghibli isn’t a character, but a style. You can’t copyright it.

    • sejje 9 hours ago

      Yes, the only test will eventually be "Can you train AI on copyrighted works"

      • contravariant 8 hours ago

        I consider this article quite strong proof that generative AI is closer to copying than it is to creating a new derivative work.

    • briandear 10 hours ago

      For the downvotes:

      https://www.copyright.gov/circs/circ01.pdf

      “Copyright does not protect • Ideas, procedures, methods, systems, processes, concepts, principles, or discoveries”

      Not sure why this is even controversial, this has been the case for a hundred years.

simianparrot 9 hours ago

So many arguing that "copyright shouldn't be a thing" etc., ad nauseam, which is a fine philosophical debate. But it's also the law. And that means ChatGPT et al. also have to follow the law.

I really, really hope the multimedia-megacorps get together and class-action ChatGPT and every other closed, for-profit LLM corporation into oblivion.

There should not be a two-tier legal system. If it's illegal for me, it's illegal for Sam Altman.

Get to it.

  • nucleogenesis 9 hours ago

    > There should not be a two-tier legal system.

That’s a fine philosophical debate, but the law is designed by the rich to favor the rich, and while there are a number of exceptions, there is little you can do with the legal system without money, and lots of it. So while having a truly just system would be neat, it just isn’t in the cards for humanity (IMHO) so long as we allow entities to amass “fuck you” money and wield it to their liking.

  • fishpen0 6 hours ago

There is more to it than copyright when you start going down the path of photorealism. As much as it is a picture of Indiana Jones, it is also a picture of Harrison Ford. As fun as it is to make hilarious videos of presidents sucking CEO toes, there has to be a line.

There is a lack of consent here that runs even deeper than what copyright was traditionally made to protect. It goes further than parody. We can't flip our standards back and forth depending on who the image is made to reproduce.

    • simianparrot 6 hours ago

      I fully agree. But since the average Joe has no chance legally against ChatGPT, at least Disney and other megacorps could.

  • dartos 9 hours ago

    Sorry, but have you paid attention to the legal system in the states?

    Large corporations and their execs live by different laws than the rest of us.

    That’s how it is.

Anything else is, unfortunately, a fiction in this country.

    • simianparrot 6 hours ago

      And? Two wrongs don’t make a right.

      • dartos 6 hours ago

        There’s no “and.”

        I’m just stating a fact. No discussion of wrong or right or whatever.

        Just pointing out how there is no more rule of law in the US. Idk when exactly it disappeared, but it’s definitely not present anymore

KronisLV 18 hours ago

I think the cat is out of the bag when it comes to generative AI, the same way various LLMs for programming have been trained even on codebases they had no business using, yet nobody has stopped them and nobody will. It’s the same as what’s going to happen with deepfakes and such, as the technology inevitably gets better.

> Hayao Miyazaki’s Japanese animation company, Studio Ghibli, produces beautiful and famously labor intensive movies, with one 4 second sequence purportedly taking over a year to make.

It makes me wonder, though: whether it’s more valuable to spend a year on a scene that most people won’t pay that much attention to (artists will understand and appreciate it, maybe pause and rewind and replay and examine the details; the casual viewer just enjoys it at a glance), or to use tools in addition to your own skills to knock it out of the park in a month and make more great things.

A bit like how digital art has clear advantages over paper, while many still revere traditional art, despite it taking longer and being harder. The same way someone who uses those AI-assisted programming tools can improve their productivity by getting rid of some of the boilerplate or automating some refactoring and such.

AI will definitely cheapen the art of doing things the old way, but that’s the reality of it, no matter how much the artists dislike it. Some will probably adapt and employ new workflows, others stick to tradition.

  • M95D 18 hours ago

There's a very clear difference between a cheap animation and Ghibli. Anyone can see it.

    In the first case, there's only one static image for an entire scene, scrolled and zoomed, and if they feel generous, there would be an overlay with another static image that slides over the first at a constant speed and direction. It feels dead.

    In the second case, each frame is different. There's chaotic motions such as wind and there's character movement with a purpose, even in the background, there's always something happening in the animation, there's life.

    • paulluuk 16 hours ago

There is a huge middle ground between "static image with another sliding static image" and "1 year of drawing per 4-second Ghibli masterpiece". From your comment it almost looks like you're suggesting that you have to choose one or the other, but that is of course not true.

      I bet that a good animator could make a really impressive 4-second scene if they were given a month, instead of a year. Possibly even if they were given a day.

      So if we assume that there is not a binary "cheap animation vs masterpiece" but rather a sort of spectrum between the two, then the question is: at what point do enough people stop seeing the difference, that it makes economic sense to stay at that level, if the goal is to create as much high-quality content as possible?

      • M95D 13 hours ago

Yes, that's the current trend in the western world. Money is all that matters. There's only the lowest accepted quality; anything above that is a waste of money, profits that are lost. Nobody wants masterpieces. There is no market for that.

That lowest-accepted quality also declines over time, as generation after generation of people becomes used to rock-bottom quality. In the end, there's only slop, and AI will make the cheapest slop ever. Welcome to a brave new world. We don't even need people anymore. They're too expensive.

        • pmyteh 10 hours ago

          To be fair, we've already been through this cycle at least once with animation. The difference between early Disney or even Looney Tunes and (say) late '60s Hanna-Barbera or '80s He-Man is enormous. Since then there has been generally higher-quality animation rather than lower (though I know it varies a lot by country, genre etc.)

          It's not inevitable that it's a race to the cheapest and shittest. That's just one (fairly strong) commercial force amongst many.

    • zipmapfoldright 17 hours ago

      anyone _can_ see it, but _most_ people don't (and don't care)

      To be clear, I am not saying it's not valuable, only that to the vast majority, it's not.

      • soneca 17 hours ago

I wonder if really great stuff is always for a minority. You have to have listened to a lot of classical music to distinguish a great interpretation of Mozart from a good one. To realize how great a chess move was, how magical a soccer play, how deep the writing of a philosopher. Not only for stuff that requires prior effort, but also because of the subjectiveness of art. Picasso will be really moving for only a minority of people. The Godfather. Even Shakespeare.

Social media and generative AI may be good business because they capture the attention of the majority, but maybe they are not valuable to anyone.

      • whywhywhywhy 17 hours ago

        > but _most_ people don't (and don't care)

        Perhaps it's not for everyone.

      • umko21 17 hours ago

        I think you’re right that most people don’t notice, but without the extra effort, it would’ve ended up as just another mediocre animation. And standing out from mediocrity is what made it appealing to many people.

      • M95D 16 hours ago

        Who cares if it's valuable for the majority? What do you think this is? Stock market for slop?

        This is art.

  • IanCal 16 hours ago

    Fundamentally I think this comes down to answering the question of "why are you creating this?".

    There are many valid answers.

    Maybe you want to create it to tell a story, and you have an overflowing list of stories you're desperate to tell. The animation may be a means to an end, and tools that help you get there sooner mean telling more stories.

    Maybe you're pretty good at making things people like and you're in it for the money. That's fine, there are worse ways to provide for your family than making things people enjoy but aren't a deep thing for you.

    Maybe you're in it because you love the act of creating it. Selling it is almost incidental, and the joy you get from it comes down to spending huge amounts of time obsessing over tiny details. If you had a source of income and nobody ever saw your creations, you'd still be there making them.

    These are all valid in my mind, and suggest different reasons to use or not to use tools. Same as many walks of life.

    I'd get the weeds gone in my front lawn quickly if I paid someone to do it, but I quite enjoy pottering around on a sunny day pulling them up and looking back at the end to see what I've achieved. I bake worse bread than I could buy, and could buy more and better bread I'm sure if I used the time to do contracting instead. But I enjoy it.

    On the other hand, there are things I just want done and so use tools or get others to do it for me.

    One positive view of AI tools is that it widens the group of people who are able to achieve a particular quality, so it opens up the door for people who want to tell the story or build the app or whatever.

    A negative side is the economics where it may be beneficial to have a worse result just because it's so much cheaper.

  • mytailorisrich 18 hours ago

    > It makes me wonder though - whether it’s more valuable to spend a year on a scene that most people won’t pay that much attention to

    In this case, yes it is.

People do pay attention to the result overall. Studio Ghibli became famous because people notice what they produce.

    Now people might not notice every single detail but I believe that it is this overall mindset and culture that enables the whole unique final product.

    • xandrius 18 hours ago

      I think most like the vibes, not the fact it took ages to make.

      • [removed] 18 hours ago
        [deleted]
      • Qualitionion 18 hours ago

It's the quality or level of detail.

Which might indicate an environment where quality is valued above quantity.

  • happyraul 18 hours ago

    To me the question of what activity/method is more "valuable" in the context of art is kind of missing the point of art.

  • AlienRobot 5 hours ago

    >It makes me wonder though - whether it’s more valuable to spend a year on a scene that most people won’t pay that much attention to (artists will understand and appreciate, maybe pause and rewind and replay and examine the details, the casual viewer just enjoy at a glance) or use tools in addition to your own skills to knock it out of the park in a month and make more great things.

    If they didn't spend a year on it they wouldn't be copied now.

bartread 41 minutes ago

Interesting. So when I tried the “Indiana Jones” prompt I got back an image that looked a lot like Indiana Jones but with a face much more similar to Nathan Drake's. Whereas the Predator prompt generated an image of the Predator but, unlike in the article, wearing his mask.

So there’s clearly some amount of random chance in there, but the trope is still very clear in the generated image, so it seems like you’re going to get an archetype.

flessner a day ago

Everyone is talking about theft - I get it, but there's a subtler point being made here.

The current generation of AI models can't think of anything truly new. Everything is simply a blend of prior work. I am not saying that this doesn't have economic value, but it means these AI models are closer to lossy compression algorithms than they are to AGI.

The following quote by Sam Altman from about 5 years ago is interesting.

"We have made a soft promise to investors that once we build this sort-of generally intelligent system, basically we will ask it to figure out a way to generate an investment return."

That's a statement I wouldn't even dream about making today.

  • nearbuy 21 hours ago

    > Current generation of AI models can't think of anything truly new.

    How could you possibly know this?

    Is this falsifiable? Is there anything we could ask it to draw where you wouldn't just claim it must be copying some image in its training data?

    • mjburgess 16 hours ago

Novelty in one medium arises from novelty in others, from shifts in the external environment.

      We got brass bands with brass instruments, synth music from synths.

We know therefore, necessarily, that there can be nothing novel from an LLM -- it has no live access to novel developments in the broader environment. If synths were invented after its training, it could never produce synth music (and so on).

The claim here is trivially falsifiable, and so obviously so that credulous fans of this technology bake it into their misunderstanding of novelty itself: have an LLM produce content on developments which had yet to take place at the time of its training. It obviously cannot do this.

      Yet an artist which paints with a new kind of black pigment can, trivially so.

      • nearbuy 10 hours ago

        Kind of a weird take that excludes the vast majority of human artwork that most people would consider novel. For all the complaints one might have of cubism, few would claim it's not novel. And yet it's not based on any new development in the external world but rather on mashing together different perspectives. Someone could have created the style 100 years earlier if they were so inclined, and had Picasso never existed, someone could create the novel style today just by "remixing" ideas from past art in that very particular way.

      • moffkalast 14 hours ago

        > arises from novelty in others, shifts to the external environment

        > Everything is simply a blend of prior work.

        I generally consider these two to be the same thing. If novelty is based on something else, then it's highly derivative and its novelty is very questionable.

        A quantum random number generator is far more novel than the average human artist.

        > have an LLM produce content on developments which had yet to take place at the time of its training. It obviously cannot do this.

        Put someone in jail for the last 15 years, and ask them to make a smartphone. They obviously cannot do it either.

        • mjburgess 13 hours ago

So if your point is that an LLM is something like a person kept in a coma inside solitary confinement -- sure? But I don't believe that's where we set the bar for art: we aren't employing comatose inmates to do anything.

          > I generally consider these two to be the same thing.

          Sure words themselves bend and break under the weight of hype. Novelty is randomness. Everything is a work of art. For a work of art to be non-novel it can only incorporate randomness.

          The fallacies of ambiguity abound to the point where speaking coherently disappears completely.

          An artist who finds a cave half-collapsed for the first time has an opportunity to render that novel physical state of the universe into art. Every moment which passes has a near infinite amount of such novel circumstances.

          Since an LLM cannot do that, we must wreck and ruin our ability to describe this plain and trivial situation. Poke our eyes and skewer our brains.

  • jedimastert a day ago

The problem with generating genuinely new art is that it requires "inputs" that aren't art. It requires life experiences.

  • Davidzheng a day ago

    I beseech you, in the bowels of Christ, think it possible that you may be mistaken.

    • kubanczyk 18 hours ago

      Oliver Cromwell, a letter to the General Assembly of the Church of Scotland, 3 August 1650

  • bbor a day ago

    Disregarding the (common!) assumption that AGI will consist of one monolithic LLM instead of dozens of specialized ones, I think your comment fails to invoke an accurate, consistent picture of creativity/"truly new" cognition.

    To borrow Chomsky's framework: what makes humans unique and special is our ability to produce an infinite range of outputs that nonetheless conform to a set of linguistic rules. When viewed in this light, human creativity necessarily depends on the "linguistic rules" part of that; without a framework of meaning to work within, we would just be generating entropy, not meaningful expressions.

    Obviously this applies most directly to external language, but I hope it's clear how it indirectly applies to internal cognition and--as we're discussing here--visual art.

TL;DR: LLMs are definitely creative, otherwise they wouldn't be able to produce semantically meaningful, context-appropriate language in the first place. For a more empirical argument, just ask yourself how a machine could generate a poem or illustration depicting [CHARACTER_X] in [PLACE_Y] doing [ACTIVITY_Z] in [STYLE_S] without being creative!

    [1] Covered in the famous Chomsky v. Foucault debate, for the curious: https://www.youtube.com/watch?v=3wfNl2L0Gf8

    • flessner 9 hours ago

This may not be apparent to an English speaker, as the language has a rather fixed set of words, but in German, where creating new words is common, the lack of linguistic creativity is obvious.

      As an example, let's talk about "vibe coding" - It's a new term describing heavy LLM usage in programming, usually associated with Generation Z.

      If I am asking an LLM to generate a German translation for "vibe coder" it comes up with the neutral "Vibe-Programmierer". When asking it to be more creative it came up with "Schwingungsschmied" ("vibration smith"?) - What?

      I personally came up with the following words:

      * Gefühlsprogrammierer ("A programmer, that focuses on intuition and feeling.")

      * Freischnauzeprogrammierer ("Free-mouthed programmer - highlighting straightforwardness and the creative expression of vibe coding." - colloquial)

Interestingly, LLMs can describe both these terms; they just can't create them naturally. I tested this on all major LLMs and the results were similar. Generating a picture of a "vibe coder" also highlights more of a moody atmosphere instead of the Generation Z aspects that are associated with it on social media nowadays.

    • Peritract 13 hours ago

      > a machine that can generate a poem or illustration depicting [CHARACTER_X] in [PLACE_Y] doing [ACTIVITY_Z] in [STYLE_S] without being creative

Your example disproves itself; that's a madlib. It's not creative, it's just rolling the dice and filling in the blanks. Complex dice and complex blanks are a difference of degree only, not of creativity.
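      To make the "madlib" point literal, here's a few lines of toy code (all the lists and names are made-up placeholders, not anything from a real model):

```python
import random

# Templates plus dice rolls: complex dice, complex blanks,
# but structurally still a madlib.
CHARACTERS = ["an archaeologist", "a detective", "a pirate"]
PLACES = ["a jungle temple", "a moon base", "a noir alley"]
ACTIVITIES = ["deciphering a map", "outrunning a boulder"]
STYLES = ["watercolor", "woodblock print"]

def madlib(rng: random.Random) -> str:
    """Fill the four blanks with random choices."""
    return (f"{rng.choice(CHARACTERS)} in {rng.choice(PLACES)}, "
            f"{rng.choice(ACTIVITIES)}, rendered as {rng.choice(STYLES)}")

print(madlib(random.Random(42)))
```

      Scale the lists up to a model's worth of options and the mechanism is the same; only the dice get bigger.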

      • bbor 5 hours ago

        It's not filling in the blanks that's impressive, it's meaningfully combining them all into an objectively unique narrative, building upon those blanks at length.

        Definitions are always up for debate on instrumental grounds, but I'm dubious of any definition of "creative" that excludes truly unique yet meaningful artifacts. The only thing past that is ineffable stuff, which is inherently not very helpful for scientific discussion.

burnished a day ago

Oooh, those guardrails make me angry. I get why they're there (don't poke the bear), but it doesn't make me overlook the self-serving hypocrisy involved.

Though I am also generally opposed to the notion of intellectual property whatsoever, on the basis that it doesn't seem to serve its intended purpose, and that what good could be salvaged from its various systems can already be well represented with other existing legal concepts, e.g. deceptive behaviors being prosecuted as forms of fraud.

  • teddyh a day ago

The problem is people at large companies creating these AI models wanting the freedom to copy artists’ works when using them, while these same large companies also want to keep copyright protection intact for their regular business activities. They want to eat the cake and have it too. And they are arguing for essentially eliminating copyright for their specific purpose and convenience, when copyright has virtually never been loosened for the public’s convenience, even when the exceptions the public asks for are often minor and laudable. If these companies were to argue that copyright should be eliminated because of this new technology, I might not object. But now that they come and ask… no, pretend to already have… a copyright exception for their specific use, I will happily turn around and use their own copyright-maximalist arguments against them.

    (Copied from a comment of mine written more than three years ago: <https://news.ycombinator.com/item?id=33582047>)

    • ToValueFunfetti a day ago

I don't care for this line of argument. It's like saying you can't hold a position that trespassing should be illegal while also holding that commercial businesses should be legally required to have public restrooms. Yes, both of these positions are related to land rights and the former is pro- while the latter is anti-, but it's a perfectly coherent set of positions. OpenAI can absolutely be anti-copyright in the sense of whether you can train an NN on copyrighted data and pro-copyright in the sense of whether you can make an exact replica of some data and sell it as your own without crossing into hypocrisy territory. It does suggest they're self-interested, but you have to climb a mountain in Tibet to find anybody who isn't.

      Arguments that make a case that NN training is copyright violation are much more compelling to me than this.

      • belorn a day ago

The example you gave of public restrooms doesn't work, for two main reasons: businesses are usually getting paid for it by the government, and operating a company usually comes with benefits granted by the government. Industry regulation as a concept is generally justified in that industries get "something" from society, and thus society can put requirements on them in return.

        A regulation that require restaurants to have a public bathroom is more akin to regulation that also require restaurants to check id when selling alcohol to young customers. Neither requirement has any relation with land rights, but is related to the right of operating a company that sell food to the public.

        • trentlott a day ago

          But what if businesses got benefits from society and tax money and were free to ignore the needs/desires of those who pay taxes and who society consists of? That seems just about right.

      • TremendousJudge a day ago

        No, the exception they are asking for (we can train on copyrighted material and the image produced is non-copyright infringing) is copyright infringing in the most basic sense.

I'll prove it by induction: imagine that I have a service where I "train" a model on a single image of Indiana Jones. Now you prompt it, and my model "generates" the same image. I sell you this service, and no money goes to the copyright holder of the original image. This is obviously infringement.

There's no reason why training on a billion images is any different, besides the fact that the lines are blurred by the model weights not being parseable.

    • jofla_net a day ago

I guess the best explanation for what we're witnessing is the notion that 'money talks', and sadly nothing more. To think that's all that fair-use activists lacked in years past...

  • theshrike79 20 hours ago

    It's not just the guardrails, but the ham-fisted implementation.

    Grok is supposed to be "uncensored", but there are very specific words you just can't use when asking it to generate images. It'll just flat out refuse or give an error message during generation.

    But, again, if you go in a roundabout way and avoid the specific terms you can still get what you want. So why bother?

    Is it about not wanting bad PR or avoiding litigation?

    • mrweasel 17 hours ago

      The implementation is what gets to me too. Fair enough that a company doesn't want their LLM used in a certain way. That's their choice, even if it's just to avoid getting sued.

How they then go about implementing those guardrails is pretty telling about their understanding of, and control over, what they've built, and their line of thinking. Clearly, at no point before releasing their LLMs onto the world did anyone stop and ask: hey, how do we deal with these things generating unwanted content?

      Resorting to blocking certain terms in the prompts is like searching for keywords in spam emails. "Hey Jim, I got another spam email from that Chinese tire place" - "No worry boss, I've configured the mail server to just delete any email containing the words China or tire".

      Some journalist should go to a few of these AI companies and start asking questions about the long term effectiveness and viability of just blocking keywords in prompts.
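    As a sketch of why that approach fails (toy code, not any vendor's actual filter; the banned terms are made up for illustration):

```python
# A naive keyword blocklist of the kind described above.
BANNED_TERMS = {"indiana jones", "pikachu"}

def is_blocked(prompt: str) -> bool:
    """Reject a prompt if it contains any banned term verbatim."""
    lowered = prompt.lower()
    return any(term in lowered for term in BANNED_TERMS)

# The literal name is caught...
print(is_blocked("Draw Indiana Jones"))  # True
# ...but a descriptive paraphrase sails straight through.
print(is_blocked("Draw an adventurous archaeologist with a whip and fedora"))  # False
```

    Exactly like a spam filter keyed on "China" and "tire": any rephrasing defeats it.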

neomantra 16 hours ago

> Maybe Studio Ghibli making it through the seemingly deterministic GPT guardrails was an OpenAI slip up, a mistake,

The author is so generous... but Sam Altman literally has a Ghibli-fied social profile, and in response to all this said OpenAI chooses its demos very carefully. His primary concern is that Ghibli-fying prompts are over-consuming their GPU resources, degrading the service by preventing other ChatGPT tasks.

  • gambiting 16 hours ago

    The official White House account has been posting ghiblified images too, Altman knows that as long as he's not critical of the current administration he's untouchable.

    • slig 16 hours ago

      >he's untouchable

      Doesn't he have a pretty bad disagreement with Elon?

coderenegade a day ago

I don't see why this is an issue? The prompts imply obvious and well-known characters, and don't make it clear that they want an original answer. Most humans would probably give you similar answers if you didn't add an additional qualifier like "not Indiana Jones". The only difference is that a human can't exactly reproduce the likeness of a famous character without significant time and effort.

The real issue here is that there's a whole host of implied context in human languages. On the one hand, we expect the machine to not spit out copyrighted or trademarked material, but on the other hand, there's a whole lot of cultural context and implied context that gets baked into these things during training.

  • dgunay a day ago

    I think the point is that for a lot of them there are endless possible alternatives to the character design, but it still generates one with the exact same design. Why can't, for example, the image of Tomb Raider have a different colored tank top? Why is she wearing a tank top and not a shirt? Why does she have to have a gun? Why is she a busty, attractive brunette? These are all things that could be different but the dominance of Lara Croft's image and strong association with the words "tomb raider" in popular culture clearly influences the model's output.

    • coderenegade a day ago

      Because it's not clear that that's what you want. What's the context? Are we playing a game where I guess a character? Is it a design session for a new character based on a well known one, maybe a sidekick? Is it a new take on an old character? Are you just trying to remember what a well-known character looks like, and giving a brief prompt?

      It's not clear what the asker wants, and the obvious answer is probably the culturally relevant one. Hell, I'd give you the same answers as the AI did here if I had the ability to spit out perfect replicas.

    • echoangle a day ago

      And how is that bad or surprising? It’s actually what I would expect from how AI works.

      • SV_BubbleTime a day ago

Exactly. We designed systems that work on attention and inference… and then we're surprised that they return popular results?

  • mvieira38 a day ago

    It's an IP theft machine. Humans wouldn't be allowed to publish these pictures for profit, but OpenAI is allowed to "generate" them?

    • victorbjorklund a day ago

      I would 100% be allowed to draw an image of Indiana Jones in illustrator. There is no law against me drawing his likeness.

    • why_at a day ago

      I'm honestly trying to wrap my head around the law here because copyright is often very confusing.

      If I ask an artist to draw me a picture of Indiana Jones and they do it would that be copyright infringement? Even if it's just for my personal use?

      • bawolff a day ago

Probably that would be a derivative work, which means the original owner would have some copyright in it.

It may or may not be fair use, which is a complicated question (IANAL).

      • Avicebron a day ago

IANAL, but say OpenAI makes money/commercial gains from producing a Ghibli-esque image when you ask - for instance, via the subscription you pay to OpenAI. What percentage of that subscription is owed to Ghibli for running Ghibli art through OpenAI's gristmill and providing the ability to create an image with that "vibe/style", etc.? How long into perpetuity is OpenAI allowed to re-use that original art whenever their model produces such an image? That seems to be the question.

      • xboxnolifes a day ago

        I would think yes. Consider the alternate variation where the artist proactively draws Indiana Jones, in all his likeness, and attempts to market and sell it. The same exchange is ultimately happening, but this clearly is copyright infringement.

    • pwarner a day ago

To me a lot has to do with what a human does with them once the tool generates them, no?

    • Smithalicious a day ago

      Won't somebody think of the billionaire IP holders? The horror.

      • asadotzler a day ago

        And the small up and coming artists whose work is also stolen, AI-washed, and sold to consumers for a monthly fee, destroying the market for those up and coming artists to sell original works. You don't get to pretend this is only going to hurt big players when there are already small players whose livelihoods have been ruined.

  • jmull a day ago

    Normally (well, if you're ethical) credit is given.

    Also, there are IP limits of various sorts (e.g. copyright, trademark) for various purposes (some arguably good, some arguably bad), and some freedoms (e.g., fair use). There's no issue if this follows the rules... but I don't see where that's implemented here.

    It looks like they may be selling IP they don't own the right to.

samspot 10 hours ago

This makes AI image generation very boring. I don't want to generate pictures I can find on google, I want to make new pictures.

I found apple's tool frustrating. I have a buzzed haircut, but no matter what I did, apple was unable to give me that hairstyle. It wants so bad for my avatar to have some longer hair to flourish, and refuses to do anything else.

mlsu a day ago

I was really hoping that the conversation around AI art would at least be partially centered on the perhaps now dated "2008 pirate party" idea that intellectual property, the royalty system, the draconian copyright laws that we have today are deeply silly, rooted in a fiction, and used over and over again, primarily by the rich and powerful, to stifle original ideas and hold back cultural innovation.

Unfortunately, it's just the opposite. It seems most people have fully assimilated the idea that information itself must be entirely subsumed into an oppressive, proprietary, commercial apparatus. That Disney Corp can prevent you from viewing some collection of pixels, because THEY own it, and they know better than you do about the culture and communication that you are and are not allowed to experience.

It's just baffling. If they could, Disney would scan your brain to charge you a nickel every time you thought of Mickey Mouse.

  • kokanee a day ago

    The idea of open sourcing everything and nullifying patents would benefit corporations like Disney and OpenAI vastly more than it would benefit the people. The first thing that would happen is that BigCorp would eat up every interesting or useful piece of art, technology, and culture that has ever been created and monetize the life out of it.

    These legal protections are needed by the people. To the Pirate Party's credit, undoing corporate personhood would be a good first step, so that we can focus on enforcing protections for the works of humans. Still, attributing those works to CEOs instead of corporations wouldn't result in much change.

    • pixl97 a day ago

      >The first thing that would happen is that BigCorp would eat up every interesting or useful piece of art, technology, and culture that has ever been created and monetize the life out of it.

      Wait, I'm still trying to figure out the difference between your imaginary world and the world we live in now?

      • Lerc a day ago

        I think the main difference is if everything were freely available they may attempt to monetize the life out of it, but they will fail if they can't actually provide something people actually want. There's no more "You want a thing so you're going to buy our thing because we are the exclusive providers of it. That means we don't even have to make it very good"

        If anyone in the world could make a Star Wars movie, the average Star Wars movie would be much worse, but the best 10 Star Wars movies might be better than what we currently have.

      • dragontamer a day ago

        Thor would have red hair in the imaginary world, rather than being the blond man who was created to be a somewhat distinguished comic book character.

        The Disney or otherwise copyrighted versions allow for unique spins on these old characters to be re-copyrighted. This Thor from Disney/Marvel is distinguished from Thor from God of War.

        • runarberg a day ago

          > “Before starting the series, we stuffed ourselves to the gills with Norse mythology, as well as almost every other type of mythology – we love it all! But you’ve got to remember that these are legendary tales – myths – and no two versions are ever exactly the same. We changed a lot of things – for example, in most of the myths Thor has red hair, Odin has one eye, etc. But we preferred doing our own version.”

          https://scifi.stackexchange.com/questions/54400/why-did-earl...

          Huh, did not know that. As an Icelandic person I knew about Þór the Norse god much earlier than Thor the Marvel character. I never really pictured his hair color, nor knew he had a specific hair color in the mythology. I actually always pictured him with a beard though. What mostly mattered was his characteristics: his ill temper and drinking habits, and the fact that he was not a nice person, nor a hero, but rather a guy who starts shit that gets everyone else in trouble. He also wins every fight except one (he loses to Elli [the personification of old age]). From the little I’ve seen of him in the Marvel movies, he keeps almost none of these characteristics.

          EDIT: My favorite story of him is the depiction of the fall of Ásgarður, where Loki and some Jötun are about to use the gods’ vanity against them and con them into promising payment they cannot actually make for a wall around Ásgarður. Þór, being the way he is, cannot be around a Jötun without fighting and killing him. So rather than paying up (which the gods cannot do), Þór is sent to see this Jötun, knowing very well that the Jötun will be murdered. This betrayal is marked as the beginning of the end in Völuspá (verse 26).

    • dcow a day ago

      How do restaurants work, then? You can’t copyright a recipe. Instructions can’t generally be copyrighted, otherwise someone would own the fastest route from A to B and charge every person who used it. The whole idea of intellectual property gets really weird when you try to pinpoint what exactly is being owned.

      I do not agree with your conjecture that big corps would win by default. Ask why people would need protection from having their work stolen when the only ones wielding weaponized copyright are the corporations. People need the freedom to wield culture without restriction, not protection from someone having the same idea as them and manifesting it.

      • apersona 6 hours ago

        > I do not agree with your conjecture that big corps would win by default.

        Why wouldn't big corps win by default? They have the brand name, own the resources to make more polished version of any IP, and have better distribution channels than anyone else.

        Can you tell me how this scenario won't play out?

        1. Big corporation has people looking for new and trending IP.

        2. Instead of buying the rights to it, they get their army of people to produce more polished versions of it.

        3. Because they have branding and a better distribution channel, the money goes 100% to them.

        > Ask why people would need protection from having their work stolen when the only ones wielding weaponized copyright are the corporations.

        People working in the field sell their copyright like Gravity Falls' Alex Hirsch: https://x.com/_AlexHirsch/status/1906915851720077617

      • singleshot_ a day ago

        It’s more reasonable to say that the idea of intellectual property is challenging for nonlawyers because of the difficulty in understanding ownership not as one thing, but as a bundle of various elements of control, exclusion, obligation, or entitlement, even some of which spring into existence out of nowhere.

        In other words, the challenge is not to understand “what exactly is being owned,” and instead, to understand “what exactly being owned is.”

      • ipsento606 a day ago

        > How do restaurants work, then?

        Primarily because recipe creation is not one of the biggest cost centers for restaurants?

      • hammock a day ago

        > How do restaurants work, then? You can’t copyright a recipe.

        They barely work. Recipes are trade secrets, and the cooks who use them are either paid very well, given NDAs, or given only part of the most guarded recipes.

      • api a day ago

        A restaurant is a small manufacturing facility that produces a physical product. It’s not the same at all.

      • awesome_dude a day ago

        Closed source - when was the last time your restaurant told you what was in, and how to make, your favourite dish?

        What's in Coca Cola?

        What are the 11 herbs and spices in Kentucky Fried Chicken?

        How do I make the sauce in a Big Mac?

    • dragonwriter a day ago

      > The idea of open sourcing everything and nullifying patents would benefit corporations like Disney and OpenAI vastly more than it would benefit the people.

      Disney would be among the largest beneficiaries if the right to train AI models on content was viewed as an exclusive right of the copyright holder; they absolutely do not benefit from AI training being considered fair use.

    • julianeon a day ago

      Maybe now, post-AI.

      But if you'd asked this question in 2015 or earlier, everyone would have said Disney -> pro-patent, average people & indie devs -> anti-patent. Microsoft was famously pro-patent, as were a number of nuisance companies that earned the label "patent troll."

      Honestly, this idea of "patents to protect the people" would've come across as a corporate lawyer trick pre-2015.

    • csallen a day ago

      This is the exact opposite of the truth.

      Look at YouTube. Look at SoundCloud. Look at all the fan fiction sites out there, internet mangas and manhwas and webtoons, all the podcasts, all the influencers on X and Instagram and TikTok and even OnlyFans, etc etc. Look at all the uniquely tiny distribution channels that small companies and even individuals are able to build in connection with their fans and customers.

      There is endless demand for the endless variety of creativity and content that's created by normal people who aren't Disney, and endless ways to get it into people's hands. It is literally impossible for any one company to hoover all of it up and somehow keep it from the people.

      In fact, the ONLY thing that makes it possible for them to come close to doing that is copyright.

      And the only reason we have such a huge variety of creativity online is because people either (a) blatantly violate copyright law, or (b) work around gaps in copyright law that allow them to be creative without being sued.

      The idea that we need copyrights to protect us from big companies is exactly wrong. It's the opposite. Big companies need copyright to protect their profits from the endless creativity and remixing of the people.

    • echelon a day ago

      The original claim is false,

      > intellectual property [...] used over and over again, primarily by the rich and powerful, to stifle original ideas and hold back cultural innovation.

      There's nothing about IP which prevents you from creating your own. There are, in fact, a near infinite number of things you can create. More things than there exist stars in our galaxy.

      The problem with ideas is that they have to be good. They have to be refined. They have to hit the cultural zeitgeist, solve a particular problem, or just be useful. That's the hard part that takes the investment of time and money.

      In the old world before Gen AI, this was the hard thing that kept companies in power. That world is going away fast, and now creation will be (relatively) easy. More taste makers will be slinging content and we'll wind up in a land of abundance. We won't need Disney to give us their opinion on Star Wars - we can make our own.

      The new problem is distributing that content.

      > The idea of open sourcing everything and nullifying patents would benefit corporations like Disney and OpenAI vastly more than it would benefit the people. The first thing that would happen is that BigCorp would eat up every interesting or useful piece of art, technology, and culture that has ever been created and monetize the life out of it.

      Unless the masses can create and share on equal footing, you're 100% right.

      If it turns out, however, that we don't need Google, OpenAI, or big tech to make our own sci-fi epics, share them with a ton of people, and interact with friends and audiences, then the corporations won't be able to profit off of it.

      If social networks were replaced with common carriers and protocols.

      If Gen AI could run at the edge without proprietary models or expensive compute.

      If the data of YouTube, Reddit, Twitter, Instagram didn't require hyperscaler infra to store, search, and serve.

      Unfortunately, there are too many technical reasons why the giants will win. And network effects will favor the few versus many. Unless those parameters change, we'll be stuck with big tech distribution.

      Even if the laws around IP change, the hard tech challenges keep the gatekeepers in power. The power accrues to those who can dominate creation (if creation is unilateral), or even more so, to the distributors of that content.

      • dcow a day ago

        > We won't need Disney to give us their opinion on Star Wars - we can make our own.

        Disney would say that you can’t. And in the current copyright regime, it’s not unlikely that they’d convince the court that they’re right.

        • echelon a day ago

          > Disney would say that you can’t.

          Disney won't have any control. I can already generate images and videos locally on my hardware.

          Maybe they'll try to stop distribution? There will be quite a lot of people making these, though.

      • api a day ago

        This is the same argument we made in the 90s about what the web was going to do. What ended up happening was the growth of aggregators and silos like Facebook that baited everyone with ease of use into putting everything into their walled garden and then monetized it. The creators, namely the posters of the content, got nothing.

        The same is happening already with AI creations. Doing it yourself is work and takes some technical skill, so most people use hosted AI services. Guess who makes all the money?

        You will be able to create and share your own spin on Star Wars. You won’t see anything for that except maybe cred or some upvotes. The company that hosts it and provides the gateway and controls the algorithms that show it to people will get everything.

      • codedokode a day ago

        > The problem with ideas is that they have to be good.

        No they don't, look at music popular in social networks.

        > and now creation will be (relatively) easy. More taste makers will be slinging content and we'll wind up in a land of abundance.

        Even before the generative AI, I think we live in the era where there are more creators than ever in history: everybody today can publish their music or art without any large investments (except for instruments: they are expensive as always). I would prefer we have cheaper pianos, samples and microphones instead of worthless music-copying models.

  • eaglelamp a day ago

    If we are going to have a general discussion about copyright reform at a national level, I'm all for it. If we are going to let billion dollar corporations break the law to make even more money and invent legal fictions after the fact to protect them, I'm completely against it.

    Training a model is not equivalent to training a human. Freedom of information for a mountain of graphics cards in a privately owned data center is not the same as freedom of information for flesh and blood human beings.

    • r3trohack3r a day ago

      You’re setting court precedent that will apply equally to OpenAI as it does to the llama.cpp and stable diffusion models running on your own graphics card.

      • photonthug a day ago

        I don’t know about that, we seem to be so deeply into double standards for this stuff that we’ve forgotten they are double standards. If I aggressively scrape content from anywhere and everywhere ignoring robots.txt and any other terms and conditions, then I’ll probably be punished. Corporate crawlers that are feeding the beast just do this on a massive scale and laugh off all of the complaints, including those from smaller corporations who hire lawyers..

      • munificent a day ago

        SGTM.

        Honestly, seriously. Imagine some weird Thanos showed up, snapped his fingers and every single bit of generative AI software/models/papers/etc. were wiped from the Earth forever.

        Would that world be measurably worse in any way in terms of meaningful satisfying lives for people? Yes, you might have to hand draw (poorly) your D&D character.

        But if you wanted to read a story, or look at an image, you'd have to actually connect with a human who made that thing. That human would in turn have an audience for people to experience the thing they made.

        Was that world so bad?

      • codedokode a day ago

        Can stable diffusion be created without using copyrighted content? Maybe we should have some exemption for non-commercial research but definitely not for commercial exploitation or generating copyrighted images using open-source models.

    • robocat a day ago

      > invent legal fictions after the fact

      You're reading into the situation...

      For the US getting legislators to do anything is impossible: even the powerful fail.

      When a legal system is totally roadblocked, what other choice is there? The reason all startups ask forgiveness is that permission is not available.

      (edit). Shit. I guess that could be a political statement. Sorry

  • tastyface a day ago

    A different way of looking at it: AI, by design, defaults to regurgitating the poppiest of pop culture content. Every whip-wielding archaeologist is now Harrison Ford. Every suave British spy is now Daniel Craig. With the power of AI, creativity is dead and buried.

    • slg a day ago

      This is what was often missed in the previous round of AI discourse that criticized these companies for forcing diversity into their systems after the fact. Every suave spy being Daniel Craig is just the apolitical version of every nurse being a woman or every criminal being Black. Converging everything to the internet's most popular result represents an inaccurate and dumbed-down version of the world. You don't have to value diversity as a concept at all to recognize this as a systemic flaw of AI; it is as easy as recognizing that Daniel Craig isn't the only James Bond, let alone the only "suave English spy".

      • dcow a day ago

        It’s only a flaw insofar as the tool is used in ways where that property is problematic. Stereotypes are used for good and bad all the time; let’s not pretend that we have to attack every problem with a funky shaped hammer because we can’t admit that it’s okay to have specialized tools in the tool belt.

    • sejje 7 hours ago

      Why does the AI have to inject the creativity? It's supposed to guess what you want and generate it. The prompts in the article make it clear the author wants Harrison Ford.

      If you ask it for a female adventure-loving archaeologist with a bullwhip, you think you'll get Harrison Ford?

      What if you ask for a black man? Etc etc.

      You're talking about how unoriginal it is when the human has asked it in the least creative way. And it gives you what you want (when the content filters don't spot it).

    • card_zero a day ago

      The backlash against AI compels creative types to be more original, maybe. It could be that AI improves culture by reflecting it in insipid parody, with the implicit message "stop phoning it in".

    • darioush a day ago

      don't you think it is empowering and inspiring for artists? They can try several drafts of their work instantaneously, checking out various compositions etc. before even starting the manual art process.

      they could even input/train it on their own work. I don't think someone can use AI to copy your art better than the original artist.

      Plus art is about provenance. If we could find a scrap piece of paper with some scribbles from Picasso, it would be art.

      • Kim_Bruning a day ago

        This does seem to work for writing. Feed your own writing back in and try variations / quickly sketch out alternate plots, that sort of thing.

        Then go back and refine.

        Treat it the same as programming. Don't tell the AI to just make something and hope it magically does it as a one-shot. Iterate, combine with other techniques, make something that is truly your own.

    • SirMaster 11 hours ago

      But why Daniel Craig and not Pierce Brosnan?

    • autoexec a day ago

      > A different way of looking at it: AI, by design, defaults to regurgitating the poppiest of pop culture content.

      That's the whole problem with AI. It's not creative. There's no "I" in AI. There's just what we feed it and it's a whole lot of "garbage in, garbage out". The more the world is flooded with derivative AI slop the less there will be of anything else to train AI on and eventually we're left with increasingly homogenized and uncreative content drowning out what little originality is still being made without AI.

  • II2II a day ago

    > That Disney Corp can prevent you from viewing some collection of pixels, because THEY own it

    A world without copyright is just as problematic as a world with copyright. With copyright, you run into the problem of excessive control. This wasn't too much of a problem in the past. If you bought a book, record, or video recording, you owned that particular copy. You could run into disagreeable situations because you didn't own the rights, but it was difficult to prevent anyone from viewing a work once it had been published. (Of course, modern copyright laws and digital distribution have changed that.)

    On the flip side, without copyright, it would be far easier for others to exploit (or even take credit for) the work of another person without compensation or recourse. Just look at those AI "generated" images, or any website that blatantly rips off the content created by another person. There is no compensation. Heck, there isn't even credit. Worse yet, the parties misrepresenting the content are doing their best to monetize it. Even authors who are more than willing to give their work away have every right to feel exploited under those circumstances. And all of that is happening with copyright laws in place, where there is the opportunity for recourse if you have the means and the will.

    • dcow a day ago

      You don’t need credit to talk about pop culture. I don’t need to credit the Indiana Jones copyright holder when I paint a stunning likeness of Ford in a khaki outfit with a whip, even if the holder might try to sue me over it. Copyright and credit aren’t the same.

      • paulryanrogers a day ago

        There are also trademark protections. I heard Ford actually trademarked his likeness to ensure he got a piece of the merchandise action.

    • singpolyma3 a day ago

      To reply to the parenthetical, copyright has nothing to do with credit. Taking credit for someone else's work is banned in some places in some contexts (they call this a moral rights regime) but not the same thing as what is being talked about when people say copyright (which is about copying and performing)

    • idiotsecant a day ago

      The idea that someone can't use ideas without someone else making money from it is a really, really, radically weird idea and is very new in the history of human society.

  • xorcist a day ago

    I think what you observe is more like a natural blowback to the prevailing idea that this is somehow beyond critique because it will fundamentally change culture and civilization forever.

    There's a bit of irony here too. The intellectual discourse around intellectual property is a diverse and lively one from an academic standpoint: the free and open source software movements, software patents, the piracy movement and so on have analyzed the history, underlying ideas and values in detail for the past thirty years. Most people know roughly what is at stake, where they stand, and can defend their position in an honest way.

    Then comes new technology, everyone and their mother gets excited about it, and steamrolls all those lofty ideas into "oh look at all the shiny things it can produce!". Be careful what you wish for.

    • achierius a day ago

      Let's be clear. You can be for free software, against copyright, etc., and STILL be in favor of these firms being punished for violating copyright as they have. Because frankly, we -- normal people -- have always known that we would be punished if we did anything close to this: so many people have been thrown in jail, even killed themselves, because they distributed some film or hosted some books. But now, when a big corporation does it, and in doing so seeks to replace and impoverish thousands, millions of hard-working, law-abiding people, now is when we should expect the government to finally say -- oh, that copyright thing was silly all along? No. Perhaps if the deal was that the whole system would go away entirely -- that we, too, could do what these firms have done. But that's not what's being proposed. That will not happen. They want the laws to be for them, not for us, and I will always be opposed to attempts at actualizing that injustice.

      • FeepingCreature 19 hours ago

        IMO the natural effect of this will be to massively devalue any individual cultural artifact, and that this will also achieve the benefit of defanging the big copyright holders. Is it the right way to go about it? No. Is it an insult to anyone who ever got nabbed for piracy? Sure. But tbh as a pirate voter I'll still very much take it.

  • ryandrake a day ago

    Not just some particular collection of pixels, but an infinite number of combinations of collections of pixels, any of which remotely evoke a shadow of similarity to hundreds of "properties" that Disney lays claim to.

    • codedokode a day ago

      But why do you want to make a collection of pixels that resembles existing characters and not create your own?

  • chimpanzee a day ago

    Essentially: “information wants to be free”.

    I agree.

    But this must include the dissolution of patents. Otherwise corporations and the owners of the infrastructure will simply control everything, including the easily replicable works of individuals.

    • j-bos a day ago

      At least patents only last 20 years, as opposed to a century or more for copyright.

      • paulryanrogers a day ago

        In practice it's often longer. Drug companies queue up minor tweaks to their formulas and can threaten to sue anyone even close to the new way, even carbon copies of the now expired patent. Few can afford to win a lawsuit.

        We need more courts and judges to speed the process, to make justice more accessible, and universal SLAPP protections to weed out frivolous abuse.

    • codedokode a day ago

      I am against dissolution of patents if the technology took lot of research. In this case the patent protects from others copying the result of research.

      However, obvious patents like "a computer system with a display displaying a product and a button to order it" should not be allowed. Also, software patents should not exist (copyright is enough).

      • wsintra2022 a day ago

        What if all that research led to some incredible, world-changing-for-the-better idea/concept/product? In an open society it would benefit everyone; in the closed society, only those allowed to use the patent benefit.

  • r0s a day ago

    It's not baffling in the least.

    No matter the extent you believe in the freedom of information, few believe anyone should then be free to profit from someone else's work without attribution.

    You seem to think it would be okay for disney to market and charge for my own personal original characters and art, claiming them as their own original idea. Why is that?

    • raspyberr a day ago

      Yes. I 100% unironically believe that anyone should be able to use anyone else's work royalty/copyright free after 10-20 years instead of 170 in the UK. Could you please justify why 170 years is in any way a reasonable amount of time?

      • card_zero a day ago

        Copyright lasts 70 years after the death of the author, so 170 years would be rare (indeed 190 years would be possible). This was an implementation of a 1993 EU directive:

        https://en.wikipedia.org/wiki/Copyright_Duration_Directive

        That itself was based on the 1886 Berne Convention. "The original goal of the Berne Convention was to protect works for two generations after the death of the author". 50 years, originally. But why? Apparently Victor Hugo (he of Les Miserables) is to blame. But why was he bothered?

        Edit: it seems the extension beyond the death of the author was not what Hugo wanted. "any work of art has two authors: the people who confusingly feel something, a creator who translates these feelings, and the people again who consecrate his vision of that feeling. When one of the authors dies, the rights should totally be granted back to the other, the people." So I'm still trying to figure out who came up with it, and why.

      • r0s 5 hours ago

        "use" vs. sell is the problem here. Or do you think they are the same?

      • codedokode a day ago

        May I ask why you want to use someone's work instead of creating your own?

  • furyofantares a day ago

    I think we have all grown up with pervasive strong IP rights, and most people have come to internalize it as a matter of fairness or an almost natural right, rather than a practical tool designed to incentivize creation.

    And then even if you get past that, the world is filled with lots of IP we love, and it is easy to imagine weakened IP rights taking that away, but quite difficult to imagine what weakened IP rights might buy us.

    I do have some hope still that this generative AI stuff will give a glimpse into the value of weaker IP rights and maybe inspire more people to think critically about it. But I think it is an uphill battle. Or maybe it will take younger people growing up on generative AI to notice.

  • gerdesj a day ago

    How do you suggest you protect your "thing"?

    * If I make a thing that is different and I get a patent - cool.

    * If I create a design that is unusual and I get copyright on it - is that cool?

    Both concepts - patent and copyright - are somewhat controversial, for multiple reasons.

    If you invented a thingie, would you not want some initial patent related protection to allow you to crack on with some sort of clout against cough CN? If you created a film/creative thang, would you not want some protection against your characters being ... subverted.

    Patents and copyright are what we have - do you have any better ideas?

  • serviceberry a day ago

    What's the damage to the society done by Disney holding the rights to Mickey Mouse? Like, if we're being honest?

    Patents, sure. They're abused and come at a cost to the society. But all we've done here is created a culture where, in some sort of an imagined David-vs-Goliath struggle against Disney, we've enabled a tech culture where it's OK to train gen AI tech on works of small-scale artists pilfered on an unprecedented scale. That's not hurting Disney. It's hurting your favorite indie band, a writer you like, etc.

    • fiddlerwoaroof a day ago

      It’s worse in music: the folk music that came before recorded music had a long history of everyone borrowing and putting their own spin on someone else’s tune and, today, this is viewed as some kind of assault on the originator of the tune.

      If companies can’t gatekeep our artistic culture for money, we’ll be better able to enjoy it.

  • WhyOhWhyQ a day ago

    We're about to witness a fundamental shift in the human experience. Some time in the near future there will not be a single act of creation you can do that isn't trivial compared to the result of typing "make cool thing please now" into the computer. And your position is to add to the problem because with your policy anything I create should get chucked into the LLM grinder by any and everybody. How do I get my human body to commit to doing hard things with that prospect at hand? This is the end of happiness.

    • redwood a day ago

      This is why I love making bread

      • GPerson a day ago

        We can’t all be bread making hedonists. Some of us want these finite lives to mean more than living constantly in the moment in a state of baking zen.

    • card_zero a day ago

      I don't know, that sounds like the basic argument for copyright: "I created a cool thing, therefore I should be able to milk it for the rest of my life". Without this perk, creatives are less motivated. Would that be bad? I guess an extreme version would be a world where you can only publish anonymously and with no tangible reward.

      • jkhdigital a day ago

        I hate to paint with such a broad brush, but I’d venture that “creatives” are not primarily motivated by profit. It is almost a truism that money corrupts the creative endeavour.

        • card_zero a day ago

          There are various ways to turn creativity into money, even without publishing any kind of artwork. Basically all skilled jobs and entrepreneurial enterprises require creativity. And if you do have an artwork, you can still seek profit through acclaim, even without copyright: interviews, public appearances. Artists once had patrons - but that tends to put aristocrats in control of art.

          So money will motivate a lot of the creativity that goes on.

          Meanwhile, if you dabble in some kind of art or craft while working in a factory to make ends meet, that kind of limits you to dabbling, because you'll have no time to do it properly. Money also buys equipment and helpers, sometimes useful.

          On the other hand, yes, it ruins the art. There's a 10cc song about that. https://en.wikipedia.org/wiki/Art_for_Art%27s_Sake_(song)

          Though, this reminds me of an interesting aside: the origin of the phrase "art for art's sake" was not about money, but about aesthetics. It meant something like "stop pushing opinions, just show me a painting".

  • masfuerte a day ago

    I don't really care.

    Either enforce the current copyright regime and sue the AI companies to dust.

    Or abolish copyright and let us all go hog wild.

    But this halfway house where you can ignore the law as long as you've got enough money is disgusting.

    • dragonwriter a day ago

      Or treat AI training as within the coverage of the current fair use regime (which is certainly defensible within the current copyright regime), while prosecuting the use of AI models to create infringing copies and derivative works that do not themselves have permission or a reasonable claim to be within the scope of fair use as a violation (and prosecuting hosted AI firms for contributory infringement where their actions with regard to such created infringements fit the existing law on that).

      • Wowfunhappy a day ago

        ^ I feel like I almost never see this take, and I don't understand why, because frankly it strikes me as patently obvious! Of course the tool isn't responsible, and the person who uses it is.

      • prawn a day ago

        I see AI training on public material like I would upcoming artists being inspired by the artists before them. Obviously the scale is very different. I don't mind your scenario because an AI firm, if they couldn't stay on top of what their model was creating, could voluntarily reduce the material used to train it.

        • codedokode a day ago

          You imply that the AI model is creating new works and not merely rearranging pieces from other works that you never saw and therefore might consider novel. An AI model is not currently a model of a creative human: a human doesn't need to listen to a million songs to create his own.

    • ryandamm a day ago

      This may not be a particularly popular opinion, but current copyright laws in the US are pretty clearly in favor of training an AI as a transformative act, and covered by fair use. (I did confirm this belief in conversation with an IP attorney earlier this week, by the way, though I myself am not a lawyer.)

      The best-positioned lawsuits to win, like NYTimes vs. OpenAI/MS, are actually based on violating terms of use, rather than infringing at training time.

      Emitting works that violate copyright is certainly possible, but you could argue that the additional entropy required to pass into the model (the text prompt, or the random seed in a diffusion model) is necessary for the infringement. Regardless, the current law would suggest that the infringing action happens at inference time, not training.

      I'm not making a claim that the copyright should work that way, merely that it does today.

      • codedokode a day ago

        > Regardless, the current law would suggest that the infringing action happens at inference time, not training.

        Zuckerberg downloading a large library of pirated articles does not violate any laws? I think you can get a life sentence for merely posting links to the library.

      • photonthug a day ago

        > The best-positioned lawsuits to win, like NYTimes vs. OpenAI/MS, is actually based on violating terms of use, rather than infringing at training time.

        I agree with this, but it's worth noting this does not conflict with and kind of reinforces the GP's comment about hypocrisy and "[ignoring] the law as long as you've got enough money".

        The terms of use angle is better than copyright, but most likely we'll never see any precedent created that allows this argument to succeed on a large scale. If it were allowed then every ToS would simply begin to say Humans Only, Robots not Welcome or if you're a newspaper then "reading this you agree that you're a human or a search engine but will never use content for generative AI". If github could enforce site terms and conditions like that, then they could prevent everyone else from scraping regardless of individual repository software licenses, etc.

        While the courts are setting up precedent for this kind of thing, they will be pressured to maintain a situation where terms and conditions are useful for corporations to punish people. Meanwhile, corporations won't be able to punish corporations for the most part, regardless of the difference in size. But larger corporations can ignore whatever rules they want, to the possible detriment of smaller ones. All of which is more or less status quo

      • o11c a day ago

        Training alone, perhaps. But the way the AIs are actually used (regardless of prompt engineering) is a direct example of what is forbidden by the case that introduced the "transformative" language.

        > if [someone] thus cites the most important parts of the work, with a view, not to criticize, but to supersede the use of the original work, and substitute the review for it, such a use will be deemed in law a piracy.

        Of course, we live in a post-precedent world, so who knows?

    • mlsu a day ago

      The hypocrisy is obviously disgusting.

      It also shows how, at the end of the day, none of the justifications for this intellectual property crap are about creativity, preserving the rights of creators, or any lofty notion that intellectual property actually makes the world a better place, but rather, it is a naked power+money thing. Warner Bros and Sony can stop you from publishing a jpeg because they have lawyers who write the rulebook. Sam Altman can publish a jpeg because the Prince of Saud believes that he is going to build for corporate America a Golem that can read Excel spreadsheets.

  • rglullis a day ago

    > It seems most people have fully assimilated the idea that information itself must be entirely subsumed into an oppressive, proprietary, commercial apparatus.

    No, the idea is that the rules need to be changed in a way that is valid for everyone, not just for megacorporations trying to exploit others' works and gatekeep them behind "AI".

  • kevin_thibedeau a day ago

    Consider that one day you may wish to author a creative work and derive financial benefit from that labor. There is legitimate use for limited time ownership of reproducible cultural artifacts. Extending that to 95 years is the problem.

    • narcraft a day ago

      I wish to one day derive financial benefit from hitting myself with a hammer for 8 hours a day. Should we construct a legal apparatus to guarantee that I am able to do so?

      Edit: the point I want to illustrate is that we do not get to choose what others value, or to dictate what is scarce, and no one is entitled to make a living in a specific way even if they really want to.

      • loki-ai a day ago

        It is a bad analogy, especially because we value that so much that we are even discussing how to have more of it.

  • boplicity a day ago

    > used over and over again, primarily by the rich and powerful

    This is where the argument falls apart. Not because the copyright isn't used by the rich and powerful, but because it misses the fact that copyright also grants very powerful rights to otherwise powerless individuals, thus allowing for many small businesses and individuals to earn a living based on the rights granted by our current copyright system.

    • fiddlerwoaroof a day ago

      Rights you basically can’t use without a lot of money

      • boplicity 11 hours ago

        Um, no. I use these rights all of the time, and often enforce them, and am not at all wealthy.

  • [removed] a day ago
    [deleted]
  • a_bonobo a day ago

    >information itself must be entirely subsumed into an oppressive, proprietary, commercial apparatus

    I think that's the reason why I (and probably many others?) have cooled down on general open source coding.

    Open source started when well-paid programmers used their stable positions and ample extra time to give back to the community. What happened is that corporations then siphoned up all that labor and gave nothing back, just like the AI bros siphoned up all data and gave nothing back. The 'contract' of mutual exchange, of bettering each other, was always a fantasy. Instead, the companies took away that ample extra time and those stable positions.

    Here we are in 2025 and sometimes I can't afford rent but the company C-tier is buying itself their hundredth yacht. Why should I contribute to your system?

  • onlyrealcuzzo a day ago

    > to stifle original ideas and hold back cultural innovation.

    How is copyright stifling innovation?

    You could not rip something off more blatantly than Gravity, which had the lawsuit dismissed entirely.

    Taurus vs Stairway to Heaven, the list goes on and on and on.

    You can often get away with murder when ripping off other people's stuff.

    • ppseafield a day ago

      Copyright makes the legality of arXiv and SciHub questionable at best. It locks publicly funded research behind paywalls. It makes being able to search the law (including case law) of the US incredibly expensive. It puts a burden on platforms to be beholden to DMCA takedowns, lest the content owner go to their hosting or DNS provider, as happened to itch.io. It adds licensing fees onto public musical performances (ASCAP).

      Additionally plenty of people making videos for YouTube have had their videos demonetized and their channels even removed because of the Content ID copyright detection scheme and their three strikes rule. In some cases to a ridiculous extent - some companies will claim ownership of music that isn't theirs and either get the video taken down or take a share of the revenue.

      I watched a video where someone wrote a song and registered it via CDBaby, which YouTube sources for Content ID. Then someone claimed ownership of the song, so YouTube assigned the third party 50% of the ad revenue of the video.

      • apersona 7 hours ago

        Let's separate the implementation of copyright and the concept of copyright. I don't think you would find anyone who would say the US's implementation of copyright is flawless, but the OP seems to be talking about the concept itself.

        > Additionally plenty of people making videos for YouTube have had their videos demonetized and their channels even removed because of the Content ID copyright detection scheme and their three strikes rule. In some cases to a ridiculous extent - some companies will claim ownership of music that isn't theirs and either get the video taken down or take a share of the revenue.

        Let's take YouTube videos as an example. If the concept of copyright doesn't exist, there is nothing stopping a YouTuber with millions more subscribers from seeing a trending video you made and uploading it themselves. Since they're the one with the most subs, they will get the most views.

        The rewards will always go to the brand that people know best rather than to the video makers.

      • codedokode a day ago

        > Copyright makes the legality of arXiv

        Why? I thought that authors post the articles to arxiv themselves.

        > It locks publicly funded research behind paywalls.

        It is not copyright, it is scientists who do not want to publish their work (that they got paid for) in open access journals. And it seems the reason is that we have the system where your career advances better if you publish in paid journals.

    • fragmede a day ago

      Because it's self-indulgent wankery. If I, as a writer and an artist, have just the most absolutely brilliant thoughts, and write them down into a book or draw the most beautiful artwork, I can earn money off that well into my afterlife with copyright. Meanwhile the carpenter, who is no less bright, can only sell the chair he's built once. In order to make money off of it, he must labor to produce a second or even a third chair. Why does one person have to work harder than the other because of the medium they chose?

      Meanwhile in China, just because you invented a thing, you don't get to sit back and rest on your laurels, sipping champagne in hot tubs, because your competitor isn't staying put. He's grinding and innovating off your innovation, so you'd also better keep innovating.

      • TheOtherHobbes a day ago

        The only people making chairs by hand today are exceptionally well-paid artisanal craft carpenters and/or designers/studios.

        It's not at all unusual for popular/iconic furniture designs to be copyrighted.

        Reality is people who invent truly original, useful, desirable things are the most important human beings on the planet.

        Nothing that makes civilisation what it is has happened without original inventiveness and creativity. It's the single most important resource there is.

        These people should be encouraged and rewarded, whether it's in academia, industry, as freelance inventors/creators, or in some other way.

        It's debatable if the current copyright system is the best way to do that, because often it isn't, for all kinds of reasons.

        But the principle remains. Destroy rewards for original invention and creativity and you destroy all progress.

      • salynchnew a day ago

        One reason so many people are amenable to the copyright argument is that these counterarguments posit that every writer must be elitist or fabulously wealthy, instead of someone who spent X years toiling away at their craft or skill while working menial or multiple jobs.

        • fragmede a day ago

          Yeah, we should abolish copyright and make it so that creators get paid for every eyeball that looks at their content. First, we establish a total panopticon. Then you get paid when people engage with your content: the system records that a person watched your movie, no matter how they got a copy, that watch gets sent into the system, and you get paid out from it. No more copyright, just horribly invasive tracking of everything everywhere. Call it copythrough.

          That would never work, but it makes for decent sci-fi.

      • onlyrealcuzzo a day ago

        This has nothing to do with stifling innovation.

        I have yet to meet a writer who doesn't even attempt to write for fear that whatever they write will be found to be in violation of copyright (unless they are the type of writer who is always finding excuses not to write).

        Several people have made successful careers out of fan fiction...

      • codedokode a day ago

        I don't think it is that easy. Take musicians, for example. There are a few thousand who are the most popular and rich, some who can only fill a small club, and a long tail of people who can only play music on their day off. And now, with the development of generative models, their financial situation is only going to get worse.

      • absolutelastone a day ago

        The income from the book scales with its number of customers, versus roughly one person at a time who can enjoy the chair. It incentivizes finding ways to entertain more people with your effort.

  • mvdtnz a day ago

    I'm guessing you've never created something of value before. People are entitled to the fruits of their labour and control of their intellectual property.

    • jim-jim-jim a day ago

      If I paint a picture on a physical canvas, I can charge people to come into my house and take a look. If I bring the canvas to a park, I'm not entitled to say "s-stop looking at my painting guys!"

      If you're worried about your work being infinitely reproduced, you probably shouldn't work in an infinitely-reproducible medium. Digitized content is inherently worthless, and I mean that in a non-derisive way. The sooner we realize this, the richer culture will be.

      Really, all content is worthless. Historically, we've always paid for the transmission medium (tape, CD) and mistaken it for the cost of the art itself.

      • loki-ai a day ago

        and how do you reconcile any work in software development? If someone isn’t willing to work for free, should they just not work in the field at all? Do you think software culture would really be richer?

    • adamredwoods a day ago

      Accusatory clause aside, I agree: this is how a lot of "starving artists" get out of being starving.

    • HideousKojima a day ago

      >People are entitled to the fruits of their labour and control of their intellectual property.

      No they aren't; intellectual property is a legal fiction and ideas belong to all of humanity. Humanity did fine without intellectual property for thousands of years; it's a relatively recent creation.

    • Kim_Bruning a day ago

      > I'm guessing you've never created something of value before

      That's an interesting speculation. You realize that it could also be turned against you, right? Never a good idea!

      So, let's focus on the arguments rather than making assumptions about each other's backgrounds.

      > People are entitled to the fruits of their labour and control of their intellectual property.

      People are absolutely entitled to the fruits of their labour. The crucial question is whether the current system of 'IP' control – designed for scarcity – is the best way to ensure that, especially when many creators find it hinders more than it helps. That's why many people explore and use other models.

  • egypturnash a day ago

    Getting the megacorporations to sit up and take notice of this is about the only way the average independent artist has any hope of stopping this crap from destroying half our jobs. What'm I gonna do, sue OpenAI? Sam Altman makes more money sitting on the toilet taking a dump than I do in an entire year.

    I have no love for the Mouse but if I can get them and the image slop-mongers to fight then that's absolutely fine. It would be nice to have a functioning, vibrant public domain but it is also nice to not have some rich asshole insisting that all copyright laws must be ignored because if they properly licensed even a fraction of what they've consumed then it would be entirely too expensive to train their glorified autocomplete databases on the entire fucking internet for the purpose of generating even more garbage "content" designed to keep your attention when you're mindlessly scrolling their attention farms, regardless of how it makes you feel, and if I can choose one or the other then I am totally behind the Mouse.

  • rthomas6 a day ago

    More than giant corporations make IP. What about independent artists making original art?

  • myhf a day ago

    The problem with this kind of plagiarism isn't that it violates someone's specific copyright.

    But the discussion around plagiarism calls attention to the deeper issue: "generative" AI does not have emergent thinking or reasoning capabilities. It is just very good at obfuscating the sources of its information.

    And that can cause much bigger problems than just IP infringement. You could make a strategic decision based on information that was deliberately published by an adversary.

  • elicksaur a day ago

    Gonna submit that business model to a YC 2026 batch.

  • ToucanLoucan a day ago

    I can't speak for everyone obviously, but my anti-AI sentiment in this regard is not that IP law is flawless and beyond reproach, far from it. I'm merely saying that as long as we're all required to put up with it, that OpenAI and company should also have to put up with it. It's incredibly disingenuous the way these companies have taken advantage of publicly available material on an industrial scale, used said material to train their models "for research" and as soon as they had something that vaguely did what they wanted, began selling access to them.

    If they are indeed the output of "research" that couldn't exist without the requisite publicly available material, then they should be accessible by the public (and arguably, the products of said outputs should also be inherently public domain too).

    If they are instead created products to be sold themselves, then what is utilized to create them should be licensed for that purpose.

    Additionally, if they can be used to generate IP-violating material, then IMHO it makes perfect sense for the rights holders of those IPs to sue their asses like they would anyone else who did that and sold the results.

    Again, for emphasis: I'm not endorsing any of the effects of IP law. I am simply saying that we should all, from the poorest user to the richest corporation, be playing by the same rules, and it feels like AI companies' entire existence is hinging on their ability to have their IP cake and eat it too: they want to be able to restrict and monetize access to their generative models that they've created, while also having free rein to generate clearly, bluntly plagiarizing material, by way of utilizing vast amounts of in-good-faith freely given material. It's gross, and it sucks.

    • flats a day ago

      Very well put. I’m open to a future in which nothing is copyrighted & everything is in the public domain, but the byproduct of that public domain material should _also_ be owned by the public.

      Otherwise, we’re making the judgement that the originators of the IP should not be compensated for their labor, while the AI labs should be. Of course, training & running the models take compute resources, but the ultimate aim of these companies is to profit above & beyond those costs, just as artists hope to be compensated above & beyond the training & resources required to make the art in the first place.

      • loki-ai 21 hours ago

        as an artist, I totally agree with this approach. the whole idea of trying to pay artists for their contributions in training data is just impractical.

        if the data’s pulled from the public domain, the model built from this human knowledge should be shared with all creators too, meaning everyone should get access to it

    • Kim_Bruning a day ago

      Beware of pushing for rules that you don't personally believe in. You just might succeed a little too well, and have to live with the consequences.

  • soulofmischief a day ago

    It smells like a psyop, to be honest. Doesn't take much to get the ball rolling. Just more temporarily embarrassed millionaires sticking up for billionaires and corporations, buying their propaganda hook line and sinker, and propagating it themselves for free. Copyright is a joke, DMCA is a disgusting, selectively applied tool of the elite.

  • [removed] a day ago
    [deleted]
  • fullshark a day ago

    All those ideas were rationalizations because people didn't want to pay for stuff, just like your post, which effectively blames the victim of IP theft because corporations undeniably do suck, so we shouldn't care if they suffer.

  • codedokode a day ago

    I don't understand how protecting Disney characters prevents the development of art or science. Why do you need them at all? There is a lot of liberally licensed art, and I think there are more artists today than at any point in history.

    Also, making a billion-dollar business by using the hard work of talented people for free and without permission is not cool. The movie they downloaded from The Pirate Bay for free probably took man-years of work to make.

    Also, I wonder how we can be sure that the images produced by a machine are original and not a mix of images from unknown artists on DeviantArt. Maybe it is time to make a neural image-origin search engine?
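
    For what it's worth, a crude version of that idea doesn't even need a neural network. A toy sketch, purely illustrative (real attribution tools like the Stable Attribution mentioned below would use learned embeddings, and the 8x8 grayscale grids here stand in for downscaled images): fingerprint each image with an average hash and rank a corpus of candidate sources by Hamming distance.

```python
# Toy "image origin" search: fingerprint images with an average hash
# (aHash) and rank a corpus of candidate sources by Hamming distance.
# Images are represented as 8x8 grids of grayscale values (0-255).

def average_hash(pixels):
    """Return a 64-bit fingerprint: one bit per pixel, set if the
    pixel is at or above the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def rank_sources(query, corpus):
    """corpus: {name: 8x8 grid}. Names sorted most-similar first."""
    qh = average_hash(query)
    return sorted(corpus, key=lambda name: hamming(qh, average_hash(corpus[name])))
```

    Near-duplicates land within a few bits of each other, while a genuinely novel image sits far from everything in the corpus; that distance gap is the signal such a search engine would need.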

    • CaptainFever 18 hours ago

      For the last paragraph, it already exists: Stable Attribution.

      It doesn't work. If you put your handmade drawing inside, it'll also tell you what images were mixed to make it, even though it was entirely human-made.

  • Peritract 13 hours ago

    The issue here is that you think the problem is

    > intellectual property

    rather than

    > used over and over again, primarily by the rich and powerful, to stifle original ideas and hold back cultural innovation

    You're using those 2008 ideas now to defend the rich and powerful exploiting and stifling creativity; the problem hasn't changed, you've just swapped sides.

    OpenAI isn't the underdog here.