Comment by ben_w
By that tautology, so are rocks. Rocks don't get natural rights.
Also, you may note that "human rights" is a recent invention and not actually enforced worldwide even today.
> Consider that countries known for stronger interpretation of human rights and freedoms, including intellectual property rights, are also the countries at the forefront of innovation, including technical innovation that laid the foundation for LLMs in the first place. I think that is not a coincidence, and we should keep it in mind when there is a push to be dismissive of these concepts (which predominantly serves the interests of commercial LLM operators and their supply chain).
Several fallacies there.
First, because China and the USA are at opposite ends of the spectrum on many of the ways "freedom" is measured, and yet China is doing pretty well on the innovation front, including with AI. China is also beating Europe, even though various Scandinavian nations rank higher on such freedoms than the USA does.
Second, cum hoc ergo propter hoc: correlation does not imply causation. For example, one reason a big IP group in the USA (Hollywood) got big was that being in California let it avoid the IP rights of the Motion Picture Patents Company, which dominated cinema on the East Coast. I would even suggest that disregarding IP rights is what enables much of the web: not only how and why China is doing well, but also Google (which has had legal fights over the interaction between copyright and search results), social media, and cultural elements such as memes and reaction gifs.
Third: the point of copyright is to encourage new works, because this makes money which can be taxed. All this becomes somewhat irrelevant when AI can also create new works.
If you want to set a bar for creativity high enough that current AI can't reach it, I suspect quite a lot of human works also fail, e.g. that Pratchett's Strata is obviously Ringworld, and that you would exclude from copyright all parts of The Lion King that are based on Hamlet.
> I would still oppose the use of the natural vs. unnatural distinction as the basis of that argument, though.
I'm not sure what you're saying when you "oppose" this. Does that mean you accept that, in principle, there could be some AI which would deserve rights in the category currently (but in principle inaccurately) called "human rights"?
> China is doing pretty well on the innovation front, including with AI.
From transistors to transformers, most of it builds on a foundation that comes from guess where. The innovative layer you speak of is fairly thin.
> Correlation does not imply causation.
I give you that. However, without being able to re-run history, correlation is all we have.
> I would even suggest that it is the disregarding of IP rights that enables much of the web
I would suggest that much of the tech that powers the Web, including probably the most popular operating system on which most servers run, is enabled by copyleft, and copyleft cannot exist without the ability to defend it granted by IP rights, the very concept under fire.
> the point of copyright is to encourage new works
I agree on this.
> All this becomes somewhat irrelevant when AI can also create new works
I don’t agree with the phrase “AI can create new works”, for reasons such as: 1) “AI” is a meaningless term (let it be my revenge for consciousness), or 2) a tool without agency or will should not be X in a sentence “X can Y” (sure, we can maybe on occasion say “hammers can break things”, but if hammers having agency and will were a popular misconception, then I would definitely prefer to stick to “hammers can be used to break things”). The “create new works” part is also questionable on a few levels, but that might exceed the scope of this argument.
That aside, I believe lack of copyright enforcement discourages the creation of new works even in the presence of these tools, through the mechanism known as “why would I put effort into new work if I don’t effectively own the result”.
> Does that mean you accept that, in principle, there could be some AI which would deserve rights in the category currently (but in principle inaccurately) called "human rights"?
I think if we believe an LLM or some other software is sufficiently close to a human that it deserves human-like rights or just strong abuse protections (cf. octopus in some countries)—without saying whether I believe it possible or not, it really is orthogonal—then we could excuse it reciting some part of Harry Potter in the right context (probably not as work for hire), but it would be moot because we would also be ethically compelled to not subject it to the training and use that enables such recitation in the first place.
> From transistors to transformers, most of it builds on a foundation that comes from guess where. The innovative layer you speak of is fairly thin.
China rises to first place in most cited papers: https://www.science.org/content/article/china-rises-first-pl...
> 1) “AI” is a meaningless term (let it be my revenge for consciousness)
Fair.
But let's say "computer program" in that case. I'm not fussed about definitions.
> 2) a tool without agency or will should not be X in a sentence “X can Y”. Sure, we can maybe on occasion say “hammers can break things”, but if hammers having agency and will were a popular misconception, then I would definitely prefer to stick to “hammers can be used to break things”.
Careful.
If you say that humans can only create things with copyright (even if to support copyleft), then the proletariat are the tool that the bourgeoisie uses to create things.
I do not think this is what you intended :P
> That aside, I believe lack of copyright enforcement discourages the creation of new works even in presence of these tools, through the mechanism known as “why would I put effort into new work if I don’t effectively own the result”.
Same reason you commission a work, or even just buy it from a shop: because then you have the thing.
I mean, the cost of getting o3 to create a novel's worth of text is about the same as the price of a generic book by an unknown author in a second-hand shop: https://openai.com/api/pricing/
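A back-of-envelope sketch of that cost claim (the word count, tokens-per-word ratio, and per-token price below are illustrative assumptions of mine, not figures taken from the pricing page):

```python
# Rough cost of generating a novel's worth of text via an LLM API.
# All constants below are illustrative assumptions, not quoted prices.
NOVEL_WORDS = 80_000              # typical novel length
TOKENS_PER_WORD = 1.3             # rough rule of thumb for English prose
PRICE_PER_M_OUTPUT_TOKENS = 8.0   # assumed USD per million output tokens

tokens = NOVEL_WORDS * TOKENS_PER_WORD
cost = tokens / 1_000_000 * PRICE_PER_M_OUTPUT_TOKENS
print(f"~{tokens:,.0f} output tokens, roughly ${cost:.2f}")
# → ~104,000 output tokens, roughly $0.83
```

Even if the assumed price is off by a factor of a few, the result stays in the range of a cheap second-hand paperback, which is the comparison being made.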
I've not tried o3 yet, but I have tried o1, and as I've said on a different thread today, o1's output is merely OK, not worth publishing as a book — and I don't know how long it will take to get there. But it is displacing blog writers and podcast writers: https://news.ycombinator.com/item?id=44287953
> but it would be moot because we would also be ethically compelled to not subject it to the training and use that enables such recitation in the first place.
Surprising.
While I would seriously consider the possibility that it may be unethical to force such an AI to work if it didn't want to, I think giving it the capability, the education, to make that choice, rather than leaving it stuck at "it doesn't matter whether I wanted to or not, I can't", is just education, as per our own.
Still, I think that's coherent. I'm not sure I've fully internalised the implications so I will let it be.
> https://www.science.org/content/article/china-rises-first-pl...
Per capita?
> If you say that humans can only create things with copyright (even if to support copyleft), then the proletariat are the tool that the bourgeois use to create things.
The wealth gap and the divide are unlikely to be helped if more people are going to be using (and paying for, in whatever way) ML-based tech from a handful of large corporations.
> Same reason you commission a work, or even just buy it from a shop: because then you have the thing.
Simple possession is more about physical necessities. Commissioning or buying artwork from someone is not just about possessing it; it comes with supporting someone financially. I could make icons or basic illustrations for some small project myself, but I would still commission them if I could afford it, because that supports an artist who may want some work (as well as building up to more collaborations in the future). Here, I would be supporting the opposite of those artists: a thing that was built on those artists’ work without their consent. Some middleman megacorp of the worst kind.
> the cost of getting o3 to
Don’t they operate at a loss for the time being? They will have to make money sooner or later.
> While I would seriously consider the possibility it may be unethical to force such an AI to work if it didn't want to, I think giving it the capability, the education, to be capable of making that choice rather than just saying "it doesn't matter if I wanted to or not, I can't", is just education, as per our own.
This goes way beyond my thought. I assume that if we are talking about education, it would be a given that running it generating images 24/7 nonstop, shutting it down/killing it, etc., is already out of the question.
> you may note that "human rights" is a recent invention and not actually enforced worldwide even today.
Consider that countries known for stronger interpretation of human rights and freedoms, including intellectual property rights, are also the countries at the forefront of innovation, including technical innovation that laid the foundation for LLMs in the first place. I think that is not a coincidence, and we should keep it in mind when there is a push to be dismissive of these concepts (which predominantly serves the interests of commercial LLM operators and their supply chain).
I’m sure you would not argue that this recent interpretation of human rights is bad or incorrect, but if you would, then perhaps there’s not much of a constructive discussion to be had. I would still oppose using the natural vs. unnatural distinction as the basis of that argument, though.