- cross-posted to:
- technology@lemmy.world
Copyright class actions could financially ruin AI industry, trade groups say.
AI industry groups are urging an appeals court to block what they say is the largest copyright class action ever certified. They’ve warned that a single lawsuit raised by three authors over Anthropic’s AI training now threatens to “financially ruin” the entire AI industry if up to 7 million claimants end up joining the litigation and forcing a settlement.
Last week, Anthropic petitioned to appeal the class certification, urging the court to weigh questions that the district court judge, William Alsup, seemingly did not. Alsup allegedly failed to conduct a “rigorous analysis” of the potential class and instead based his judgment on his “50 years” of experience, Anthropic said.



I respectfully disagree. Meta was caught downloading books from Libgen, a piracy site, to “train” its models. What AI models do in effect is scan information (i.e., copy it), then distill and retain what they view as its essence. They can copy your voice, they can copy your face, and they can copy your distinctive artistic style. The only way they can do that is if the “training” copies and retains a portion of the original works.
Consider Shepard Fairey’s use of the AP’s copyrighted Obama photograph in the production of the iconic “Hope” poster, and the resultant lawsuit. While the suit was ultimately settled, and the issue of “fair use” was a close call given the variation in the artwork from the original source photograph, the suit easily could have gone against Fairey, so it was smart for him to settle.
Also consider the litigation surrounding the use of music sampling in original hip hop works, which has clearly been held to be copyright infringement.
Accordingly, I think it is very fair to say that (1) AI steals copyrighted works; and (2) repackages the essential portions of those works into new works. Might a rewrite of copyright law be in order to embrace this new technology? Sure, but if I’m an actor, voice actor, author, or other artist and I can no longer earn a living because someone else has taken my work and stripped it down to its essence to resell cheaply without compensating me, I’m going to be pretty pissed off.
Lol. The liberal utopia of Star Trek is a fantasy. Far more likely is that AI will be exploited by oligarchs to enrich themselves and further impoverish the masses, as they are fervently working toward right now. See, AI isn’t creative; it gives the appearance of being creative by stealing work created by humans and repackaging it. When artists can no longer survive by creating art, there will be less material for the AI models to steal, and we’ll be left with soulless AI slop as our de facto creative culture.
That action itself can and should be punished. Yes. But that has nothing to do with AI.
Is that what people think is happening? You don’t even have a layman’s understanding of this technology. At least watch a few videos on the topic.
I think that copying my voice makes this robot a T-1000, and T-1000s are meant to be dunked in lava to save Sarah Connor.
Absurd. It’s their entire fucking business model.
Meaning it would be illegal even if they weren’t doing anything with AI…
So what an AI does is the same thing as every human ever who has read/seen/listened to a work and then written more words influenced by that book/artwork/piece.
If you’ve ever done anything artistic in your life, you know that the first step is to look at what others have done. Even subconsciously you will pull from what you’ve seen and heard. To say that AI is not creative because it is derivative is to say that no human being in history has been creative.