Not all AI. Primarily just the useless LLM shit. And frankly, if that needs to steal literally everything from everyone without compensation to be viable, then it deserves to die.
Yes, and I think so too.
Think about what this ruling would even mean for small-scale open-source projects. This would really be the death of all AI in the US.
Let me take the opposing view. So what if AI trained on copyrighted books? The training set is 100 times larger than the model; it can't possibly store the materials. It learns reusable language abstractions.
With this anti-AI move we are making copyright more powerful, which limits creativity in the future.
Besides that, it seems absurd to target AI for copyright infringement. It is the worst infringement tool ever: it cannot reproduce works exactly.
Not to be a jerk, but no, teachers do not. That is literally the first exception under US Fair Use: https://www.law.cornell.edu/uscode/text/17/107
It ruined Napster too...
I should note that when human teachers are training human students, they need to have a valid license for all the copyrighted material in their training library. Schools and teachers are frequently audited to ensure they have a valid license for everything they are using. I personally know teachers who have faced career-limiting punishments for photocopying material into their training library without a license.
Darn it. Sorry, you got that first before my comment. What I get for commenting so late.
Which is so fucking stupid, because multiple copies for classroom use is one of the examples quite literally set out in US law as being fair use and thus not an infringement.
As has been pointed out before, that is not exactly right.
Not to be a jerk, but no, teachers do not. That is literally the first exception under US Fair Use: https://www.law.cornell.edu/uscode/text/17/107
Which should wake people up to whether they themselves should give respect to any concept of copyright, because this is the repeating story of forever.
So the argument is literally "we're too important for consequences"?
I've never heard of too big to fail bef... Oh.. oh no. Well, hopefully the new TARP might be called FART, so at least it might be entertaining. edit 4 spolling
So the argument is literally "we're too important for consequences"?
“We are too rich to face consequences or lose any of our wealth. That’s only for the little people.”
So the argument is literally "we're too important for consequences"?
I think everything needs to be from a physical book, a copy of which is distributed to every student for free. Licensing should be attached to the physical item. Honestly, the time for allowing individuals control over curricula is over. I've seen teachers using handouts meant for different grade levels than the ones they were teaching, and who knows what else. That was waaay before the present mess we're in.
It ruined Napster too...
I have an appropriate response to the AI companies:
"Copyright class actions could financially ruin AI industry, trade groups say."
<Mr. Burns' voice>: Excellent!
I think it retired. It's near Boca Raton last I heard.
Where’s the “Oh no…anyways” gif?
It never hurts to see a tardigrade.
I have an appropriate response to the AI companies:
It's been like this for a part of the population for most of the history of the USA.
The USA was never great, it was only ever big.
These applications don't require any novels / movies / paintings / whatever to train the algorithm, so I really don't see what point you're trying to make.
AI is already saving lives, and making great scientific leaps, but you luddites want it shut down. I pity you sad, sad people who hate everything.
Yes, but there's ALWAYS a Lehman Brothers.
It's worked for the banks.
Somewhat more importantly, they've also openly discussed their "copyright problem" before. This isn't something they did by mistake thinking it'd be fair use. They violated these copyrights knowingly and need to pay the legal penalty for that. This whole idea that something new should get a pass on following the rules because it's potentially a big deal needs to die in a fire. Follow the rules or GTFO.
I really don't care if the AI companies go bankrupt, it serves them right.
Stop, you've already convinced me.
Copyright class actions could financially ruin AI industry, trade groups say.
This is never going to happen. There is a trillion dollars invested in this stuff and Trump and Congress are going to find a way to make it legal and allow AI to fuck over every content creator, writer, artist, and so on. We are solidly in the command and control market economy now and nobody is going to allow 10,000 points to get wiped off the Dow. The billionaires are going to get their money.
The basic economic theory from the right is pretty close to this: wipe out all labor, go to a full asset economy, make money off of crypto, meme stocks, and various scams, and turn Goldman Sachs into a rack of computers. We can always have prisoners pick our crops until we invent robots to do it; prison slavery is still legal in the US, after all.
What do you think that "training" is doing?
It's not even about the training, it's the pirating of the content that this case is about.
Much more nuanced than that in academic libraries. The source of the content that is being shared is very important. Licensing agreements for that content determine how, or even if, you can share it. Who is doing the copying is another factor. A student for personal use? OK. A teacher for classroom use as a handout? Check the license.
If the material being copied is strictly for in-class use and is pure research, then it is almost certainly fair use.
But if the material being copied is for public use by the class (e.g., a play or song), then it is not fair use.
And if the material being copied is from an existing text book, then it is not fair use.
Both teachers and school districts have been sued for copyright violations.
The case is about who can be a plaintiff, so whilst I (and 90% of Ars) agree with the rest of your post, the problem is this just isn't how litigation works. The objection to the class definition here is that in practice, the case will not get to this question, because they'll get mired in what "the plaintiffs" means.
Did the plaintiffs use copyright material without consent?
Yes.
It does not cause a problem for plaintiff or defendant. It causes a problem for the court and anyone else who wants to use that court on an unrelated matter.
I really don't care if it leads to millions of tiny copyright actions in court, that's a result of the plaintiff's stupid actions, so they should take ownership of it, not suggest it's someone else's problem.
If this is your sincere belief, you have been led astray by a cadre of stupid racists who can’t even spell AI.
Most valuable private company in the world is OpenAI now, like it or not. You're not going to hamstring the entire US economy and hand over the next arms race to the Chinese over copyright. Too much money and power involved.
Did the plaintiffs use copyright material without consent?
Yes.
Case closed.
I really don't care if the AI companies go bankrupt, it serves them right.
I really don't care if it leads to millions of tiny copyright actions in court, that's a result of the plaintiff's stupid actions, so they should take ownership of it, not suggest it's someone else's problem.
I really do hope that the AI companies lose bigglies. Only then will a robust law protecting authors' works be able to move forward.
The same goes for any other medium AI companies believe is ripe for exploitation.
Would it be a loss? Yes, there are lots of uses for AI dissemination of all kinds of things, but those must clearly remain tools, not replace that which they've disseminated.
For example, as a would-be writer, I use an online tool to correct my grammar and spelling and make critical comments on aspects of my writing that need improvement. It might even suggest rewording paragraphs to improve readability (it often gets it hopelessly wrong, btw). It also compares my writing to the works of published authors, to see if there are any comparisons. It might say Stephen King's work uses 54% of whatever my work uses, making the suggestion that perhaps emulation might improve the chances of my book being accepted.
It's a silly comparison in many ways, but it does give you an idea about your work if it veers far outside what is generally accepted as popular writing. It's a relative mark of your work's saleability.
That's a great use of AI that doesn't impinge on an author's copyright.
But wholesale lifting of other works to create "new" works is just plain theft.
Someone mentioned the rich getting richer as if it's a new thing.
Nothing much has changed, has it? The rich get greedier and richer, the poor get poorer, until every drop of blood has been drained from their bodies and the whole lot collapses. Because stupid people rule and even stupider people let them.
Let's hope stupid doesn't win this time, but be prepared for the worst.