AI Companies Can Use Copyrighted Work Without Permission, Judge Rules


Generative AI feeds on mountains of books, articles, video scripts, and magazines and remixes them all into something that feels new. It can’t get “better” without a constant stream of new material; otherwise it starts to stagnate. That material isn’t pulled from thin air. AI companies often use copyrighted works, like novels, to feed their models the creative fuel they need to stay fresh.

But what happens when those books are written by authors who are still alive and hoping to make money off of their work, yet an AI company is feeding that work to a machine that can easily, if not quite effectively, replicate their style and persona?

This week, a federal judge gave us the first real, rather depressing answer: AI companies are within their rights to train their AI models on copyrighted material, with some minor caveats.

The lawsuit, filed by authors Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson, accused Anthropic of straight-up pirating their books, ripping them from the internet and cramming them into the training data for Claude, the company’s generative AI model.

Judge William Alsup, appointed by President Bill Clinton, sided with AI developer Anthropic, deciding that using copyrighted books to train AI, even without the author’s permission, falls under “fair use.”

Of course, it hardly seems fair that Anthropic can now endlessly remix and make money off that copyrighted material without a single red cent going back to the creators of the content its AI gobbled up, content that could be used to churn out quick, cheap AI-generated knockoffs of the authors’ original work.

Alsup agreed the books were pirated, and yes, that’s still illegal. But using legally acquired books to train AI? That’s fine, in his view. His justification is summed up in two words from his ruling: “exceedingly transformative.”

Judge Alsup says AI doesn’t just repeat what it reads. As he wrote in his ruling, “Consistent with copyright’s purpose in enabling creativity and fostering scientific progress, ‘Anthropic’s LLMs trained upon works not to race ahead and replicate or supplant them — but to turn a hard corner and create something different.'”

In other words, his ruling hinges on a crucial word in the world of copyright and fair use law: “transformative.” One work can be inspired by another without infringing its copyright if it has been thoroughly reworked into something that stands on its own. Alsup ruled that authors’ works deserve strong protections, but the sheer innovation displayed by generative AI tips the scales in its favor: according to the judge, it distances itself enough from the material it’s trained on to stand on its own.

This ruling doesn’t mean AI companies have carte blanche to gobble up intellectual property. Alsup was clear that if you’re going to train an AI model on a book, at least buy it first. Piracy remains off-limits, and Anthropic will still face trial over its use of pirated works.

Federal rulings are not law set in stone. They set a tone for similar cases to follow, and even then, every judge is different. Other cases like this are in the hands of other federal judges who could rule differently.

Congress, should it ever get its act together and start protecting people, could create a law that bans AI companies from using copyrighted material, or even one that requires companies to pay some kind of royalties for the use of copyrighted material. Knowing that this Republican-controlled Congress is considering a law that bars states from regulating AI companies in any way for 10 years, something like that is not likely to happen.

They are in the business of protecting big business, not your business.

The post AI Companies Can Use Copyrighted Work Without Permission, Judge Rules appeared first on VICE.