US judge backs using copyrighted books to train AI
A US federal judge has sided with Anthropic over the training of its artificial intelligence models on copyrighted books without authors' permission, a decision that could set a major legal precedent for AI development.
District Court Judge William Alsup ruled on Monday that the company's training of its Claude AI models with books, whether bought or pirated, was allowed under the "fair use" doctrine of the US Copyright Act.
"Use of the books at issue to train Claude and its precursors was exceedingly transformative and was a fair use," Alsup wrote in his decision.
"The technology at issue was among the most transformative many of us will see in our lifetimes," Alsup added in his 32-page decision, comparing AI training to how humans learn by reading books.
Tremendous amounts of data are needed to train large language models powering generative AI.
Musicians, book authors, visual artists and news publications have sued various AI companies that used their data without permission or payment.
AI companies generally defend their practices by claiming fair use, arguing that training AI on large datasets fundamentally transforms the original content and is necessary for innovation.
"We are pleased that the court recognized that using 'works to train LLMs was transformative,'" an Anthropic spokesperson said in response to an AFP query.
The judge's decision is "consistent with copyright's purpose in enabling creativity and fostering scientific progress," the spokesperson added.
- Blanket protection rejected -
The ruling stems from a class-action lawsuit filed by authors Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson, who accused Anthropic of illegally copying their books to train Claude, the company's AI chatbot that rivals ChatGPT.
However, Alsup rejected Anthropic's bid for blanket protection, ruling that the company's practice of downloading millions of pirated books to build a permanent digital library was not justified by fair use protections.
Along with downloading books from websites offering pirated works, Anthropic bought copyrighted books, scanned the pages and stored them in digital format, according to court documents.
Anthropic's aim was to amass a library of "all the books in the world" to train AI models on content as it deemed fit, the judge said in his ruling.
While training AI models on the pirated content was itself deemed fair use, downloading pirated copies to build a general-purpose library constituted copyright infringement, regardless of eventual training use.
The case will now proceed to trial to determine financial damages related to the pirated library copies.
Anthropic said it disagreed with going to trial on this part of the decision and was evaluating its legal options.
Valued at $61.5 billion and heavily backed by Amazon, Anthropic was founded in 2021 by former OpenAI executives.
The company, known for its Claude chatbot and AI models, positions itself as focused on AI safety and responsible development.