A lawsuit being heard in a federal courtroom in San Jose, California, will help decide whether artificial intelligence companies building products valued at billions of dollars can use the creative work of millions of people without consent, without payment, and without consequence. The music publishers who brought the suit understand the stakes. Anthropic understands them. Watching from a safe distance, the broader technology sector most likely understands them as well.
In 2023, Universal Music Group, Concord Music Group, and ABKCO sued Anthropic for allegedly training its AI chatbot Claude on lyrics from at least 500 copyrighted songs, by artists including Beyoncé, the Rolling Stones, and the Beach Boys, without permission or payment. The case has grown significantly since then. In August 2025, after obtaining what they describe as overwhelming evidence that Claude reproduces lyrics on demand rather than merely passively learning from them, the publishers moved to amend their complaint. Ask Claude for the lyrics to a copyrighted song and, under certain circumstances, it will provide them. The publishers contend that's not learning. It is reproduction: large-scale, commercial, and carried out without a license or royalties.
Beneath the training data argument lies a darker, harder layer. According to the amended complaint, Anthropic did not merely harvest lyrics from publicly accessible websites; it used BitTorrent to systematically download from what the publishers call "notorious pirate libraries," repositories of copyrighted content that exist precisely because they disregard intellectual property law. The publishers describe this as "unmistakable, irredeemable infringement." Not a gray area, not legal ambiguity, but intentional theft carried out at scale to build a commercial product. If that accusation is true, it pulls the discussion out of the philosophical realm of fair use entirely: you cannot claim your use of stolen content is transformative when the theft is the foundation of the business.
| Category | Details |
|---|---|
| Case Name | Concord Music Group Inc. v. Anthropic PBC |
| Case Number | 5:24-cv-03811 |
| Court | U.S. District Court, Northern District of California (San Jose) |
| Presiding Judge | U.S. District Judge Eumi Lee |
| Defendant | Anthropic PBC |
| Lead Plaintiffs | Universal Music Group (UMG), Concord Music Group, ABKCO |
| Lawsuit Filed | 2023 |
| Amended Complaint | August 2025 |
| Summary Judgment Motion | Filed March 2026 |
| Songs Allegedly Infringed | Lyrics from at least 500 songs (Beyoncé, Rolling Stones, Beach Boys, Bruno Mars, others) |
| Potential Damages | $3 billion+ (statutory damages up to $150,000 per infringement) |
| Key Allegation | Anthropic used pirated lyrics databases via BitTorrent to train Claude |
| Anthropic’s Prior Settlement | $1.5 billion paid to book authors class action (2024) |
| Supporting Amicus Brief | RIAA, NMPA, A2IM, SONA, Black Music Action Coalition, Artist Rights Alliance, SoundExchange |
| Opposing Ruling | Judge William Alsup (San Francisco) — ruled AI training on books is “quintessentially transformative” (June 2025) |
| Plaintiff Counsel | Matt Oppenheim, Oppenheim + Zebrak |
| Defense Counsel | Sonal Mehta, WilmerHale |
| Central Legal Question | Whether AI training constitutes “fair use” under U.S. copyright law |

The fair use doctrine has always rested on a four-factor test, and the first factor, whether the use is transformative, tends to predominate in AI cases. AI companies have leaned heavily on this argument, and for a while it worked. In a separate case last June, San Francisco Judge William Alsup declared that Anthropic's use of books to train its AI was "quintessentially transformative," possibly the most AI-friendly ruling a court has produced on the issue. After that decision, the tech sector grew optimistic about how these cases would unfold. The music publishers' March 2026 motion before Judge Eumi Lee is a deliberate attempt to reach a different outcome by presenting a different set of facts. Training on books, they contend, is one thing. On-demand, word-for-word reproduction of lyrics is another matter entirely.
It's difficult to miss how carefully the music industry has constructed this claim. In an amicus brief filed in April 2026, the RIAA, NMPA, A2IM, SoundExchange, the Artist Rights Alliance, and several other organizations presented the market harm argument with unusual directness: Claude can produce lyrics for thousands of songs in the time it takes a human songwriter to write one. Listeners encounter those lyrics. They compete with the licensed, paid originals and may even displace them. That argument bears directly on the fourth fair use factor, which asks whether the use harms the market for the original work. Fair use was never intended to protect commercial actors who build products that undercut the economic value of the works they consume without paying for them.
As this unfolds, there is a sense that the music industry learned from watching other creative industries, among them news publishers, book authors, and visual artists, fight dispersed, disorganized legal battles with AI firms. The unified front is noteworthy. That SoundExchange and the RIAA are backing the NMPA and the Artist Rights Alliance in a single amicus filing signals that this is not a dispute over one company or one set of lyrics. It is an industry-wide statement about where the line is and what happens when it is crossed.
Anthropic, for its part, reached a reported $1.5 billion settlement with book authors in 2024, which suggests the company understood its exposure even before the music case reached its current severity. The company has disputed the specific allegations in the music publishers' case but has not yet formally argued fair use in this proceeding; whether it will do so before Judge Lee rules on the summary judgment motion remains to be seen. With statutory damages of up to $150,000 per infringement across tens of thousands of alleged violations, the case's theoretical ceiling rises above $3 billion. That number focuses minds.
The deeper question this case brings to light is one every creative industry is grappling with: will the arrival of large-scale AI systems fundamentally alter what copyright protection means? When the law was drafted, copying was costly, time-consuming, and detectable. In the new world, a single company can ingest millions of creative works in a matter of months, build a product that replicates them on demand, and then ask a court to declare it fair use. The music publishers, backed by Universal's financial might and the institutional support of nearly every major music trade association, are arguing that the answer is no. The courts have yet to settle the question. But the argument now before the court in San Jose is the most lucid version of that case assembled to date.
