Analysis

Justices' Cox Ruling Could Have Domino Effect On AI Cos.

(October 24, 2025, 8:45 PM EDT) -- The U.S. Supreme Court is set to hear oral arguments in December in a case over whether internet service providers can be held liable when their customers illegally download copyrighted works, and legal experts say its decision could affect artificial intelligence companies if users of their products create infringing content.

The appeal from Cox Communications presents the question of secondary liability for ISPs, but intellectual property attorneys say the implications for AI companies could be significant if the high court affirms a Fourth Circuit decision upholding a Virginia jury's verdict against Cox, which led to a $1 billion judgment.

"This would absolutely open the door to all sorts of new theories of liability that would dramatically increase the legal exposure of these companies," said Ryan Baker, a partner at Waymaker LLP.

X Corp. raised that concern in an amicus brief on behalf of Cox submitted to the high court last month, noting that the issue has already come up in a lawsuit brought by The New York Times and other news organizations against Microsoft and OpenAI alleging, among other things, that people are using their products to get behind paywalls.

A New York federal court declined to dismiss a contributory infringement claim in the news organizations' complaint this year, "finding that the plaintiffs had plausibly alleged knowledge of alleged third-party infringement by customers of those generative AI products," according to X's brief to the justices.

The Times' lawsuit is one of dozens pending around the country against AI developers accusing them of infringing copyrighted works by using those works to train their platforms, and in some cases also for allegedly infringing content the systems produce.

"When the actual alleged infringement is coming from or done by the end users of the service, it may not be worth it to go after them. So how can you go up the chain to the deeper pocket?" said Matt Rizzolo, a partner at Ropes & Gray LLP.

He said a ruling in favor of the music companies could increase the potential liability for "any provider of a service that's used by a lot of downstream users where there's copyright infringement."

Sony Music Entertainment, Capitol Records LLC, Universal Music Corp. and other music companies sued Cox in 2018 for contributory and vicarious copyright infringement over more than 10,000 copyrighted songs that the ISP's customers illegally downloaded. In 2019, jurors found Cox liable for both types of infringement and said its conduct was willful. Although the Fourth Circuit affirmed the contributory infringement verdict, it reversed the jury's vicarious liability finding and ordered the district court to recalculate damages.

Tech companies including Google and Microsoft said in an amicus brief that the potential reach of the justices' decision will not be limited "to cable providers."

"The issue before this court regarding the proper standard for contributory infringement will broadly affect all companies that provide internet-related products and services," said the amicus brief, which also included Amazon, Pinterest and Mozilla.

Cox and its supporters argue that contributory liability requires showing that a defendant had conscious, culpable intent. The music companies counter that that is exactly what they proved at trial, saying Cox put profits over legal compliance by failing to disconnect repeat infringers from the internet.

The music companies have also argued that they are not advocating to disconnect people from the internet en masse over unfounded allegations, which Cox has warned could happen if the Fourth Circuit's holding stands. But supporters of the music companies told the justices in amicus briefs this week that Cox wants to "effectively eliminate service provider exposure to liability for the vast majority of online infringements."

Counsel for Cox and the music companies declined to comment for this story.

A win for the music companies would "give copyright holders another way of bringing pressure on AI developers to either secure licenses [for their content] or change the way their tools work," said Brandon Butler, a copyright lawyer and executive director of Re:Create, a coalition of think tanks, libraries and tech companies that advocate for fair use in copyright law.

Although a ruling siding with music companies could affect various internet technology providers, Butler said, the consequences could be especially pronounced for AI developers.

"They provide really powerful, creative tools, and people who could not possibly make videos of SpongeBob high-fiving Spider-Man or whatever now can," he said, noting that the technology "materially contributes to their ability to do that."

Butler expressed concern that the increased threat of litigation might deter new AI developers from entering the market.

"If you told somebody you can't make a pencil unless you're willing to be responsible for everything anybody writes with a pencil, a lot fewer people would get into the pencil business," he said.

Although the Supreme Court may issue a ruling narrowly focused on ISPs, attorneys Law360 spoke to said they still expect litigators to seek ways to broaden its application.

"Even if the court were to say, 'We're limiting this ruling to ISPs,' it is impossible to imagine that wouldn't be seen as a crack in the door," said Baker, the Waymaker attorney. He added that AI companies may actually be more susceptible than ISPs to claims of secondary liability.

"ISP technology, I think that's pretty agnostic," he said. "You start talking about AI — is it agnostic? I don't know. There are lots of possible fact patterns you could have where AI could be seen as a facilitator."

Many AI systems already have guardrails in place to prevent them from fulfilling certain user requests, including requests to create potentially infringing content, but the safeguards are not infallible, said Peter Salib, an assistant professor at the University of Houston Law Center, co-director of the Center for Law & AI Risk and adviser to the Center for AI Safety in San Francisco.

"All these companies would prefer that their users not be able to produce copyrighted works. It's like a big headache to get sued over it, and it's probably not adding that much value to the product," Salib said, noting that he thinks developers are probably "trying really hard to" prevent their systems from doing that.

"It's just that they don't know how," he said. "The technology does not yet exist to keep AIs from doing things that people don't want them to do, and that matters not just for copyright, but for a whole bunch of things, especially as these AI systems become more capable."

--Editing by Alanna Weissman.

For a reprint of this article, please contact reprints@law360.com.