IP/Entertainment Case Law Updates

Andersen v. Stability AI Ltd.

District court grants in part and denies in part defendants’ motions to dismiss plaintiffs’ claims that defendants infringed plaintiffs’ artworks by including those works in data used to train defendants’ AI product. 

Plaintiffs Sarah Andersen, Kelly McKernan, Karla Ortiz, Hawke Southworth, Grzegorz Rutkowski, Gregory Manchess, Gerald Brom, Jingna Zhang, Julia Kaye and Adam Ellis, artists, filed a putative class action challenging defendants’ use of their art as training materials for an Artificial Intelligence (AI) software product called Stable Diffusion. Plaintiffs originally brought suit against Stability AI, the creator of the product, as well as Midjourney and DeviantArt, which plaintiffs alleged use Stability’s product. In October 2023, the court dismissed plaintiffs’ claims in part but with leave to amend (read our summary of the district court’s decision here). Plaintiffs then amended their complaint, adding several more artists as plaintiffs and adding Runway AI as a defendant, alleging that Runway AI worked with, helped train and then distributed Stable Diffusion with Stability AI. Plaintiffs also alleged that Runway made a text-to-image generator available via its online AI image product.

The court held that plaintiffs’ allegations of induced infringement against defendant Stability were sufficient to survive dismissal. Plaintiffs alleged, in essence, that Stability promoted the use of Stable Diffusion to infringe plaintiffs’ copyrights. Stability argued that plaintiffs were required to make clear allegations of active steps to encourage direct infringement, especially because Stable Diffusion was capable of substantial noninfringing uses. The court disagreed. Unlike in a case involving the sale of VCRs, where intent to induce infringement could not be presumed or imputed merely because the product was capable of being used for that purpose, plaintiffs in this case alleged that Stable Diffusion was built using their copyrighted materials. Thus, the court held that “[t]he plausible inferences at this juncture are that Stable Diffusion by operation by end users creates copyright infringement and was created to facilitate that infringement by design.”

Stability also moved to dismiss plaintiffs’ claims under Section 1202(a) of the Digital Millennium Copyright Act (DMCA) for providing or distributing false copyright management information (CMI) and under Section 1202(b)(1) of the DMCA for the intentional removal of CMI. 

The court dismissed plaintiffs’ Section 1202(a) claim with prejudice. Plaintiffs alleged that Stability distributes its models under a license that asserts copyright in the model and that because the models are infringing, Stability was providing and distributing false CMI. But the court found that Stability’s license was “generic” (such as the copyright tag at the bottom of a Facebook page), and thus the license did not suggest any association with plaintiffs’ works. 

The court likewise dismissed plaintiffs’ Section 1202(b) claim with prejudice. The court determined that, because none of the output images of Stability’s tool were identical to plaintiffs’ works, Stability could not be liable for the removal of CMI during the training process; failing to affix CMI to a different work is not “removal” under Section 1202(b) of the DMCA.

Runway AI, the newly added defendant, moved to dismiss plaintiffs’ direct infringement claims brought under two different theories. The first, plaintiffs’ “model theory,” was based on the allegation that Stability’s product (which Runway incorporated into its product) itself constituted an infringing copy of plaintiffs’ works. The second, plaintiffs’ “distribution theory,” was based on allegations that Runway infringed plaintiffs’ exclusive distribution rights because distributing Stable Diffusion is “equivalent to distributing plaintiffs’ works.” 

The court noted that “both the model theory and the distribution theory of direct infringement depend on whether plaintiffs’ protected works are contained, in some manner, in Stable Diffusion as distributed and operated.” The court held that plaintiffs had sufficiently alleged that the works were contained in Stable Diffusion and, further, that the fact “[t]hat these works may be contained in Stable Diffusion as algorithmic or mathematical representations—and are therefore fixed in a different medium than they may have originally been produced in—is not an impediment to the claim at this juncture.” Additionally, as with defendant Stability, the court held that plaintiffs plausibly alleged a claim for induced infringement against Runway. 

The court denied Midjourney’s motion to dismiss plaintiffs’ Lanham Act claim for false endorsement. Plaintiffs alleged that Midjourney had published plaintiffs’ names on a list of artists whose styles its AI product could reproduce and included user-created images that incorporated plaintiffs’ names on Midjourney’s “showcase” site. The court held that plaintiffs’ allegations were sufficient to show that a reasonable consumer might be confused or led to believe that the artists were endorsing Midjourney’s product. The court also denied Midjourney’s motion to dismiss plaintiffs’ trade dress claim, which alleged that Midjourney’s product allowed users to create works capturing plaintiffs’ trade dress (the distinctive look and feel of their artwork). 

Lastly, the court held that plaintiffs had sufficiently amended their copyright claims against Midjourney and DeviantArt to survive those defendants’ motions to dismiss. However, plaintiffs had not sufficiently amended the claim for breach of contract against DeviantArt, and the court dismissed the claim with prejudice. The court also dismissed plaintiffs’ unjust enrichment claims, finding that they were not qualitatively different from plaintiffs’ copyright infringement claims and thus were preempted, but with leave to amend.

Summary prepared by Safia Gray Hussain and Erin Shields.