Anthropic Claims Intermediate Copying of Protected Works for AI Training May Qualify as Fair Use

By Marcus Delano Thompson

December 12, 2024 at 08:47 AM

Anthropic, an AI research company currently facing a music industry copyright lawsuit, has stated that copying protected works as an intermediate step to create non-infringing output can qualify as fair use.

Anthropic logo on notepad

In its response to the Copyright Office, Anthropic defended its AI assistant Claude's training process as "quintessentially lawful." The company argues that its copying of copyrighted materials serves only as an intermediate step for statistical analysis, not for expressive purposes.

Key points from Anthropic's statement:

  • The copying process extracts unprotectable elements to create new outputs
  • The use is transformative and non-expressive
  • The analysis focuses on statistical relationships between words and concepts
  • The process doesn't communicate copyrighted expression to users

ASCAP strongly opposes this position, stating that unauthorized use of copyrighted works for AI training cannot constitute fair use because:

  • The use is not transformative
  • Each unauthorized use serves a commercial purpose
  • Copyright holders' consent is required

Anthropic logo on black background

This debate occurs amid broader discussions about AI regulation, with organizations such as IFPI and GESAC advocating for mandatory disclosure of AI training data. The controversy continues as AI-powered features roll out on platforms such as Spotify and new services such as Stable Audio launch.

Universal Music, Concord, and ABKCO are currently suing Anthropic for alleged "systematic and widespread" copyright infringement, highlighting the ongoing tension between AI development and creative rights protection.
