AI Startup Sued for Teaching Song Lyrics

OSABEE / shutterstock.com

Ask any group of parents, and most of them will admit to singing to their kids. Hell, most pride themselves on teaching their children how to speak with lyrics, knowing full well the importance of not only developing their voice but also their command of their native tongue, which is crucial in modern society. So for an Artificial Intelligence (AI) company like Anthropic, which wants its program to sound like a human voice, using music as training material would seem like a natural fit.

Not so fast, claim the music publishing conglomerates.

As first reported by The Guardian, Universal Music, ABKCO, and Concord Publishing want to snuff out the sudden powerhouse AI startup Anthropic. In a lawsuit filed in Tennessee, these music publishers accuse Anthropic and its AI-powered chatbot Claude of illegally using a massive collection of song lyrics in Claude’s training. Feeding a program lyrics chosen by its developers is nothing new as a method for developing human-like speech patterns.

Claude is considered a sophisticated chatbot, and according to the lawsuit, the inclusion of the lyrics to over 500 distinct songs in its training without the publishers’ permission is illegal. As The Guardian explains, the suit covers music “ranging from the Beach Boys’ ‘God Only Knows’ and the Rolling Stones’ ‘Gimme Shelter’ to Mark Ronson and Bruno Mars’ ‘Uptown Funk’ and Beyoncé’s ‘Halo.’” Further, the publishers allege that Claude will return the exact lyrics in response to various prompts, even when asked to craft a song in the style of a specific artist.

The lawsuit goes a step further by claiming that when prompted to craft a song about Buddy Holly, Claude returns the lyrics to Don McLean’s “American Pie.” The publishers’ attorney, Matt Oppenheim, framed the case as a straightforward matter of copyright, stating that it is “well-established by copyright law that an entity cannot reproduce, distribute, and display someone else’s copyrighted works to build its own business unless it secures permission from rights holders.”

Back in September, Amazon and Anthropic entered into a lucrative deal. As Breitbart News outlined, “Amazon and Anthropic revealed the deal on Monday, announcing that it is part of a broader collaboration between the two companies in order to develop foundation models that support generative AI systems…Foundation models, also known as large language models, are technologies that function as platforms for AI applications. They are trained by using vast pools of online information, such as blog posts, digital books, articles, and music — to generate text, images, and video content that appears to have been created by a human.”

This kind of exchange between the two companies is a great way for Amazon to capitalize on its technology and for Anthropic to gain the massive capacity it needs to develop its programs. A collection of information and lyrics like the one Amazon has in its catalog would be more than enough to put Anthropic at the pinnacle of AI chatbot development.

Lawsuits like this can struggle even to find a court willing to take them on. The same lyrics the publishers are complaining about Anthropic using can be pulled up through search engines with relative ease. Getting a chatbot to churn them out is simply a step in its mastery of the English language. It’s no different than parents singing to their children and using music to help them build their vocabulary.

That being said, there is something to be said about using these chatbots to create “new” music from an artist. Mimicking a musician’s writing style to come up with songs they never released would not only be difficult to get tonally right, but it could also raise genuine copyright problems. Given how quickly the market could be flooded with such fakes, it makes sense to draw the line there. However, simply learning the lyrics of the greats isn’t criminal, even if their music was.