Every day, we are discovering the mind-blowing power of creating content using artificial intelligence (AI). Though exciting to many, this groundbreaking technology carries risks in what we create and how we go about creating it. It is well known that the law has often lagged behind the development of technology.
In many instances, we have to look to laws drafted decades before much of the technology we use today was created. Thus, the use of AI has spawned many unprecedented legal questions that we just don’t have clear answers to right now.
For example, last week, a creator by the name of Ghostwriter977 (Ghostwriter) set the internet ablaze when they released an allegedly AI-generated song entitled "Heart on my Sleeve." The song features vocals that sound strikingly similar to those of Toronto-born superstars Drake and The Weeknd.
The song was released on many major streaming platforms, including Apple Music, Spotify, YouTube, Amazon, SoundCloud, Tidal, and TikTok. The song reached over 15 million plays before it was taken down in response to complaints from the artists' label, Universal Music Group (UMG).
Copyright
UMG argued the song violated copyright law; however, it is unclear whether that is actually true. Copyright ownership gives you the exclusive right to use and profit from creative works such as art, books, and music.
The United States Copyright Office allows a copyright to attach to a creative work only if there is human authorship. In this case, there is an argument that the content was generated by artificial intelligence, not by a human.
However, the question remains whether a compilation of the artists' music was used to generate the sound-alike voices in the song; if so, copyright interests in that underlying material may come into play.
Additionally, though the end product, the song recording itself, may have been generated by artificial intelligence, it was still prompted and potentially written by a human. And in that case, the lyrics of the song themselves, if originally developed by Ghostwriter, may actually belong to them.
There are some defenses to copyright infringement, such as fair use, which permits the unauthorized use of copyrighted material for the purpose of criticism, comment, news reporting, education, scholarship, or research. Ultimately, copyright issues of this novel nature are very subjective and would be determined in court.
Name, Image, and Likeness
The argument could be made that Ghostwriter violated the right of publicity of Drake and The Weeknd by creating a song featuring voices that sound like theirs without their permission. The right of publicity grants you a right to profit from your name, image, and likeness, including your voice.
However, there is a clear distinction between using a person's actual voice and using a voice that merely sounds like that person's voice. The First Amendment allows one to imitate the sound of another's voice, even deliberately; think of cover artists.
However, there may be an exception to this rule when the imitation is connected with the intent to sell a product. See Midler v. Ford, 849 F.2d 460, 463 (9th Cir. 1988).
If so, there may be a violation of that person's right of publicity. In Midler, the Ford Motor Company used a Bette Midler sound-alike to sing one of her songs to sell cars. In the case of "Heart on my Sleeve," it is not clear if anything was actually sold in connection with the song.
We would also have to know how far Ghostwriter went to connect the song to Drake and The Weeknd, and whether Ghostwriter received or attempted to receive any compensation for the song through the streaming platforms. Without more information, it is difficult to say there is a publicity right violation in this instance.
Consumer Protection
On a less sexy note, "Heart on my Sleeve" may also violate consumer protection laws. The Federal Trade Commission (FTC) and state governments enforce laws that protect the public from deceptive or unfair business practices.
One could argue that Ghostwriter used deceptive or unfair business practices by streaming and popularizing a song that misled consumers with vocals that mimic Drake and The Weeknd.
We would likely have to determine the lengths Ghostwriter went to in connecting the song to the artists, such as whether the artists were listed in the credits and whether their imagery was used in the cover art on streaming platforms. The developer of the underlying AI technology that facilitated the creation of the song could also be liable under consumer protection laws.
The FTC may come after you if you make, sell, or use a tool that is effectively designed to deceive – even if that’s not its intended or sole purpose. The FTC warns developers of AI technology to consider how their products could be used to deceive consumers and mitigate the risks where possible.
However, this may be a stretch, since it does not appear that consumers were led to do anything other than listen to and share the song.
As you can see, the law is not very clear when it comes to the issue of using AI-generated content that mimics a real person. These types of analyses are extremely fact-specific and require a full investigation to determine what laws are implicated, what types of damages should be attached, and ultimately who should be held liable.
There have been many lawsuits filed to address some of these unclear issues and we will be sure to update you when we have more answers.
— Contributed by Ashley Cloud
Ashley Cloud is the founder of The Cloud Law Firm, serving creative entrepreneurs in all 50 states. Follow her on Instagram and TikTok for more information.