
2 Voiceover Artists Sued An AI Company For Creating And Using AI Voice Clones Without Their Permission

In 2022, voiceover artist Paul Skye Lehrman was visiting a friend when they watched a YouTube video from a channel named Military News discussing Russia’s invasion of Ukraine.

To his surprise, he identified the voice in the video as his own, even though he had not given permission for his voice to be used by the channel.

“It was my voice dictating the conflict and talking about weapons,” Lehrman says. “These are words I never said.”

A year later, while on his way to a medical appointment, he unexpectedly heard his voice in a podcast discussing Hollywood’s recent strikes.

The podcast featured a generative artificial intelligence text-to-speech tool answering questions about the risks associated with the technology.

Realizing that his voice might have been used without permission, he sought legal advice together with his wife, Linnea Sage, also a voiceover actor, who suspects her voice was similarly misappropriated.


A proposed class action was filed in a New York federal court on Thursday against LOVO, an AI startup based in Berkeley.

The lawsuit alleges that the company unlawfully used the voices of individuals, including celebrities like Scarlett Johansson, Ariana Grande, and Conan O’Brien.

This is believed to be the first lawsuit targeting an AI company for using people’s likenesses to train an AI system.

It highlights an increasing divide between content creators and companies that are accused of collecting large amounts of data to enhance their technology.

The legal action aims to represent voiceover artists whose voices were allegedly used without permission by LOVO, which did not respond to a request for comment. The lawsuit also seeks a court order to prevent the company from further devaluing the artists’ work.

The actors join a growing number of rights holders, including authors, artists, and publications, who have pursued legal action over what they say is the unauthorized and uncompensated use of their works and likenesses to fuel an industry worth billions of dollars.

According to Jeffrey Bennett, general counsel of SAG-AFTRA, the misconduct described in the lawsuit is likely to become more common as people overlook the rights they hold in their own voices.

The union argues that training artificial intelligence systems using members’ images without permission violates their rights.

The lawsuit filed on Thursday revolves around an accusation of breaching New York’s right of publicity law.

It claims that LOVO promotes its services by using what it refers to as “stolen property” and misleadingly asserts that it possesses the legal authorization to commercialize these voices, which is not the case.

In 2020, Lehrman was approached on the freelance platform Fiverr by an anonymous individual, later revealed to be a LOVO employee, to provide voiceover services, according to the complaint.

When he asked for more information, he was reportedly told that his voice would be used solely for academic research.

After receiving $1,200 for his services, Lehrman was reassured in a subsequent message that his voice recording “will not be utilized for any other purpose.”

However, two years later, the lawsuit alleges that he identified his voice on the YouTube channel Military News, with over 336,000 subscribers, in a video discussing Russian weapons.

The complaint also mentions that on or around June 13, 2023, Mr. Lehrman heard his voice featured in a podcast episode of ‘Deadline Strike Talk,’ without receiving any compensation for its use.

As per the complaint, LOVO promoted the voice he allegedly misappropriated as part of its subscription service using the alias “Kyle Snow.”

The legal claim mentions that his voice was the default option for the software, highlighted as one of the top five text-to-speech voices, and utilized for advertising and explaining the product.

Sage, a voiceover artist with 14 years of experience famous for her contributions to Marvel video games, was approached in 2019 to create sample scripts for radio commercials on Fiverr.

Similar to Lehrman, she was assured that her recordings would be used only internally and would not require any licensing rights, as stated in the lawsuit.

However, in 2023, Sage claimed that she found out LOVO had been utilizing her voice for its subscription service under the alias “Sally Coleman.”

Although LOVO argues that its AI technology was trained on numerous voices, the legal action asserts that the voices attributed to “Kyle Snow” and “Sally Coleman” unmistakably belong to Lehrman and Sage, respectively.

Steve Cohen of Pollock Cohen, who represents the actors, claims in the lawsuit that the voices behind LOVO’s other voice options belong to other class plaintiffs who never consented to their voices being used to train Genny, LOVO’s text-to-speech tool, or sold as part of LOVO’s services, and who were not fairly compensated.

With more than ten years of experience as a voiceover artist, Lehrman is widely recognized for his performances in NBC’s New Amsterdam and CBS’s Blue Bloods.

He reveals that he has observed a significant decrease of around 50% in his workload compared to the previous year. Lehrman emphasizes that the concern goes beyond fewer job offers; he is also troubled by the negative impact on his professional reputation.

“I find that my voice is articulating messages I don’t align with, associating me with brands I wouldn’t choose to collaborate with, and positioning me in places I wouldn’t wish to be associated with,” he elaborates. “Furthermore, I lack control over the subtleties of the artistic expression.”

Sage is worried about the possibility of being completely pushed out of Hollywood due to the increasing use of AI voice technology.

She cautions that if unauthorized use of her voice, like by LOVO, persists, the majority of voiceover work could be taken over by AI-generated voices.

“I may not have achieved great success, but I have collaborated with numerous dedicated individuals in this field throughout my entire adulthood,” she explains.

Regarding the SAG-AFTRA lawsuit, Bennett emphasizes the importance of companies understanding the rights they are acquiring and the rights individuals are entitled to. He advises members against agreeing to contracts with overly broad language that grants perpetual rights.

Bennett explains that we now live in a world where it is possible to clone someone’s voice and appearance.

He emphasizes the significant risk posed by overly broad provisions. By agreeing to such terms, individuals might unknowingly forfeit their rights and consent to being cloned.

With the emergence of AI technologies enabling the replication of actors’ appearances, the union is pushing for a federal right-of-publicity law.

At present, there is no federal regulation addressing the utilization of AI to imitate a person’s voice. While various state laws on right of publicity have partly addressed this issue, individuals have limited options in states without such specific regulations.

Bennett emphasizes the importance of a federal law regarding voice and likeness, highlighting the establishment of intellectual property rights at the federal level.

Having intellectual property rights at the federal level would enable individuals to demand the removal of unauthorized uses online.

The class action lawsuit aims to represent individuals whose voices were used to train LOVO’s AI system, as well as those whose cloned voices were sold without authorization.

The lawsuit could come to involve high-profile talent: the company has promoted its services using names and images that are thinly veiled references to celebrities, such as “Ariana Venti,” “Barack Yo Mama,” and “Cocoon O’Brien.”

Despite advertising the ability to replicate any voice, the company specifies that this feature should not be used for imitating celebrities; it is solely meant for personal entertainment purposes.
