ElevenLabs, a startup specializing in AI services including voice cloning, recently took action against a user responsible for creating an audio deepfake of Joe Biden.
The deepfake, an impersonation of the president, was used in a robocall aimed at disrupting the elections in New Hampshire; the call instructed voters not to participate in their state’s primary. Initially, it was unclear which voice-cloning technology had been used to create it.
However, a detailed investigation by security firm Pindrop revealed that ElevenLabs’ tools were used to create the deepfake. Pindrop analyzed the robocall’s audio, removing background noise and enhancing clarity, then compared the cleaned audio against samples from more than 120 voice synthesis systems known for creating deepfakes.
According to Pindrop CEO Vijay Balasubramaniyan, the analysis indicated a likelihood of “well north of 99 percent” that the audio originated from ElevenLabs.
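This kind of attribution typically works by embedding the suspect audio into a feature space and measuring its similarity to reference samples from each known synthesis engine. The sketch below is purely illustrative of that comparison step, not Pindrop’s actual pipeline: the engine names and the use of cosine similarity over precomputed embeddings are assumptions for the example.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two audio embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def attribute_audio(sample_embedding: np.ndarray,
                    engine_embeddings: dict[str, np.ndarray]) -> tuple[str, float]:
    """Return the synthesis engine whose reference embedding is closest
    to the sample, along with its similarity score.

    engine_embeddings maps an engine name (hypothetical here) to a
    reference embedding derived from known samples of that engine.
    """
    scores = {name: cosine_similarity(sample_embedding, emb)
              for name, emb in engine_embeddings.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

# Toy usage with random vectors standing in for real embeddings:
rng = np.random.default_rng(0)
engines = {f"engine_{i}": rng.normal(size=8) for i in range(3)}
sample = engines["engine_1"].copy()  # pretend the robocall matches engine_1
print(attribute_audio(sample, engines))
```

In practice, a real system would use learned embeddings from an audio model and calibrate the similarity scores into the kind of likelihood figure Pindrop cites, rather than raw cosine values.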
Bloomberg reported that ElevenLabs was informed of Pindrop’s findings and is currently investigating the matter. The company has already taken action by identifying and suspending the account responsible for creating the fake audio.
ElevenLabs told the news organization that it could not comment on the specific incident, but emphasized its commitment to preventing the misuse of audio AI tools and said it takes any instance of misuse seriously.
The deepfaked Biden robocall highlights the dangers of technologies capable of replicating someone’s appearance and voice, particularly their potential to influence votes ahead of the upcoming US presidential election.
According to Kathleen Carley, a professor at Carnegie Mellon University, this incident merely scratches the surface of what could transpire regarding voter manipulation and election interference.
Carley emphasized that such technological manipulation could extend beyond robocalls, raising concerns about voter suppression tactics and attacks targeting election personnel.
She views the deepfaked Biden robocall as a precursor to a range of potential deceptive tactics that could emerge in the months leading up to the election.
Shortly after ElevenLabs launched its beta platform, users wasted no time exploiting it to generate audio clips of celebrities appearing to utter controversial statements.
The startup’s technology enables the replication of voices for purposes like “artistic and political speech contributing to public debates,” as stated on its safety page.
However, the safety page explicitly prohibits users from misusing cloned voices for fraud, discrimination, hate speech, or any form of online abuse, warning of legal consequences.
Despite these warnings, ElevenLabs clearly needs stricter safeguards to prevent malicious actors from using its tools to sway voters and manipulate elections around the world.