In a Black Mirror-esque twist, some cash-strapped actors are now regretting their decisions to sell their likenesses for AI videos, which they describe as embarrassing and damaging, according to a report by AFP.
One such actor is Adam Coy, a 29-year-old from New York, who licensed his face and voice to a company called MCM for a year for $1,000, without fully considering the implications.
Later, his partner’s mother stumbled upon videos featuring him as a doomsayer, predicting disasters, which made him rethink his decision.

Another example is South Korean actor Simon Lee, whose AI likeness was used to mislead unsuspecting internet users.
He expressed his shock to AFP when he discovered that his AI avatar was promoting dubious health remedies on TikTok and Instagram, leaving him embarrassed to have his image associated with such scams.
As the technology for AI avatars continues to advance, more actors may feel tempted to license their images. One prominent company in this field, Synthesia, based in the UK, doubled its valuation to $2.1 billion in January, as reported by CNBC.
Recently, they also secured a $2 billion deal with Shutterstock to make their AI avatars appear more lifelike, according to The Guardian.
To encourage actors to license their likenesses, Synthesia has launched an equity fund. The company announced that actors who create popular AI avatars or appear in their marketing will have the chance to earn options in a $1 million pool of company shares.
Actors may find it easy and potentially profitable to sell their AI likenesses. They only need to show up, make various facial expressions in front of a green screen, and then receive their payments.
However, lawyer Alyssa Malchiodi, who represents actors, mentioned to AFP that many clients do not fully understand the agreements they are signing.
They often agree to contracts with unfair clauses, allowing for “worldwide, unlimited, irrevocable exploitation, with no right of withdrawal.”
Malchiodi pointed out that one major warning sign is the use of broad language in contracts that gives companies complete ownership or unrestricted rights to use an actor’s voice, image, and likeness in any medium.
Even a company like Synthesia, which claims to ethically develop AI avatars and avoid harmful content, cannot guarantee that all issues will be caught during content moderation.
British actor Connor Yeates told AFP that his video was used to promote Ibrahim Traore, who seized power in Burkina Faso in a 2022 coup, a use that violated Synthesia's policies.
Alexandru Voica, head of corporate affairs at Synthesia, explained that a few videos slipped through their content moderation system three years ago due to gaps in enforcement regarding factually accurate but controversial content or videos with exaggerated claims and propaganda.
Synthesia provides options for actors to opt out and promises to safeguard their interests. For those in the acting profession who fear that AI might take their jobs, AI avatars could be a way to boost their income and secure their livelihoods.
Since 2023, the company has sold its avatars to around half of the Fortune 500 companies, and that number has now increased to 70%, according to Voica speaking to Ars. For brands, using AI avatars helps cut down on production time and costs for marketing, making it unlikely that the demand will decrease.
Voica noted that “the typical Synthesia user is an employee from a Fortune 500 company,” which is quite different from users of less-regulated AI apps that often produce low-quality, misleading, or fraudulent marketing content.
Voica explained that harmful content is treated as a serious security issue by Synthesia, while many other AI apps view it as part of their business model. However, for actors who regret signing contracts that prevent them from removing harmful videos, the financial gain may not seem worth it anymore.
Yeates received about $5,000 for a three-year contract with Synthesia, a decision he made out of necessity since he didn’t come from a wealthy background. He likely didn’t foresee his likeness being used for propaganda, a situation that even Synthesia didn’t expect.
Some actors may dislike their AI avatar videos but find the financial compensation sufficient to outweigh their concerns. Coy admitted that money was a key factor in his choice, and although he felt it was “surreal” to be portrayed as a con artist selling a bleak future, he acknowledged that “it’s decent money for little work.”
To improve conditions for actors, Synthesia is creating a talent program that aims to involve actors in decisions regarding AI avatars.
According to a blog post from the company, “By involving actors in decision-making processes, we aim to create a culture of mutual respect and continuous improvement.”
The company also stated that it has implemented security features to reduce misuse and protect actors from harmful content. Importantly, these measures include restricting the use of AI avatars in paid advertising on social media or broadcast platforms.
Perhaps most crucially, Synthesia allows actors to opt out at any time if they no longer want their AI avatars to appear in new videos. While this option may offer some relief, it won’t affect videos that are already online.
Currently, Synthesia claims to be the “only enterprise-focused AI video platform to proactively monitor and thoroughly prevent the creation of harmful content,” with policies designed to ensure that the likenesses of their talent are not misused.