
South Korean actor Simon Lee was stunned when he saw his likeness—at times as a gynecologist or a surgeon—being used to promote questionable health cures on TikTok and Instagram.
He is one of scores of people who have licensed their image to AI marketing companies, and then ended up with the unpleasant surprise of seeing themselves featured in deepfakes, dubious adverts or even political propaganda.
“If it was a nice advertisement, it would’ve been fine to me. But obviously it is such a scam,” he told AFP, adding that the terms of his contract prevented him from getting the videos removed.
He was left with a digital clone of himself advocating lemon balm tea for weight loss or ice baths to fight acne.
AI technology—cheaper than filming actors, but more realistic than an entirely AI-generated avatar—allows firms to build catalogues of digital models to appear in videos that mostly promote products or services.
Solene Vasseur, a digital communications and AI consultant, said this new form of advertising was fast and cheap compared to a real-life production.
Using avatars is also a way for brands to “show that they’re comfortable with the new tools.”
The method is quick and straightforward: half a day’s shooting, a green screen and a teleprompter.
The actor has to display a range of emotions, which lets the artificial intelligence make the avatar say all sorts of things, in any number of languages.
“The performance in terms of the expressiveness of a real human—voice, facial movements, body language… is still superior to anything AI can generate right now,” said Alexandru Voica, head of corporate affairs at Synthesia, a UK-based industry leader.
To make a video, the platform’s customers just have to select a face, a language, a tone—such as serious or playful—and insert the script.
The whole process comes at a modest price: the ultra-basic version is free, while the pro version costs a few hundred euros.
“Am I crossing a line?”
The contracts offer up to a few thousand euros, depending on duration and how well a person is known.
But they can be filled with legal jargon and sometimes abusive clauses, and in their rush to make quick cash, some people have found it hard to fully understand what they were signing up for.
Such was the case for Adam Coy, a 29-year-old actor and director based in New York, for whom selling his image was a financial decision.
In October 2024, he signed over the rights to his face and voice to MCM for $1,000 (885 euros), granting the company the use of his avatar for one year.
“If I was more successful, I feel like I would maybe be able to have the ethical conversation with myself,” he said. “Is this right, or am I crossing a line by doing this?”
A few months later, his partner’s mother came across videos in which his digital clone claimed to come from the future and announced disasters to come.
None of this is forbidden by the contract, which only prohibits use for pornographic purposes, or in connection with alcohol and tobacco.
Coy described the experience of watching his avatar as “surreal” and said he initially thought he would be an animated avatar.
But “it’s decent money for little work,” he added.
Propaganda
British actor and model Connor Yeates, who signed a three-year contract with Synthesia for 4,600 euros in 2022, also got an unpleasant surprise.
At the time, he was sleeping on a friend’s sofa, he told British newspaper The Guardian in 2024.
“I don’t have rich parents and needed the money,” he said.
This seemed like a “good opportunity.”
But he then discovered that his image had been used to promote Ibrahim Traore, the president of Burkina Faso who took power in a coup in 2022.
“Three years ago, a few videos slipped our content moderation partly because there was a gap in our enforcement for factually accurate but polarizing type of content or videos with exaggerated claims or propaganda, for example,” said Voica.
The firm said it has introduced new procedures, but other platforms have since appeared, some applying much less stringent rules.
An AFP journalist was able to make an avatar from one of these platforms say outrageous things.
“The clients I’ve worked with didn’t fully understand what they were agreeing to at the time,” said Alyssa Malchiodi, a lawyer who specializes in business law.
“One major red flag is the use of broad, perpetual and irrevocable language that gives the company full ownership or unrestricted rights to use a creator’s voice, image and likeness across any medium,” she said.
Contracts often contain clauses considered abusive, Malchiodi said, such as worldwide, unlimited, irrevocable exploitation, with no right of withdrawal.
“Technology is evolving faster than courts or legislatures can respond,” the lawyer said.
“These are not invented faces,” she said, calling for more caution.
© 2025 AFP