They sold their faces to an AI, but hadn't planned on this

- Jackson Avery

Real faces, but words generated by AI: on social networks, users sell their image to marketing companies specializing in AI, only to find it used, sometimes without their knowledge, in dubious deepfakes spreading disinformation or political propaganda.

Less expensive than a real shoot with actors, yet more realistic than an avatar generated entirely by AI, the technology makes it possible to build large catalogs of digital “models” used to animate videos, most often to promote products or services.

According to Solène Vasseur, a consultant in digital communication and AI, it is a new form of advertising that is “fast” and “inexpensive” compared with traditional production. Using these avatars also allows brands to project an image of modernity and “show that they are comfortable with new tools”.

The process is quick: half a day of filming against a green screen, facing a teleprompter. The actor-for-a-day must act out a range of emotions. Artificial intelligence then makes them say anything at all, in any number of languages.

“The performance of a human being, in terms of voice, body language or micro-expressions, remains, for the moment, superior to anything AI can produce,” explains Alexandru Voica, head of corporate affairs at Synthesia, one of the sector’s leading companies, based in the United Kingdom.

To create a video to their liking, the platform’s client selects a face, a language and a tone (serious, playful, etc.), then enters the desired text. All at a low price: from a free ultra-basic version to a few hundred euros for a “pro” version, for example.

“Surreal”

But between legal jargon, sometimes abusive clauses and quick money, some struggle to fully grasp what they are committing to by selling their image.

South Korean actor and model Simon Lee paid the price. In videos on TikTok and Instagram, his avatar appears sometimes as a surgeon, sometimes as a gynecologist, touting pseudo-remedies such as lemon balm for weight loss or ice baths to fight acne. The videos serve above all as a pretext to promote a hair product sold on Amazon.

“It’s upsetting. If it were a legitimate advertisement, it wouldn’t have bothered me. But this is clearly a scam,” he told AFP. He would like the videos taken down, but his contract stipulates that his image “can be used by third parties”.

The contracts pay up to a few thousand euros, depending on how long they run and how well known the person is.

For Adam Coy, a 29-year-old actor and director in New York, selling his image was an economic choice. In October 2024, he sold the rights to his face and voice to the company MCM for $1,000, authorizing the use of his avatar for one year.

“If I were more successful, I might have this ethical conversation with myself,” he explains.

A few months later, his partner’s mother came across videos in which his digital double claims to come from the future and announces coming disasters.

None of this is prohibited by the contract, which bars only pornographic uses or those linked to alcohol and tobacco. “It’s quite surreal. I don’t know why, but (in signing the contract) I imagined becoming a kind of cartoon character,” he says.

But “it’s a decent sum for little work,” he admits.

Propaganda

An unpleasant surprise, too, in 2022 for English actor and model Connor Yeates, who signed a three-year contract with the company Synthesia for 4,600 euros.

At the time he was sleeping on a friend’s sofa, he told The Guardian in 2024. “I don’t have rich parents, I needed the money and it seemed like a good opportunity.” He then discovered that his image had been used for political purposes, notably to promote Ibrahim Traoré, the president of Burkina Faso brought to power by a coup in 2022.

“Three years ago, some videos slipped past our moderation because our system was not well equipped to handle polarizing or exaggerated content,” acknowledges Mr. Voica of Synthesia.

The company says it has since put new procedures in place. But other platforms have emerged that apply far less strict rules, as an AFP journalist observed by getting an avatar available on one of them to utter outrageous claims.

Many people “do not grasp the true scope of what they are signing,” warns Alyssa Malchiodi, a lawyer specializing in business law. They “discover that they have granted extensive rights, sometimes in perpetuity, with no control over the content generated”.

Contracts often contain clauses considered abusive: worldwide, unlimited and irrevocable exploitation, with no right of withdrawal.

“The law has not kept pace with the speed of AI development,” the lawyer laments. “These are not invented faces,” she insists, but real people whom AI does not erase but exposes. “You have to be careful.”

Jackson Avery

I’m a journalist focused on politics and everyday social issues, with a passion for clear, human-centered reporting. I began my career in local newsrooms across the Midwest, where I learned the value of listening before writing. I believe good journalism doesn’t just inform — it connects.