Human Models Horrified to Discover Their Faces Are Being Used for AI Propaganda


“People will think I am involved in the coup.”

Bad Surprises

Synthesia, a text-to-video AI company with a valuation of over one billion dollars, claims that its tech allows users to “create studio-quality videos with AI avatars” as easily as they can throw together a slide deck.

The company’s clientele is a wild mix, ranging from media stalwarts like Reuters and global accounting giant Ernst & Young to authoritarian regimes — a reality that’s come as a terrible surprise to the human models whose faces Synthesia’s AI models are trained on.

As The Guardian reports, several human models who posed for Synthesia have been horrified to discover that their likenesses have been used in AI-powered propaganda clips generated by groups linked to authoritarian states like China, Russia, and Venezuela.

“I’m in shock, there are no words right now,” Mark Torres, a creative director based in London who modeled for Synthesia, told The Guardian after viewing one of the clips for the first time. “I’ve been in the [creative] industry for over 20 years and I have never felt so violated and vulnerable. I don’t want anyone viewing me like that.”

“Just the fact that my image is out there, could be saying anything – promoting military rule in a country I did not know existed,” Torres added. “People will think I am involved in the coup.”

Fine Print

It’s a particularly hot-button topic: California passed two new bills last month making it illegal to use an AI-generated digital replica of an actor’s likeness or voice without their explicit consent.

The use of generative AI played a major part in last year’s Screen Actors Guild and Writers Guild of America strikes, with performers eventually reaching agreements that included new rules surrounding the use of AI.

Synthesia’s models, however, were seemingly lured in by what looked like a good deal: thousands of dollars for a single day of filming.

Actor and one-time Synthesia model Dan Dewhirst, whose likeness was discovered last year by Semafor being used in AI-generated Venezuelan propaganda, told The Guardian that he continues to worry about the AI videos’ potential impact on his career.

“I may have lost clients,” Dewhirst told The Guardian, adding that when he found out about the clips, he “was furious.”

“It was really, really damaging to my mental health,” he continued.

Synthesia, for its part, has maintained that it’s protected by the fine print, with a spokesperson telling The Guardian that the company explains its “terms of service and how our technology works so they are aware of what the platform can do and the safeguards we have in place” at the start of its “collaboration” with actors and models.

“Though our processes and systems may not be perfect,” the spokesperson added, “our founders are committed to continually improving them.”

More on AI and propaganda: ISIS Affiliates Using AI to Generate Propaganda Promoting Terrorism
