This kind of legalese can be hard to parse, particularly when it deals with technology that is changing at such a rapid pace. But what it essentially means is that “you may be giving away things you didn’t realize … because those things didn’t exist yet,” says Emily Poler, a litigator who represents clients in disputes at the intersection of media, technology, and intellectual property.

“If I was a lawyer for an actor here, I would definitely be looking into whether one can knowingly waive rights where things don’t even exist yet,” she adds. 

As Jessica argues, “Once they have your image, they can use it whenever and however.” She thinks that actors’ likenesses could be used in the same way that other artists’ works, like paintings, songs, and poetry, have been used to train generative AI, and she worries that the AI could just “create a composite that looks ‘human,’ like believable as human,” but “it wouldn’t be recognizable as you, so you can’t potentially sue them”—even if that AI-generated human was based on you. 

This feels especially plausible to Jessica given her experience as an Asian-American background actor in an industry where representation often amounts to being the token minority. Now, she fears, anyone who hires actors could “recruit a few Asian people” and scan them to create “an Asian avatar” that they could use instead of “hiring one of you to be in a commercial.” 

It’s not just images that actors should be worried about, says Adam Harvey, an applied researcher who focuses on computer vision, privacy, and surveillance and is one of the co-creators of Exposing.AI, which catalogs the data sets used to train facial recognition systems. 

What constitutes “likeness,” he says, is changing. While the word is now understood primarily to mean a photographic likeness, musicians are challenging that definition to include vocal likenesses. Eventually, he believes, “it will also … be challenged on the emotional frontier”—that is, actors could argue that their microexpressions are unique and should be protected. 

Realeyes’s Kalehoff did not say what specifically the company would be using the study results for, though he elaborated in an email that there could be “a variety of use cases, such as building better digital media experiences, in medical diagnoses (i.e. skin/muscle conditions), safety alertness detection, or robotic tools to support medical disorders related to recognition of facial expressions (like autism).”

When asked how Realeyes defined “likeness,” he replied that the company used the term—like “commercial,” another word with an assumed but not universally agreed-upon definition—in a manner that is “the same for us as [a] general business.” He added, “We do not have a specific definition different from standard usage.”
