When Erik Passoja voiced a Belgian geneticist in a “Call of Duty” game nearly 10 years ago, he didn’t expect that his face would also pop up on an entirely different character.
But following the game’s release in 2014, a friend told Passoja that his son had shot someone who looked just like him in the player-versus-player version of the game. The friend sent over a screenshot of a buff, armored Passoja – but with shorter hair.
“I remember feeling violated,” said Passoja, a Los Angeles resident who also has worked on games including “Diablo 4” and “Red Dead Redemption 2.” To Passoja, the experience highlights what’s at stake as actors union SAG-AFTRA strikes and calls for better protections around the use of artificial intelligence, among other demands.
Protections for game voice actors fall under a separate SAG-AFTRA contract for interactive work – striking actors can still do voice work for games – but that contract, negotiated in 2017, did not include AI.
And many voice actors say that they are looking at the outcome of the current strike, which began last month, as a bellwether for their own future. Society can’t stop AI technology from advancing, actors said, but workers can secure contracts that would require their consent to reproduce their voice or likeness and compensate them when that does happen.
A new interactive contract representing about 2,500 performers is being negotiated, members of the SAG-AFTRA negotiating committee said.
The interactive media agreement covers “off-camera (voiceover) performers, on-camera (motion capture, stunt) performers, stunt coordinators, singers, dancers, puppeteers, and background performers,” according to SAG-AFTRA.
Although the technology to reuse a likeness or modify a voice has existed for years, actors say that AI ups the ante because it can scrape more information more efficiently and potentially turn it into a plausible clone of an actor, combine actors’ work or pass as a new, ersatz artist.
The debate over the use of AI comes as nearly half of Americans – 45% – are concerned about the effect artificial intelligence will have on their own line of work, compared to 29% who are not concerned, according to an August poll for The Times conducted by Leger, a Canadian-based polling firm with experience in U.S. surveys.
Voice actors – in gaming and beyond – need to protect themselves from AI, Passoja said, because repurposing his likeness in “Call of Duty” was “completely unethical, completely immoral, and yet completely within the bounds of my contract.” A performer’s “product” – their face, voice and movement – should be digitally watermarked and digitally tracked, he said, so actors can be fairly compensated when that product is used.
“It needs to start yesterday,” he said, “because of what happened to me.”
Activision Blizzard, the Santa Monica-based parent company of “Call of Duty” publisher Activision, declined to comment. A spokesperson for Electronic Arts, the Redwood City-based gaming company whose titles include “Apex Legends” and “It Takes Two,” also declined to comment.
Zeke Alton, a voice actor and member of the SAG-AFTRA interactive contract negotiating committee, said that actors and writers aren’t asking for an end to AI technology. Instead, he said, they want a baseline agreement that if their work is used, a company will ask for consent and pay for that use, as well as provide transparency around the use of data for training or machine learning.
“If we don’t put at least hard cases around the edges to ensure that a career remains, the technology will catch and surpass (us),” he said.
If left unregulated, AI-generated voices will take over the work of voicing nonplayer characters, he said. The problem, Alton said, is that this type of work is where most actors “cut their teeth” and learn how to become lead voice actors.
“Those experience-building jobs will be gone because they’ll be done by the algorithm,” said Alton.
He added that even if the union does secure AI protections, the nonunion world remains unprotected. A growing segment of the video game industry – particularly quality assurance testers – is pushing to unionize.
“You see it across social media all the time that the video game companies are using their nonunion characters to feed machine learning algorithms that will then be able to do the characters without actually hiring humans,” said Alton. “In the very near future, that may mean that your very first job is your last job.”
Voice actors have recently noticed audio from video games being stripped out, modified and posted on websites, either to be inserted into other games or to make characters say lines the actor never recorded.
“This is dangerous on multiple levels,” said Tim Friedlander, president and founder of the National Assn. of Voice Actors. “There is one voice actor that we’re working with who has found a piece of audio from a game she did years ago in a new mobile game. And it’s a direct lift or it’s a clone of her voice.”
Last month, voice actors spoke out against “mods” – alterations that players or fans make to a game’s content – of the popular role-playing game “Skyrim” that used AI-generated voices based on actors’ performances, cloning them for pornographic purposes.
Julia Bianco Schoeffling, chief operating officer and casting director at the Halp Network, which connects clients with onscreen and offscreen talent, said that AI is a powerful tool as long as it doesn’t replace human performance.
Her goal is to figure out how to compensate talent fairly as AI progresses. Unions, she added, are tasked with the difficult job of putting guardrails on a technology that has yet to be regulated.
“None of the people working on the art want to see the art gone,” she said. “But I think there’s an inevitability to the tools, because they do increase capability so significantly.”
Many voice actors believe that unchecked AI use could pose an existential threat not only to their careers but to the acting industry as a whole.
Because the depth of narratives varies across game genres, the amount of work that goes into voicing a lengthy role-playing title may differ from the time spent recording lines for a shooting game, said voice actor Grace Rolek.
Live-service games, which are constantly updated and expanded, regularly add new content, said Rolek.
She often goes into the studio for short sessions. That space in particular is one where she sees the potential for greater AI use. If a character’s line is slightly tweaked, Rolek said, it’s possible that a studio could run the original voice through an AI system and get a similar performance with the new dialogue.
Voice actors could lose the opportunity to go back into the studio and get paid for that type of work unless they sign licensing agreements that would pay them for continued use of their voice, she said.
“I think that having some sort of consistent standard across the industry is really important, especially for something like AI that is only going to get better,” said Rolek. “It’s not quite there yet to be like, ‘Oh, can you have a breath in between words?’ Or ‘can you put the stress on this word instead of that word?’ But I do feel like that’s only a matter of time.”
Brock Powell said that their voice has been the target of generative AI. In addition to spoken dialogue, the actor provides creature voices such as realistic animal, monster and zombie sounds.
Friends and fans have contacted Powell, offering congratulations on a role in a game. But Powell was confident that they hadn’t worked on those titles.
Powell said the voice work was recognizable because they create a guttural “double roar” sound that is “rather hard to replicate.” Powell considers it a “sonic signature” and believes that source material they provided both in auditions and in sessions likely was being used to create new performances without the actor’s permission or compensation.
“I’m not just talking about they lift an edit and then pitch it down,” said Powell, who has worked on such titles as “Minecraft Legends” and “Genshin Impact.” “I’m talking about sounds are elongated or manipulated in a way that is not performed initially.”
Voice actors who create a library of specialty sounds are especially vulnerable, Powell added, because it can be impossible to prove that a sound like a roaring monster was voiced by a specific actor.
“The soul of the industry is on trial here,” Powell said. “Are we interested in making stuff to tell great stories and move people? Are we interested in just making content and keeping people distracted?” – Los Angeles Times/Tribune News Service