
Actor’s experience of AI gone wrong

When international media reports exposed the ‘fake news’ his avatar was peddling for a Venezuelan propaganda campaign, Dan Dewhirst found himself exposed and with few protections. He turned to Equity.

The AI news anchor avatar used by the Venezuelan government to spread misinformation
  • Actor’s AI avatar used in political propaganda campaign against his will 
  • Equity union is calling for protections in law for performers 

Equity member Dan Dewhirst was excited when he was offered a job to become a ‘performance’ avatar, powered by artificial intelligence (AI). He expected rights, safeguards and limitations similar to those he was used to in stock video and photography work, but a lack of regulation and the engager’s unwillingness to amend contract clauses left him vulnerable. Little did he know how badly events would turn out. When international media reports exposed the ‘fake news’ his avatar was peddling for a Venezuelan propaganda campaign, Dan found himself exposed, violated and with few protections. He turned to Equity for help.

Dan’s story - outlined in full below - is an extreme but not unique case. 

The use of AI has grown rapidly across the entertainment industry in recent years, from automated audiobooks and voice assistants to deepfake videos and text-to-speech tools. But UK law has failed to keep pace, and this is leading to performers being exploited.

As a union, Equity is fighting to change that. Our aim is to get the government to recognise the threat posed by AI, update the law, and get industry onside so that artists can feel safe in the knowledge that their past and future intellectual property is protected.

Find out about our Stop AI Stealing the Show campaign

John Barclay, Equity assistant secretary, said: “AI isn’t the future, it’s the present. It’s here right now and performance artists are increasingly working in this growth area, providing movement and voice recordings for Gen AI content. Engagers of our members are slow to respond to the threat not only to our members’ rights, but also to their own intellectual property, and this must change.

“Whether working in live performance, recorded media or new technology arenas, basic employment and human rights apply. Companies can’t have carte blanche to use and abuse performers’ likeness - their faces, voices and bodies - without some basic protections and limitations. This is a rapidly growing area and must be urgently addressed by government, business and players across the creative industries.”  

The industry needs cast iron safeguards to avoid my experience becoming endemic to our business. 

Dan Dewhirst, actor and Equity member, said: “What happened is awful, but the small comfort I take is that by telling my story I can help effect positive change. The industry needs cast iron safeguards to avoid my experience becoming endemic to our business. 

“I want other artists to be able to take these jobs without fear of being exploited or given false assurances. Having Equity behind me gives me the strength to take on this challenge.”

Find advice and support on working in AI in our AI toolkit

Dan’s story 

Credit: Edo Brugue. 

Dan Dewhirst is a full-time actor with over a decade’s experience in TV, film and commercials.  

In 2021 Dan took up a contract with a company called Synthesia. It was set to be an exciting job in cutting-edge new technology and, despite some reservations, Dan could never have known how badly it would turn out. Less than two years later, an AI avatar of Dan was being used to peddle fake news and propaganda in a highly volatile Venezuelan political social media campaign. And there was nothing he could do about it.

It was in July 2021 that Dan was offered a job, via his agent, to be one of the first actors to be made into an AI avatar. It was exciting, and at the time it felt like a great opportunity. It was not long after Covid, and work had been scarce, so the fee was welcome money to pay the bills. 

Although it was a new experience, Dan could draw comparisons with motion capture and stock video and image work. 

Dan had to submit a self-tape audition (recording your own audition and sending it in to the casting agents). He was then invited to a studio in East London to see if he would make a ‘good avatar’. The session involved reading a monologue which made little sense in itself; the purpose was more about capturing mannerisms, intonation and expressions. Dan was captured from different camera angles and in different costumes, including as a doctor. “It was like being a sort of Action Man,” Dan says. “I’m now very nervous about ‘Dr Dan’, given what happened with the other avatar.”

Dan was paid a BSF (basic studio fee) for the day in the studio and waited to hear the verdict. He didn’t have to wait long to be told he had passed the ‘good avatar’ test. He was given the good news along with a set of paperwork, including his contract and release, or ‘buy-out’, terms.  

At this point Dan became concerned about the wide-ranging clauses in the contract, so he contacted his agent and his union, Equity. 

Always check your contract

At this point in the story, Dan wants to urge all performers, dancers, actors, artists and creatives: check your contract carefully. Don’t just leave it to your agent.

Take a look at the detailed advice on AI contracts in our AI toolkit. If you have any questions get in touch with us for further advice. 

Check our AI toolkit for advice

There are certain things you should always look for, and this is the time to do it. Safeguards and reasonable buy-out clauses should be in black and white: for example, don’t sign away your performance in perpetuity or across all media types, and make sure the fee reflects the work and the value all parties are getting.

Dan has experience working as a model, actor and dancer, so he was familiar with the standards he would expect in stock video and photography – somewhat comparable fields of work. Such work usually includes safeguards and exclusions covering what the material can’t be used for, for example anything illegal, misleading or political. This contract had none of that in place.

A question crept in for Dan – what might an avatar version of him be made to say that the real Dan wouldn’t?

He discussed his concerns with Equity and his agency. The union echoed his concerns and said ‘yes you’re right to be concerned about this, don’t do it on these terms’. Dan got a lawyer to help draft changes to the contract. 

Synthesia wasn’t willing to amend the contract. But Dan’s agent pointed to their booking confirmation, which included stipulations that the avatar couldn’t be used for illegal or unsavoury purposes. They also pointed to the Synthesia handbook, which contains ethical codes.

With these two assurances, albeit not written into his contract, Dan decided he had enough comfort to take the job on. So he signed the contract.

Dan’s avatar launched in late 2021. He received a thank you from Synthesia, delivered in the form of his own avatar, an amusing but also slightly unsettling experience. 

Dan’s avatar starts to be used, and people send him messages and links. It’s ‘low-level’ stuff, such as corporate presentations and B2B content – not work Dan would be chasing himself, but he doesn’t have a problem with ‘avatar Dan’ appearing in it.

Then something different happens. 

A friend sends a link to a CNN report dated 30 March 2023, asking: “Is this you?” It’s 3 April 2023 and the bottom falls out of Dan’s world. His AI avatar, the one recorded for Synthesia and with apparent assurances over its use, is front and centre posing as a TV news anchor spreading fake news pushed by the government of Venezuelan dictator Nicolas Maduro. His avatar is telling viewers that the economy is doing well, among a string of other inaccurate and contested narratives.

My stomach dropped. Everything I was worried about has happened but a thousand times worse.

The impact on Dan is huge. “I couldn’t believe it,” he says. “My stomach dropped. Everything I was worried about has happened but a thousand times worse. I’m literally the face of fake news. My avatar is being used for misinformation, political propaganda and on TV - all things not allowed under the assurances I was given. No one wants to see themselves, against their will, spout the rhetoric of a controversial political leader! And it’s literally my face. It’s the ultimate defamation. I don’t want to be associated with these things. I care about my work. You need to have integrity in this business.”

For Dan, it was an “I hate to say I told you so” moment.  

Dan contacts Equity and his agent. The agent suggests he gets support from Equity. Equity provides legal advice and industrial representation to Dan, both of which are ongoing at the time this story is told.

Stop AI Stealing the Show

Join our campaign to strengthen performers' rights in response to the rise of artificial intelligence across the entertainment industry.

AI Toolkit

Everything you need to know about AI and work in the arts and entertainment industry

