Here, you can find frequently asked questions about the use of AI across the performing arts and entertainment industries.
You can also watch below a training session with Equity's AI Working Party held on 22 February 2024. Hosted with sector experts, the session covers the latest state of the technology, an overview of the process behind motion and voice capture, and a closer look at generative AI in the post-production process.
There is no single definition of AI. The UK government defines AI as: “technologies with the ability to perform tasks that would otherwise require human intelligence”.
The goal of AI is to create machines that can learn, reason, carry out tasks and make decisions in ways similar to humans. AI deals in probabilities: through training on vast amounts of data, it produces the closest approximation its algorithms can manage.
AI encompasses a wide range of technologies, including machine learning, deep learning, neural networks, expert systems, and robotics.
Generative AI is a subfield of artificial intelligence that can create new content, such as text, images, audio, or code, by learning the patterns and structure of the input data. Generative AI algorithms can generate outputs that resemble human-created content or synthetic media.
The use of AI is growing across different sectors, including voiceover, film, TV, modelling, music, dance, and gaming.
Whilst the use of AI in the production of audio and audiovisual content is not yet widespread, the speed at which the technology is evolving makes this a not-so-distant reality. In fact, some in the industry have claimed that we are not far off from seeing a filmmaker release a feature-length film that is produced entirely with AI, including the screenplay, visuals, actors and music.
Equity is particularly focused on “performance cloning”, which is the process of creating a synthetic performance by recording, using or reproducing the performance, voice or likeness of an Artist by machine learning systems and equivalent technology. There is a wide range of applications that include:
- Text-to-voice or image-to-voice translation or generation
- Interactive digital humans or digital avatars capable of audio-visual interaction with users
- Manipulation of existing identities in audio-visual content such as deepfakes
AI cannot replicate the creativity, emotion and authenticity that performers bring. However, AI can be a tool that assists the creative process, with humans making the final decisions. But in some areas of entertainment work, AI has started to replicate certain aspects of performers' roles, a trend likely to increase as the technology develops.
If AI is applied ethically and responsibly in collaboration with workers and their trade union, it has the potential to positively impact performers. For example, AI could allow performers to appear in multiple productions around the world over a single period, boosting income levels. AI could increase accessibility to the labour market for deaf and disabled performers, whilst enhancing the safety of working as a stunt performer. AI could also aid dubbing and automated dialogue replacement by matching an actor's mouth and facial movements to the dialogue they are speaking.
Job displacement - 65% of performers we surveyed thought the development of AI technology posed a threat to employment opportunities in the performing arts sector. This figure rose to 93% for audio artists.
Pay - The one-off payments offered to performers who engage in AI work often do not reflect the fact that their image, voice or likeness may be used forever and on thousands of different projects.
Rights - Many generative AI platforms are using original creative work without a licence to create new material. This is infringing on our members’ intellectual property and legal rights.
Transparency - Performing artists often do not know where or how their professional contribution is being exploited due to unclear contract provisions, unspecified usage, and a lack of auditing.
Awareness - 4 out of 5 performers we surveyed didn’t have a full understanding of their rights before signing a contract for AI work.
Poor industry practices - The arts are already a very precarious sector, with performers often asked to sign away all of their rights in perpetuity for low levels of pay. By adding AI into the mix, Artists are at an even greater risk of seeing their performance or work repurposed to generate new content without their express consent or adequate remuneration.
NDAs - Performers are being asked to sign excessive Non-Disclosure Agreements without any knowledge of what the job entails, which is common in other areas of entertainment work.
Harmful content - It is estimated that 96% of deepfakes are pornographic and depict women, and that 99% of deepfake subjects are from the entertainment industry.
Equality - Rapid technological developments have the potential to exacerbate existing inequalities in access to work, income inequalities and wage gaps for those most marginalised across the industry, such as deaf, disabled and neurodivergent artists.
Creative expression - The puppetry nature of performance synthesis technologies, such as the creation and use of digital replicas, removes agency from the performer. If we reduce the opportunities to perform in person in recorded media, the incentive to train professionally and move into the profession is reduced significantly.
Collective bargaining - This technology is challenging for performers' unions across the globe, as this innovation is not covered by historic collective bargaining agreements.
UK law - There are no UK laws explicitly written to regulate AI. This includes the Copyright, Designs and Patents Act 1988, which provides the legal framework for performers' intellectual property but was not designed with AI in mind. As a result, some performers are being exploited.
Government priorities - The government wants to make the UK a global AI superpower by “harnessing the opportunities and benefits” the technology presents. As it stands, it does not plan to introduce new legislation to regulate the use of AI, instead adopting a “light touch” approach with a voluntary code of practice for the industry.
There are no UK laws explicitly written to regulate AI. However, the law does give performers the exclusive right to reproduce and share content you have created (which is protected by copyright) and performances you have given (which are protected by performers' rights). The law also treats your face, your voice and your likeness as sensitive personal data, because each can identify you personally. As the data subject, the law grants you the right to consent to the collection and processing of your personal data, and the right to request that your personal data be erased from records held by others. To learn more about your rights when it comes to AI, please read our guidance.
The union is working hard to ensure that Equity members understand the new and evolving landscape that you are working in. Through our programme of education work and AI Toolkit, we want performers to be informed when choosing to engage in performance work involving AI. We hope you will have the tools you need to negotiate ethical terms and conditions and avoid signing exploitative and unfair contracts.
Our next priority is to integrate ethical terms and conditions for AI work into working practices through collective bargaining. The strength of our industry is built on the strong and equitable collectively bargained agreements we have with engagers, such as the BBC, ITV, SKY, PACT, Netflix, Disney+, and Apple+. However, these agreements need strengthening so that performers have the protections you need when working under an Equity contract. We are also exploring new partnerships with digital cloning companies that operate outside of the agreements. Ultimately, we want to ensure that performers have the contractual right to informed consent, control, fair compensation and transparency.
Equity members achieved a huge victory in getting the government to abandon its damaging data mining exemption as part of our groundbreaking campaign to Stop AI Stealing the Show.
Going forward, we need the government to step in and ensure that AI system developers and users are held accountable for complying with existing legal frameworks, including intellectual property, data protection and privacy laws. As a longer-term priority, we are campaigning for wholesale reform of our intellectual property framework, including new synthesised performance rights, new image rights, and improved moral rights.
As we move forward with this innovation, we are advocating for an artist-centred approach to safeguard the workforce. Equity’s AI Vision Statement outlines eight core principles for the industry to adopt when engaging artists for performance cloning. We believe artists have the right to:
1. consent (and not consent) for past, current and future performances
2. license their performance or likeness on a personal, non-exclusive, time-limited basis
3. be identified and object to derogatory treatment of their work and performance
4. transparent information
5. meaningful consultation
6. fair and proportionate remuneration
7. equal access and treatment
8. be engaged under a collectively bargained agreement