Deepfakes in the entertainment industry

June 2022

By Vejay Lalla, Adine Mitrani and Zach Harned, Fenwick, New York and Santa Monica, USA

Ever since the first Terminator movie was released, we have seen portrayals of robots taking over the world. Now we are at the beginning of a process by which technology—specifically, artificial intelligence (AI)—will enable the disruption of the entertainment and media industries themselves.

From traditional entertainment to gaming, we explore how deepfake technology has become increasingly convincing and accessible to the public, and how much of an impact the harnessing of that technology will have on the entertainment and media ecosystem.

What is a “deepfake” and why does it matter?

The term “deepfake” refers to an AI-based technique that synthesizes media. This includes superimposing human features on another person’s body—and/or manipulating sounds—to generate a realistic human experience. Actor Val Kilmer lost his distinctive voice to throat cancer in 2015, but Sonantic’s deepfake technology was recently used to allow Kilmer to “speak.” (The actor’s son was brought to tears upon hearing his father’s “voice” again.)

Deepfakes have also been used to break down linguistic barriers, including by English soccer great David Beckham in his Malaria No More campaign. There, deepfakes enabled Beckham to deliver his message in nine different languages. And sometimes deepfakes are used for downright fun, such as the Dalí Lives art installation, which allows visitors to take a “surreal” selfie with Salvador Dalí.

Leveraging deepfakes to enhance a talent’s skillset

Commercial applications of deepfakes currently include hiring both the underlying “deepfake actors” and the individuals whose likeness is used as a “wrapper” (i.e., the visage or likeness portrayed in the content) for the underlying performance. Where the so-called wrapper is a famous personality, this can save the underlying talent hours they would otherwise need to spend on set; that burden can be shifted to the deepfake actor instead. Additionally, such technology allows influencers to create personalized messages for hundreds or thousands of individuals without the need to actually record each message.

These novel applications of the technology do not fundamentally change the nature of talent agreements or the process of acquiring the necessary rights from talent; however, they do introduce new wrinkles that both negotiating parties must consider carefully. For example, control over the use of the talent’s likeness rights is always negotiated in great detail, but talent releases and agreements are unlikely to contemplate the right to use likeness rights as a wrapper to generate a potentially infinite number of lifelike deepfakes. Additionally, clauses relating to moral rights will require careful drafting to address whether a deepfake performance, potentially one over which the talent had no control, can serve as grounds to trigger termination. Talent unions may also have to consider more specifically how this technology is addressed in future industry negotiations.

Finally, there is the open question of whether this technology will help or hurt talent overall. On the positive side, the scalability of allowing an actor to appear in commercials or on websites for e-commerce all over the world (without requiring trips to the studio, learning a new language or improving accent work) could be empowering. For instance, Synthesia recently did this with two commercials featuring rapper and entrepreneur Snoop Dogg. The initial commercial was such a success that the company’s subsidiary wanted to use the same commercial, but with the branding and names switched out. Rather than having to reshoot, Synthesia used deepfake technology to change Snoop Dogg’s mouth movements to match the subsidiary’s name in the new commercial.

On the other hand, the widespread adoption of deepfakes could supplant actors who are not celebrities, leading to job losses or a shift in how the industry hires talent for productions. If it becomes more efficient and otherwise desirable to hire relative unknowns to portray those with celebrity status, there will be fewer opportunities for these actors to become known or “get discovered” in their own right. That could lead to the creation of a caste of deepfake actors who never achieve celebrity status or the ability to monetize their name and likeness.

In 2020, MIT’s Center for Advanced Virtuality launched a new digital storytelling project to educate the public about deepfakes and show how convincing they can be. With the help of a deepfake actor, the team created a “complete” deepfake (manipulated audio and video) of US President Nixon delivering the real contingency speech written in 1969 in the event the Apollo 11 crew were unable to return to earth. (Photo: ©MIT/Halsey Burgund)

Incorporating celebrity deepfakes in digital content

Individuals have also leveraged celebrity deepfakes on social media platforms, further highlighting the pervasiveness (and accuracy) of the underlying technology. In early 2021, a Belgian digital AI artist worked with a Tom Cruise impersonator to generate very realistic videos of “Tom Cruise” on TikTok under the account @deeptomcruise. Those videos featured “Tom Cruise” partaking in quirky activities, from falling and telling a Soviet Union joke in a retail store to performing industrial clean-up services, and attracted hundreds of thousands of views. Also, a deepfake of Harry Styles demanding more strawberries in a musical ode to his song Watermelon Sugar went instantly viral on TikTok last year.

An individual or business that would like to create a celebrity deepfake for media content should carefully consider with an attorney whether applicable law permits it to do so. Key legal questions include whether the content is a protected form of free speech (e.g., a parody), whether the celebrity’s rights of publicity have entered the public domain and whether a fair use defense to a copyright infringement claim is available. Otherwise, as in all other cases, consent is likely required to use the talent’s likeness in this context.

Considering applicable laws

In the United States, the legal landscape for deepfakes has been changing rapidly. An individual or business should consider recent state laws that specifically address synthetic and digitally manipulated media.

For example, in November 2020, New York enacted a law that expressly bans the use of “a deceased performer’s digital replica” in audio-visual content for 40 years after the performer’s death, if that use is “likely to deceive the public into thinking it was authorized.” This could prohibit the use of deepfakes in instances such as the Anthony Bourdain documentary Roadrunner. There, controversially, the film’s director leveraged deepfake technology to generate three lines that brought Bourdain’s “voice back to life” in order to complete the production following his death, despite the celebrity chef’s widow, Ottavia Bourdain, asserting that she did not give permission for such use.

On the political front, Texas enacted a law in September 2019 that banned disseminating deceptive “deepfake videos” intended to damage candidates or influence a voter base within 30 days of an election. The following month, California passed a similar law but specified that the period at issue is within 60 days of an election. Further, the platforms that host deepfakes will also need to consider compliance concerns regarding claims of deception.

Sometimes deepfakes are used for fun. Dalí Lives is a groundbreaking AI experience at the Dalí Museum in Florida, USA. It uses machine learning to create a version of Dalí’s likeness in the present day, which appears on a series of interactive screens. Visitors can even take a “surreal” selfie with the master. (Photo: Courtesy of the Dalí Museum in St. Petersburg, Florida, USA)

Augmenting video game characters with deepfakes

The gaming industry is another natural arena for disruption by deepfakes, particularly with respect to avatars. A key premise of many games is a player assuming the role of a character, such as Luke Skywalker or Princess Leia from Star Wars. But an even more immersive gaming experience would be not simply controlling Luke or Leia with a gamepad, but also having the avatar track your face and mouth movements, something deepfake technology is making a reality. Further, with deepfake-generated synthetic speech, it is possible to make your voice sound like Luke or Leia, and this has sometimes had unanticipated positive consequences. For example, so-called “voice skins” are enabling LGBT+ people to change their in-game voices, resulting in more pleasant gameplay. That is unsurprising given the Anti-Defamation League’s 2020 finding that more than half of voice chat users are harassed during gameplay and that 37 percent of LGBT+ players are harassed on the basis of their sexual orientation.

Of course, general-purpose technology like this also has the potential to be misused, such as for fraudulent impersonation for financial gain or fraudulent logins to voice-gated security systems.

Deepfake technology will also impact non-player characters (NPCs) as well as your own avatar. Combining impressive natural language generation models such as GPT-3 with gaming deepfakes will give NPCs a virtually limitless ability to converse with your avatar, with convincing synchronized face and mouth movements and without needing to follow specific scripts. Video game developers will need to analyze their existing licensing arrangements with the content owners of these characters and story arcs to determine whether the deepfake use cases are permitted.
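To make the idea concrete, the Python sketch below shows how an unscripted NPC turn might be wired together: a language model generates the reply, a text-to-speech step renders it in the character’s voice, and an audio-driven face model keeps the mouth movements in sync. All three helper functions are hypothetical placeholders standing in for whichever language model, speech synthesis and facial reenactment tools a developer actually licenses; this is an illustrative sketch, not a description of any particular game engine’s API.

# A minimal, illustrative sketch of an unscripted NPC dialogue turn.
# All three helpers are hypothetical placeholders: in practice,
# generate_reply would call a large language model, synthesize_speech a
# text-to-speech engine, and animate_face an audio-driven lip-sync model.

def generate_reply(history: list[str], player_line: str) -> str:
    # Placeholder: a real implementation would prompt a language model
    # with the conversation history and return its generated line.
    return f"(NPC responds to: {player_line})"

def synthesize_speech(text: str, voice_id: str) -> bytes:
    # Placeholder: a real implementation would return audio rendered in
    # the character's licensed "voice skin".
    return text.encode("utf-8")

def animate_face(npc_id: str, audio: bytes) -> None:
    # Placeholder: a real implementation would drive the NPC's facial
    # animation so mouth movements stay synchronized with the audio.
    print(f"[{npc_id}] speaking {len(audio)} bytes of audio")

def npc_turn(history: list[str], player_line: str, npc_id: str = "npc_01") -> str:
    reply = generate_reply(history, player_line)   # unscripted dialogue
    audio = synthesize_speech(reply, npc_id)       # synthetic voice
    animate_face(npc_id, audio)                    # synchronized face and mouth
    history.extend([player_line, reply])           # keep context for the next turn
    return reply

if __name__ == "__main__":
    conversation: list[str] = []
    npc_turn(conversation, "Where can I find the blacksmith?")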

Other potential benefits

In addition to the economic benefits of using deepfakes discussed above, the underlying technology can also be used for social good in digital media. Take, for example, the HBO documentary Welcome to Chechnya, which details the lives of LGBTQ+ activists forced to live in secrecy under threat of execution. To protect the identities of these activists, the documentary used deepfake wrappers, and the director cast as wrappers only people who were themselves LGBTQ+ activists but who resided in countries free from the threat of death due to their sexual orientation. Deepfakes have also been used to create unique and bespoke voices for the millions of people who rely on synthetic speech to communicate.

Practical considerations going forward

As deepfakes continue to permeate various facets of digital media, individuals and businesses seeking to leverage the underlying technology will have to preemptively think through their existing contractual arrangements and navigate applicable law on this topic. Further, individuals who enter into talent agreements should carefully review the terms regarding their rights of publicity to ensure that they have sufficient control over how those rights might be used in conjunction with AI-based technologies. If approached thoughtfully, the development and use of deepfakes can be leveraged for good, both commercially and socially.


