DON’T LET YOUR TALENT GET “(DEEP)FAKED OUT” IN A CONTRACT
Many of us have encountered deepfakes in one form or another. My first encounter was pre-COVID, when I spoke on a panel about their burgeoning presence.
Back then, as well as during COVID, people were experimenting with primitive AI to create deepfakes of talent and other public figures, putting them in situations that were at best funny or harmless, and at worst embarrassing, defamatory, or career-damaging.
One example shown at the panel was a clip of Bill Hader on David Letterman's late-night show. He was doing a spot-on impression of Tom Cruise when his face suddenly morphed into Cruise's face!
The uses being made of “digital replicas” (sometimes referred to as “deepfakes”) and their implications have come a long way since then. So have the protections available to talent against the unauthorized creation and use of their digital replicas, especially in entertainment, brand, and other contracts for their services, where the uses and effects of AI have become commonplace. It's imperative that talent representatives, especially lawyers, be aware of these protections, including new laws and practices, and how to implement them in their talents’ contracts. This has become necessary to protect creators from the use and abuse of their talents by AI, to make sure they maintain control over the fruits of their labors, and to keep them fully and fairly compensated in a fast-changing environment.
One major new protection for talent was the 2023 collective bargaining agreement between SAG-AFTRA and its signatories. For the first time, it introduced provisions governing the use of digital replicas (defined by SAG-AFTRA as a synthetic performance using an actor's image, voice, or likeness generated by AI), including the form of consent required from SAG-AFTRA members. (This can affect influencers as well: those who have signed up for SAG-AFTRA protection, and those SAG-AFTRA members who moonlight as influencers.)
One such provision is that members have a right to "informed consent" for the use of any digital replica of themselves, no matter when or how created. The informed consent must be “reasonably specific” (whatever that means), and it must be clear and conspicuous in each contract.
Last year, California and New York enacted laws, both effective January 1 of this year, covering all types of performances created or materially altered by AI, each with its own definition of “digital replica”.
The California law requires the talent’s written consent and proper representation before a digital replica of that talent can be used. Specifically, any contract that permits the creation or use of a digital replica instead of, and/or in addition to, work the talent would otherwise have performed must include “a reasonably specific description of the intended uses” (whatever that means) of the digital replica, unless:
• the talent was represented by legal counsel, who negotiated a fully signed contract in which the intended digital replica uses and the terms of those uses are specific, clear, and conspicuous, or
• the talent was represented by a labor union whose collective bargaining agreement expressly addresses uses of digital replicas (such as the SAG-AFTRA Agreement).
The New York law is substantially similar to California’s, but there are differences. One is the definition of “digital replica”. (The gist of each definition is that the digital simulation of the talent’s voice or likeness so closely resembles their actual voice or likeness that a layperson would not be able to readily distinguish the simulation from the performer’s authentic voice or likeness.)
Another difference is the contracts to which these new laws apply. The California law applies only to "new performances, fixed on or after January 1, 2025", even if the contract was entered into before 2025. The New York law applies only to contracts entered into or modified on or after January 1, 2025, regardless of when the digital replica is exploited.
These new laws are certainly a step in the right direction to protect talent. But, as with most new laws, what their language means isn’t yet clear.
One example is the lack of a definition of either “reasonably specific” or “a reasonably specific description of the intended uses”.
I've learned many important lessons from decades of drafting and negotiating contracts on how to protect my clients. One seminal lesson is that clarity avoids controversy.
The more detailed and nuanced the terms and conditions of a contract, the better. This is especially important when contracting for still-emerging technologies, like the use of talent’s digital replicas as part of their provision of services.
Reasonable (and unreasonable) minds can and do differ on what constitutes “reasonably”. So, any contract for a talent’s services should specifically, in detail, and in plain conversational English, delineate whether the counterparty does or doesn’t have the right to create digital replicas of the talent, and if so, how, for how long, and for what purposes. The talent should also have absolute prior written approval of the final digital replica(s) and any subsequent changes to them.
There are other laws that can affect the enforceability of talent contracts, or a counterparty’s potential liability for the creation, use, and exploitation of digital replicas. Talent advisors also need to be aware of state privacy laws (such as the California Consumer Privacy Act, or "CCPA") that give residents rights over their personal and biometric information, which can include facial characteristics and voice prints, key components of digital replicas.
The takeaway? There need to be clear, detailed, and enforceable guardrails in talent contracts over whether, how, and when any digitization of the talent’s recognizable characteristics can be used. Otherwise, the negative effects on the talent can be deep.
Just Sayin’ … ®