Legal Challenges Regarding “Deepfakes”
A documentary about the life of Anthony Bourdain made headlines this summer when it was revealed that several lines in the movie, which sounded like they were spoken by the late chef, were actually generated by artificial intelligence. Although the ethics of using such “deepfake” technology to mimic Bourdain’s voice have received great scrutiny, deepfakes have become a common feature in entertainment. The emergence of this technology has challenged long-held assumptions about what is real and what is not, but it also presents a unique legal challenge: how to balance the First Amendment’s protections of expressive speech against an individual’s right of publicity.
The right of publicity evolved from the right of privacy. An individual has both a personal right to be left alone and a right to benefit from the commercial exploitation of his or her name and likeness. Case law recognizes, however, that the scope of the right of publicity must be balanced against societal interests in free expression. To strike that balance, courts have applied the “transformative use” test, which assesses whether the “work in question adds significant creative elements so as to be transformed into something more than a mere celebrity likeness or imitation.” In re NCAA Student-Athlete Name & Likeness Licensing Litig., 724 F.3d 1268, 1274 (9th Cir. 2013).
Under this test, courts consider five factors: (i) whether the celebrity likeness is a “raw material” from which the work is synthesized, or the sum and substance of the work itself; (ii) whether the work is primarily the artist’s own expression; (iii) whether the “literal and imitative or the creative elements predominate” in the work; (iv) whether the “marketability and economic value” of the work derive primarily from the fame of the celebrity depicted; and (v) whether the artist’s skill is subordinated to the overall goal of achieving a portrait of the celebrity. Id. (citing Comedy III Productions, Inc. v. Gary Saderup, Inc., 25 Cal. 4th 387 (2001)).
Unlike many traditional forms of artistic expression, however, deepfakes are valued precisely because they so closely mimic the person being depicted. Thus, the outcome of the “transformative use” test will likely depend on how broadly the courts construe the “work” of the artist. If, for example, these factors are applied only to those sound bites that used artificial intelligence to recreate Bourdain’s voice, a court may well conclude that the work is merely imitative and lacks the creative elements to be transformed into something more. However, if these factors are applied to the documentary as a whole, the opposite conclusion may be reached.
Courts may also consider whether the “transformative use” test is the right standard to apply where the goal of the artist is to imitate, rather than transform, a person’s likeness. Courts may instead look to whether the work is a “parody” and thus falls within the “fair use” defense to a copyright action. As parody requires “the use of some elements of a prior author's composition to create a new one that, at least in part, comments on that author's works,” Campbell v. Acuff-Rose Music, Inc., 510 U.S. 569, 580 (1994), this standard may better allow for the close replication of a person’s voice or appearance. Alternatively, courts may look to the “likelihood of confusion” test applied to trademark infringement cases, and consider whether a parodic work is “merely amusing, not confusing.” Dr. Seuss Enterprises, L.P. v. Penguin Books USA, Inc., 109 F.3d 1394, 1403 (9th Cir. 1997).
The similarity of the deepfake to the original may create liability for the creator in other ways. Where the deepfake contains defamatory content, portrays an individual in a false light, or causes emotional distress, for example, courts may turn to those areas of tort law for guidance. This is particularly true if the deepfake is used to attribute a statement to an individual who neither made nor endorsed it.
Another issue raised by such deepfakes is how a deceased celebrity’s right of publicity may be enforced. On this question, courts may be guided by prior decisions finding that the right of publicity is a property right that, upon the death of an individual, descends “like any other intangible property right.” Presley’s Est. v. Russen, 513 F. Supp. 1339 (D.N.J. 1981). A party seeking to enforce the deceased’s right of publicity must therefore demonstrate that it inherited that property right. Where different descendants inherit different pieces of the estate, this may not be a straightforward issue, and estate planners will be well served by specifying who inherits the right of publicity.
Additionally, this question of inheritance will create uncertainty for artists seeking permission to use a deceased person’s name and likeness, as it may be unclear who is authorized to provide such consent. Clear documentation will thus alleviate disputes over this right. For example, the documentary’s filmmaker claims to have received consent from Bourdain’s estate, but Bourdain’s wife has denied providing such consent.
While many questions remain unanswered, one thing is clear: The use of deepfake technology is here to stay. Courts will soon face the task of balancing First Amendment and private property interests in, as well as assessing liability arising from, the use and misuse of this cutting-edge technology.