What Are the Copyright Implications of Deepfakes for Porn?

The technology behind "deepfakes"—using artificial intelligence to replace a person in an existing image or video with someone else's likeness—has been around since 2017, but its use is increasing, and it is expected to play a major role in the 2020 elections. Witness the deepfake above of the Felon-In-Chief himself, Donald J. Trump, whose minions pasted his head onto the body of Sylvester Stallone as boxer Rocky Balboa for Trump to tweet to his followers. In reality, Trump is just another balding, flabby septuagenarian, not someone whose poster kids would likely hang in their bedrooms.

Perhaps the main problem with deepfakes, since they can be nearly impossible to discern with the naked eye, is that there may be far more of them circulating than people realize. And according to some investigators, the vast majority of deepfakes—96 percent, by one estimate—are pornographic.

"As tech firms scramble to tackle the spread of deepfake videos online, new research has claimed 96 percent of such videos contain pornographic material targeting female celebrities," reported the Indo-Asian News Service. "The researchers from Deeptrace, a Netherland-based cybersecurity company, also found that top four websites dedicated to deepfake pornography received more than 134 million views."

Now, many of those deepfaked images and videos were created mainly to garner clicks for websites—even a few adult-oriented ones—hoping to catch the attention of porn fans, rather than to influence political campaigns, though anyone with half a brain can see how the technology will be used for just that purpose over the coming year. After all, the video created this summer that took footage of House Speaker Nancy Pelosi, slowed it down and manipulated her vocals on the soundtrack to make it seem as if she were drunk is a crude example of what Americans will undoubtedly be seeing as the elections draw closer. And putting a female politician's head or even just her face on, for instance, the nude body of an adult starlet, or even a male politician's head or face on a gay star's body, could easily cost that politician an election, since no matter how quickly the deception is exposed, at least some portion of the public will see only the fake and never get the news of its debunking.

It's for that reason that the California legislature recently passed Assembly Bill 730, and Gov. Gavin Newsom signed it into law in early October. The bill bars individuals or groups from distributing deepfakes within 60 days of an election unless the material carries a disclosure stating that it has been manipulated.

"Voters have a right to know when video, audio and images that they are being shown, to try to influence their vote in an upcoming election, have been manipulated and do not represent reality," said Assemblymember Marc Berman in a statement. "In the context of elections, the ability to attribute speech or conduct to a candidate that is false, that never happened, makes deepfake technology a powerful and dangerous new tool in the arsenal of those who want to wage misinformation campaigns to confuse voters."

One clause in that bill, however, may have implications for the adult industry. According to an article on Courthouse News, "The new law allows for candidates who are the targets of such deepfakes to sue producers or distributors of the material."

One topic regarding deepfakes that hasn't gotten much discussion is copyright. After all, adult studios and performers who create their own content make their money, quite frankly, from the women's bodies. Porn viewers (if they're honest) pay good money to see their favorite performers nude and having sex, and are even likely to give actresses they've never seen before a chance to become new favorites. It's for that reason that the studios, performers and websites that create their own content usually copyright that content: so no one else can steal it and make money from images, both still and video, that they don't own.

So what remedy do adult content owners have against those who would use adult video content and/or images of adult performers in deepfakes?

AVN asked prominent attorney Allan Gelbard, who's been involved in several high-profile content piracy cases, about whether performers could sue those who use video of the performer engaging in sex acts in deepfake creations.

"Only the holder of the copyright can sue for infringement," Gelbard cautioned. "Most porn stars are not the 'author' of the work (which is the cameraman unless it’s a work for hire); they just appear in someone else’s copyrighted work. The copyright holder could sue—and the usual fair use defenses would probably be easy to overcome."

Indeed, most cameramen (or camerawomen) don't own the rights to the videos they shoot: either they contracted with an existing studio or website to shoot the action in the first place—a "work for hire"—or they have an agreement in place to relinquish all rights to the video once it's completed and sold to a distributor.

But Gelbard suggested that performers may well have some legal rights to their own image under the laws of California and a few other states.

"A porn star might be able to sue on a statutory right of publicity claim (commercial appropriation of likeness) if (and only if) people would know it's her from the body, or from her setting in the overall work," Gelbard said.

Several years ago, VCA Pictures held an online contest inviting viewers of its website to try to identify adult actresses solely from close-up images of their tits or pussies—and some respondents were surprisingly good at the game. But in the modern world, it's far easier to identify some performers by their tattoos, piercings, birthmarks or other singular markers on their bodies, and such evidence could easily be used to press a "right of publicity" lawsuit.

So adult industry members may want to start paying closer attention to deepfake content that makes its way around the web or even onto televised news, since such content, created and/or distributed by well-funded political groups, could mean high-value settlements for the use of images and footage those groups don't own.