Fake AI Celeb Porn Starts Disappearing From Reddit, Gfycat

CYBERSPACE—Late last year, tech-savvy porn fans began using artificial intelligence algorithms to create a new twist on an old genre: fake celebrity porn. Using the AI technology, users were able to create porn clips featuring well-known celebrities by digitally pasting celebrity faces over the faces of porn performers in actual hardcore clips. The AI made the transplanted faces adopt various expressions, giving a realistic, lifelike quality to the phony videos.

Known as “deepfakes,” the phenomenon has spread quickly, as anonymous creators began producing fake porn “starring” ordinary people they know in real life—without permission from the individuals who suddenly and unexpectedly found their faces on bodies engaged in explicit sexual activity.

An anonymous software developer even created a free app, aptly christened “FakeApp,” that allowed anyone to create phony AI porn, even with no technical knowledge of software coding or artificial intelligence.

Now, even as it becomes clear that the online AI porn-forgers are one step ahead of laws that provide no protection for people who appear against their will in deepfakes videos, at least one online forum popular among deepfakes creators is cracking down on the ethically questionable practice.

On the heavily trafficked online forum Reddit, which includes a message board, or “subreddit,” devoted exclusively to posting deepfakes videos, numerous phony AI porn videos have simply disappeared in recent weeks, according to a report by the NextWeb tech news site.

NextWeb also reported that on Gfycat, another site popular with AI porn creators, phony porn videos older than one day have apparently been recently scrubbed from the site, with Gfycat placing a “Page Not Found” graphic where the videos previously appeared.

The site Motherboard also took note of the removals, and cited a spokesperson for Gfycat who confirmed that the site’s management considers the fake videos “objectionable” and is “actively removing” them.

According to a report by Wired magazine online, however, removal by site managers is the only recourse against deepfake porn for those who find their faces used in the videos without their consent.

According to First Amendment lawyer Mary Anne Franks, who has helped craft much of the current legislation against non-consensual, sexually explicit material—widely known as “revenge porn”—existing laws are not set up to guard against “face-swapping” images and videos.

Why? Because current anti-“revenge porn” laws are based on the idea that victims have had their privacy violated. But that standard does not apply to deepfakes video, no matter how realistic the content appears.

“Face-swap porn may be deeply, personally humiliating for the people whose likeness is used, but it's technically not a privacy issue,” writes Wired correspondent Emma Grey Ellis. “That's because, unlike a nude photo filched from the cloud, this kind of material is bogus. You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.”

In fact, experts say, because deepfakes videos use only faces, not the actual bodies of the people whose images appear, the videos could be classified as “art” and protected by the First Amendment—meaning that only private companies could delete or regulate the AI fake porn, while the government remains powerless.