Despite Fake News Fears, 96 Percent of Deepfakes Online Are Porn

Since the emergence of so-called “deepfakes” technology in late 2017, as AVN.com reported at the time, fears that the fake videos could be used to create “fake news,” manipulating public opinion on political issues, have reached deep into even staid institutions such as the Brookings Institution.

The venerable Washington think tank in June posted a paper titled “Deepfakes, Social Media, and the 2020 Election,” arguing that “under the right set of circumstances, deepfakes will be very influential. They don’t even have to be particularly good to potentially swing the outcome of an election.”

The state of California is taking the deepfake issue so seriously that last week Governor Gavin Newsom signed into law a bill that makes it illegal to post a deepfake video of a political candidate within 60 days of an election.

But while the anxiety about deepfakes being used as covert propaganda to swing the outcome of an election is certainly worth taking seriously, a new study by the cybersecurity firm DeepTrace Labs suggests that, at least right now, those fears may be somewhat overblown. The study found nearly 15,000 deepfake videos online, but 96 percent of them are simply porn, according to a report on the study by MIT Technology Review.

The simplest explanation of deepfakes is that they are videos that make real people appear to do things they did not actually do. As AVN.com covered, the first known deepfakes appeared on the internet message board Reddit back in late 2017. And of course, those videos were fake celebrity porn.

The laboriously created short clips took the faces of such well-known Hollywood actresses as Gal Gadot, Scarlett Johansson and others, and realistically superimposed them onto the bodies of porn performers in existing hardcore videos. The innovation was that, beyond simply swapping faces, the deepfake videos actually made those faces move, with believable facial expressions, even in porn scenarios.

But two years later, despite remarkable advances in deepfake video creation allowing the deceptive videos to be created in seconds rather than hours, as AVN.com reported, most deepfake videos are still just fake porn.

California is taking deepfake porn seriously as well. Newsom last week signed a second deepfake bill into law, allowing California residents to sue anyone who uses their image in a fake porn video. The law allows awards of up to $150,000 for creating deepfake porn videos without the consent of the person whose face appears in the bogus clip.

Photo By Abyssus / Wikimedia Commons