MIT Technology Review

Forget fake news—nearly all deepfakes are being made for porn

The internet is home to at least 14,678 deepfakes, according to a new report by DeepTrace, a company that builds tools to spot synthetic media. But most of them weren’t created to mess with elections. 

Back to the beginning: Deepfakes arrived on the scene in late 2017. The word was originally used to describe AI-generated fake porn, with the heads of actresses on the bodies of adult film stars. Since then, the meaning of “deepfakes” has expanded to refer to any kind of manipulated video, like one with Mark Zuckerberg giving a fake speech about Facebook. This has stoked fears about the end of truth and the potential of deepfakes to swing elections. 


The internet is for porn: The report found that most deepfakes aren’t attempts to influence politics. A full 96% were plain old fake pornography. “Deepfake pornography is a phenomenon that exclusively targets and harms women,” the authors write. Every pornographic deepfake featured a woman, mostly famous actresses and musicians. (The remaining nonpornographic deepfakes on YouTube mostly featured men.) The total number isn’t huge, but its rapid growth is worrying.


Fighting back with law: The issue has caught the attention of legislators. In California, Governor Gavin Newsom just signed into law two bills that limit what people can do with deepfakes. One makes it illegal to create and distribute a malicious deepfake of a politician within two months of an election. (The ACLU and the Electronic Frontier Foundation have already pushed back, saying the law is too broad and will chill political speech.) The second law gets closer to how the manipulations are actually being used: it lets people sue if their image is used in deepfake pornography without their consent.

