The number of deepfake videos found online has nearly doubled since 2018, and most of them are pornographic videos featuring women without their consent. The term “deepfake” refers to video altered using machine-learning technology to present a situation that never occurred, such as a widely shared video of filmmaker Jordan Peele impersonating former US president Barack Obama.
A recent report from Deeptrace Labs, an Amsterdam-based cybersecurity company, surfaced a total of 14,698 deepfake videos on YouTube and a collection of pornographic websites. That’s nearly double the 7,964 videos the startup’s researchers found in December 2018 using roughly the same sources.
Women were the subject of virtually all of the deepfake pornography videos found by the researchers. Particularly worrisome is that 96% of the videos found were identified as “non-consensual” pornography. American and British actresses, as well as South Korean K-pop musicians, were common subjects of the videos.
Researchers were particularly surprised by the prevalence of K-pop singers, who were the subject of a quarter of the pornographic videos they found. “We had seen they were frequently targeted but didn’t expect the massive volume of videos and view count that these videos attracted,” Henry Ajder, head of communications and research analysis at Deeptrace, told Quartz over email.
Men appeared frequently only in the rare non-pornographic deepfakes, which made up about 4% of the videos found in the study. Within that category, men were the subject of the majority of videos, accounting for 61% of them.
Deepfake videos are growing in number because the tools needed to create them are easy to find, researchers say. The report notes that DeepNude, a computer app that digitally stripped images of fully clothed women and was taken offline by its creator shortly after its launch in June, can still be found on torrenting websites and open-source repositories like GitHub. GitHub said in July that it would ban copies of the app, but versions still appear to be online. The company noted in a statement to Motherboard that while it does not “proactively monitor user-generated content,” it does act on abuse reports. GitHub wasn’t immediately available to comment on the new report.
A supposedly “improved” version of the DeepNude website by anonymous creators offers unlimited access for $20 per month, the researchers also found. “The software will likely continue to spread and mutate like a virus, making a popular tool for creating non-consensual deepfake pornography of women easily accessible and difficult to counter,” noted the report’s authors.
DeepNude isn’t the only service of its kind on the web: the researchers uncovered a host of other portals that sell custom creations or connect buyers with individual deepfake creators. One service required only 250 images of the subject and two days of processing time. Researchers found custom videos priced as low as $2.99 each.
Non-celebrity women also face the risk of being the subject of a deepfake porn video. On a separate thread for paid requests observed by Quartz, users of a deepfake forum can find individuals capable of crafting videos starring the target of their choosing, with payment taken in Bitcoin. “Hello, I’d like a high-quality video of a woman friend of mine. Can pay however!” wrote one user. Another requested a deepfake video of his “high school sweetheart.” One user posted several mirror selfies of a young woman and asked for assistance in digitally removing her clothing. “I can try if you have better photos,” another user responded.
While some predicted that deepfake videos would usher in a new, more frightening era of fake news, such worries have yet to materialize. A doctored video of US House speaker Nancy Pelosi made the rounds back in May, but it’s unclear how many people it actually fooled, and it was eventually flagged by Facebook and other social media platforms. Tech companies are also investing in new technology to detect deepfakes: Facebook, Microsoft, the Partnership on AI, and representatives from academia teamed up last month to launch a contest to find new tools for unearthing deepfakes.
California last week passed a law that makes it possible for individuals to sue if their image is used in sexually explicit content. The move may prompt other states to follow suit. But for many women, it is already too little, too late.