Miles Dilworth, Senior Reporter, Dailymail.Com
14:33 01 Oct 2023, updated 16:07 01 Oct 2023
- Shocking report lifts the lid on the depraved world of deepfake pornography forums
- Websites are littered with tips on how to create explicit material in seconds
- Experts warn that “no one is safe” as deepfakes of private individuals surge by 400%
Activity on fake celebrity porn forums has nearly doubled in a year as sophisticated artificial intelligence (AI) has become widely available to the public, DailyMail.com reports.
Shockingly, the surge in activity has been found on websites easily accessible through Google and other search engines, meaning no knowledge of the dark web is required to find this material.
A number of famous female stars, including Taylor Swift, Natalie Portman and Emma Watson, have already had their images manipulated using technology to make them appear in erotic or pornographic content.
Now a report from online security company ActiveFence, shared exclusively with DailyMail.com, lifts the lid on the murky world of deepfake porn forums, where perverts share online tools that can turn almost any celebrity photo into pornography.
The sites are littered with boasts from users that their technology “can help you undress anyone,” as well as guides to creating sexually explicit material, including advice on what images to use.
The boom has created an entire commercial industry built on deepfake pornography, including websites with hundreds of thousands of paying users.
One of the most popular sites, MrDeepFakes, receives about 17 million visitors a month, according to web analytics company SimilarWeb.
ActiveFence said the number of open web forums discussing or sharing celebrity deepfake porn increased by 87 percent between February and August this year compared to the same period last year.
But researchers said “no one is safe” from having their images abused, with the increase for forums targeting private individuals a staggering 400 percent.
This has raised fears that thousands of people could become victims of AI-generated “revenge porn”.
“The AI boom”
Deepfake porn is usually created by superimposing the face of one person, often a celebrity, onto the body of another person performing a sex act.
Previously, creating such content required a user to have technical knowledge, images taken from different angles, and Photoshop skills.
Now all someone needs is a non-nude image of their victim, often stolen from social media or dating profiles, to feed to a chatbot.
This is partly due to the advent of generative AI, a form of AI that can create new content such as text, audio and images.
The problem is also compounded by the tech giants’ release of the code used to create these chatbots.
Chatbots created by companies such as OpenAI, Microsoft and Google have strict security measures in place to prevent them from being used to create malicious content.
But in February, Meta, the tech giant that owns Facebook, Instagram and WhatsApp, decided to make its code public, allowing amateur developers to strip out those filters.
Smaller tech companies have followed suit.
It’s no coincidence that ActiveFence has detected a surge in deepfake porn since February of this year, a moment the company calls the “AI boom.”
“Nobody is safe”
Traditionally, female celebrities have been the main targets of deepfake pornography.
Back in 2018, actress Natalie Portman's likeness was computer-generated from hundreds of frames and inserted into an explicit video, as was that of Harry Potter star Emma Watson.
Deepfake videos featuring singer Taylor Swift have been viewed hundreds of thousands of times.
But ActiveFence researcher Amir Oneli said the speed and volume at which AI can now create deepfakes means the general public is increasingly falling victim.
He said that in the past, when creating images using AI was slow and difficult, users focused on celebrity content that went viral.
“What we see today is that it affects individuals because it happens so immediately and so quickly,” he added.
“The most tragic thing is that no one is safe.”
The ActiveFence report also describes a “vibrant scene with guides” available on the open web that advise users what types of images to use and how to modify them.
The guides recommend using images of the victim in a simple pose, with the body clearly visible, no baggy clothing, and “a good contrast between the color of the skin and clothing.”
Deepfake chatbots simply ask users to “select the photo you want to strip,” while others claim that “their advanced image editing technology can easily remove clothing from any image, leaving behind only the essentials.”
Cashing in on depravity
Bots and websites that allow users to quickly create nude images mostly operate on a subscription model, letting users pay either monthly or per image.
Some of them charge as little as $6 to download fake porn pictures, with prices going as high as $560 for 2,000 images, ActiveFence found.
Demand is such that some websites offer “fast pass” tickets, allowing users to pay $400 to “skip the line” to access the images, with the ticket expiring after nine hours.
MrDeepFakes is one of the most popular deepfake porn sites and ranks first in Google search results for related terms.
According to a recent NBC investigation, the site features short teaser videos that entice users to purchase longer versions on another site: Fan-Topia.
Fan-Topia bills itself as the highest paying platform for adult content creation.
Noelle Martin, a legal expert on technology-assisted sexual abuse, told NBC that MrDeepFakes is “not a porn site” but “a predatory website that does not rely on the consent of the people on the actual website.”
“The fact that it’s even allowed to operate and is known about is a complete indictment of every regulator in the space, every law enforcement agency, the entire system,” she told NBC.
Sharing sexually explicit images without consent is illegal in most states, but only four states explicitly cover deepfake material: California, Georgia, New York and Virginia.
The Preventing Deepfakes of Intimate Images Act was introduced in Congress in May to make it illegal to distribute AI-generated pornography in the US, but it has yet to pass.