Apps and websites that use artificial intelligence to digitally remove clothing from photographs of women without their consent are surging in popularity, according to reporting by Time.
According to research by the social network analysis firm Graphika, 24 million people used "nudify" or "undressing" services in September alone. These services apply artificial intelligence models to real photos of clothed women to generate fake nude images.
The surge coincides with the public release of several new AI diffusion models capable of producing strikingly realistic fake nude images. Earlier AI-altered images tended to be blurry or unconvincing; the new models let even inexperienced users generate realistic-looking nudes with no technical knowledge.
These services are openly marketed on social platforms such as Reddit and Telegram, and some advertisements even instruct buyers to send the fake nudes back to the victims themselves. Although such content violates the policies of most major platforms, one nudify app pays for sponsored content on YouTube and ranks first in Google and other search engines. Beyond that, the services operate largely unregulated.
Deepfakes and Non-consensual Pornography Pose the Biggest Risk
According to experts, the growth of do-it-yourself deepfake nudes marks a dangerous new phase in non-consensual pornography. Eva Galperin of the Electronic Frontier Foundation says she has seen an upsurge in incidents involving high school students, increasingly carried out by ordinary people against ordinary targets.
However, many victims never learn that such images exist; those who do may struggle to get law enforcement to investigate or may lack the financial means to pursue legal action.
At the moment, federal law does not directly ban deepfake pornography; it outlaws only generated material depicting child sexual abuse. In November, a North Carolina child psychiatrist was sentenced to 40 years in prison, the first conviction of its kind, for using AI undressing programs on photos of underage patients.
In response to the trend, TikTok has begun restricting search terms related to nudify apps, and Meta has started blocking related keywords across its platforms. Even so, experts say far more awareness and action are needed to combat non-consensual AI-generated pornography.
The apps violate women's privacy, using their bodies as raw material without consent in order to turn a profit. Graphika analyst Santiago Lakatos noted that these tools can now produce images that look genuinely realistic, and that realism is precisely the danger victims face.
Why Female Students Are the Primary Targets
Last month, female students at a New Jersey high school were targeted with AI-generated sexual images that spread across campus, prompting one mother and her 14-year-old daughter to advocate for tougher restrictions on non-consensual intimate imagery (NCII).
Earlier this year, at a high school in Seattle, Washington, a teenager allegedly used AI deepfake tools to create images of female classmates.
In September, more than 20 girls were targeted with deepfake photos created with the AI program 'Clothoff,' which lets users 'undress ladies for free.'
The report, produced by the social network analysis firm Graphika, identified key strategies, tactics, and procedures used by synthetic NCII providers, shedding light on how AI-generated nudity websites and apps operate and monetize their activities.
According to the researchers, ‘we assess that the increasing prominence and accessibility of these services will very likely lead to further instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse material.’
Graphika found that the programs follow a 'freemium model,' in which a limited set of features is offered for free while more advanced capabilities sit behind a paywall.
To unlock the additional features, users may be required to purchase 'credits' or 'tokens,' with prices ranging from $1.99 per credit to $299.
The study also found that many advertisements for NCII apps and websites are explicit, openly stating that they offer 'undressing' services or posting images of people they have 'undressed' as proof.
Other advertisements are less direct, posing as an 'AI art service' or a 'web3 image gallery' while including NCII-related keywords in their profiles and posts.
AI Undressing Deepfakes Are Generating Millions
Beyond the increased traffic, the services, some of which charge $9.99 per month, claim to have large user bases. "They are doing a lot of business," Lakatos said. "If you take them at their word, their website advertises that it has more than a thousand users per day."
Non-consensual pornography of celebrities has long been a scourge of the internet, but privacy experts are increasingly concerned that advances in AI have made deepfake software easier to use and more convincing.
“We are seeing more and more of this being done by ordinary people with ordinary targets,” said Eva Galperin, the Electronic Frontier Foundation’s director of cybersecurity. “You see it among high school children and people who are in college.”
According to Galperin, many victims never hear about the images, and even those who do may struggle to get law enforcement to investigate or to find a way to take legal action.
There is currently no federal law banning the creation of deepfake pornography, although the US government does outlaw the production of such images of minors. In November, a child psychiatrist in North Carolina was sentenced to 40 years in prison for using undressing apps on photographs of his patients, the first case of its kind under a provision prohibiting the deepfake generation of child sexual abuse material.
TikTok has blocked the word "undress," a popular search term connected with the services, warning users that it "may be associated with behavior or content that violates our guidelines." A TikTok spokesperson declined to elaborate. Meta Platforms Inc. began blocking keywords associated with searches for undressing apps after being asked about them; a company representative declined to comment.