A horrifying new AI app swaps women into porn videos with a click

From the beginning, deepfakes, or AI-generated synthetic media, have been used primarily to create pornographic representations of women, who often find this psychologically devastating. The original Reddit creator who popularized the technology face-swapped female celebrities' faces into porn videos. To this day, the research firm Sensity AI estimates that between 90% and 95% of all online deepfake videos are nonconsensual porn, and around 90% of those feature women.

As the technology has advanced, numerous easy-to-use, no-code tools have also emerged that allow users to “strip” the clothes off female bodies in images. Many of these services have since been forced offline, but the code still exists in open-source repositories and has continued to resurface in new forms. The latest such site received more than 6.7 million visits in August, according to the researcher Genevieve Oh, who discovered it. It has yet to be taken offline.

There are other single-photo face-swapping apps, such as ZAO or ReFace, that place users into selected scenes from mainstream movies or pop videos. But as the first dedicated pornographic face-swapping app, Y takes this to a new level. It is “tailor-made” for creating pornographic images of people without their consent, says Adam Dodge, founder of EndTAB, a nonprofit that educates people about technology-enabled abuse. This specialization makes it easier for the creators to refine the technology for this particular use case and entices people who would not otherwise have thought about making deepfake porn. “Anytime you specialize like that, it creates a new corner of the internet that will draw in new users,” Dodge says.

Y is incredibly easy to use. Once a user uploads a photo of a face, the site opens a library of porn videos. The vast majority feature women, though a small handful also feature men, mostly in gay porn. The user can then select any video to preview the face-swapped result within seconds, and pay to download the full version.

The results are far from perfect. Many of the face swaps are obviously fake, the faces shimmering and distorting as they turn at different angles. But to a casual observer, some are subtle enough to pass, and the trajectory of deepfakes has already shown how quickly they can become indistinguishable from reality. Some experts argue that the quality of the deepfake doesn’t really matter anyway, because the psychological toll on victims can be the same either way. And many members of the public remain unaware that such technology exists, so even low-quality face swaps can fool people.

To this day, I’ve never been fully successful in getting any of the images taken down. Forever, it will be out there. No matter what I do.

Noelle Martin, an Australian activist

Y bills itself as a safe and responsible tool for exploring sexual fantasies. The language on the site encourages users to upload their own face. But nothing prevents them from uploading other people’s faces, and comments on online forums suggest that users are already doing just that.

The consequences for women and girls targeted by such activity can be crushing. At a psychological level, these videos can feel as violating as revenge porn: real intimate videos filmed or released without consent. “This kind of abuse, where people misrepresent your identity, name, and reputation, and alter it in such violating ways, shatters you to the core,” says Noelle Martin, an Australian activist who has been targeted by a deepfake porn campaign.

And the impact can stay with victims for life. The images and videos are difficult to remove from the internet, and new material can be created at any time. “It affects your interpersonal relationships; it affects you in getting a job. Every single job interview you ever go for, this might be brought up. Potential romantic relationships,” Martin says. “To this day, I’ve never been fully successful in getting any of the images taken down. Forever, it will be out there. No matter what I do.”

Sometimes it’s even more complicated than revenge porn. Because the content is not real, women can doubt whether they deserve to feel traumatized and whether they should report it, Dodge says. “If somebody is wrestling with whether they’re even really a victim, it impairs their ability to recover,” he says.

Nonconsensual deepfake porn can also have economic and career impacts. Rana Ayyub, an Indian journalist who became the victim of a deepfake porn campaign, was subsequently subjected to such intense online harassment that she had to minimize her online presence, and with it the public profile her work requires. Helen Mort, a UK-based poet and broadcaster who previously shared her story with MIT Technology Review, said she felt pressure to do the same after discovering that her photos had been stolen from private social media accounts to create fake nudes.

The Revenge Porn Helpline, funded by the UK government, recently received a case from a teacher who lost her job after deepfake pornographic images of her were circulated on social media and brought to her school’s attention. “It’s getting worse, not better,” Dodge says. “More women are being targeted in this way.”

Ajder says that Y’s option to create deepfake gay porn, though limited, poses an additional threat to men in countries where homosexuality is criminalized. This is the case in 71 jurisdictions globally, 11 of which punish the offense with death.

Ajder, who has discovered numerous deepfake porn apps in the last few years, says he has tried to contact Y’s hosting service and force it offline. But he is pessimistic about preventing similar tools from being created; already, another site has popped up that seems to be attempting the same thing. He thinks that banning such content from social media platforms, and perhaps even making its creation or consumption illegal, would be a more sustainable solution. “That means these websites are treated the same way as dark web material,” he says. “Even if it gets driven underground, at least that puts it out of the eyes of everyday people.”

Y did not respond to multiple requests for comment at the press email listed on its site. The registration information associated with the domain is also blocked by a privacy service. On August 17, after MIT Technology Review made a third attempt to reach the creator, the site put up a notice on its homepage saying it is no longer available to new users. As of September 12, the notice was still there.
