AI-generated pornography featuring the faces of nonconsenting women is becoming more prevalent online, and the problem is spilling over into the world of popular influencers and streamers.

In January, British livestreamer Sweet Anita, who has 1.9 million followers on Twitch, where she posts videos of herself gaming and interacting with followers, was notified that a large number of sexually explicit deepfake videos featuring the faces of Twitch streamers were circulating online.

Her first thought was, “Wait, am I in this?”

She quickly googled her name along with the term “deepfake,” a word used to describe a highly realistic but fake digitally manipulated video or image, and a technique that is increasingly being used, often without consent, to make pornography. Anita’s initial search turned up several videos showing her face edited onto another person’s body.

“Obviously this has been going on for quite some time without my knowledge. I had no idea; it could have been years, for all I know,” said Anita, 32, who did not want to share her full name with NBC News out of concern for her offline safety and privacy.

Hany Farid, a professor of computer science at the University of California, Berkeley, said deepfakes are a phenomenon that is “absolutely getting worse” as it becomes easier to produce sophisticated, lifelike videos through automated apps and websites.

The number of deepfake porn videos available online has risen sharply, nearly doubling every year since 2018, according to research by livestreaming analyst Genevieve Oh. In 2018, only 1,897 videos were uploaded to a well-known deepfake streaming site, but by 2022 that number had risen to over 13,000, with over 16 million monthly views.

Previously, celebrities were mainly the target of deepfakes.

“Now all of a sudden the vulnerable people are people who have very small online footprints,” Farid said. “The technology is getting so good that it can generate images from relatively small amounts of training data, not the hours and hours of video that we used to need.”

Anyone interested in creating deepfakes can quickly access a large number of free and paid face-swapping apps available in the Google Play and Apple app stores, making it easy for anyone to upload a photo and edit it into a picture or video in seconds.

Some major platforms, like Reddit, Facebook, TikTok and Twitter, have tried to address the spread of deepfake pornography with policy changes. While each of the platforms specifically prohibits the material, some have had trouble moderating it. A Twitter search, for example, turned up deepfake pornographic videos claiming to feature Twitch stars, along with hashtags promoting deepfakes.

In January, the proliferation of deepfake pornography made waves online when a popular Twitch streamer with more than 300,000 followers admitted to paying for explicit material featuring AI-generated versions of his peers.

On January 30, in a tearful apology video shared on Twitter that garnered millions of views, Brandon Ewing, who goes by the screen name “Atrioc” on Twitch, said he clicked on a deepfake porn ad while browsing a popular porn website. He said he then subscribed to and paid for content on a different website that featured other female broadcasters after becoming “morbidly curious.”

In a longer statement posted to Twitter on February 1, Ewing addressed live Twitch streamers Maya Higa and Pokimane, whose images briefly appeared in a tab open to a website hosting deepfake pornography during one of his live streams.

“Their names were dragged and they were sexualized against their will,” he said. “I am sorry that my actions have led to further exploitation of you and your body, and I am sorry that your experience is not uncommon.”

Ewing did not respond to a request for comment.

Pokimane also did not respond to a request for comment, but in a January 31 tweet she wrote, “stop sexualizing people without their consent. That’s it, that’s the tweet.”

Higa said she had no further comment beyond her January 31 Twitter statement, in which she wrote, in part, “the situation makes me feel disgusting, vulnerable, nauseated and violated, and all these feelings are all too familiar.”

The incident highlighted the growing prevalence of AI-generated non-consensual pornography and the ethical issues it creates.

There has been an “increase” in websites that are “willing, eager and monetizing the hosting of this material,” Farid said.

QTCinderella, another Twitch streamer who discovered she had appeared on the deepfake website, said she found it particularly painful because Ewing is a close friend.

“I think that was the most unfortunate thing: I didn’t find out from Atrioc. I found out about it from the internet,” said QTCinderella, 28, who also did not share her full name with NBC News to protect her privacy and security offline.

She said she quickly traced the video content to an account on a subscription-based website and issued a takedown notice, but the videos continue to spread like “wildfire.”

In the United States, while most states have laws banning revenge porn, only New York, Virginia, Georgia and California have laws that specifically address deepfake media, according to the Cyber Civil Rights Initiative. Meanwhile, the U.K. announced in November last year that it planned to criminalize nonconsensual explicit deepfake media.

QTCinderella said the current legal framework is “daunting.”

“Every attorney I’ve talked to has essentially come to the conclusion that we don’t have a case; there is no way to sue the guy.”

While a lot of deepfake porn can seem amateurish and low-quality, Farid said he is also now seeing accounts that offer to create sophisticated custom deepfakes of any woman for a small fee.

After seeing the deepfake videos of herself being sold online, Anita said she felt groggy, tired and disengaged.

“They are selling me against my will,” she said. “I did not consent to being sexualized.”

QTCinderella said she experienced “body dysmorphia.”

“When you see a porn star’s body grafted so perfectly where yours should be, it’s the most obvious comparison game you could ever have in your life,” she said. “I cried and said, ‘My body will never be like this.’”

Sophie Compton, who campaigns against intimate image abuse with the organization My Image My Choice, said the women targeted are “embarrassed or silenced” and feel their experience is minimized because there are few legal options available to those affected by deepfakes.

“We need to find a way to make these sites and their business model impossible,” Compton said.

Specific platforms that host nonconsensual sexual images should be held to account, rather than individual accounts and creators, Farid said. “If you really want to tackle this problem, go upstream,” he said. “That’s where all the power is.”

Anita said she wants there to be “very visible consequences.”

What worries her most about the future is that it is impossible to know who has bought the deepfake videos.

“When I go to a meet-and-greet, I might end up hugging and signing something for someone who has seen me deepfaked … and I’d have no way of knowing they’re consuming that,” she said. “To have my body bought against my will is really horrible.”