A recent NBC report revealed a celebrity face-swapping app, Facemega, that can quickly create deepfake porn depicting famous and other public-facing women. Deepfake porn refers to fake but highly realistic, often AI-generated porn and sexual imagery of people made without their consent. Unsurprisingly, it typically targets women and especially female celebrities; victims have thus far included popular female Twitch streamers and, as early as 2018, Gal Gadot.
Emma Watson is among the most recent victims: the app used deepfake videos of her in ads shown across Meta platforms.
i got this ad yesterday and wow what the hell pic.twitter.com/smGiR3MfMb
— lauren (@laurenbarton03) March 6, 2023
Per NBC’s report, Facemega ran hundreds of different ads on Facebook, Instagram, and other Meta apps, advertising how easily users could face-swap themselves with the likes of Emma Watson or Scarlett Johansson, accompanied by highly realistic, sexually suggestive video examples. (This, of course, despite Meta having banned most deepfake content back in 2020.)
The app also placed ads in the Apple App Store, where it first became available sometime last year. As of Tuesday evening, NBC’s Kat Tenbarge reported that the Apple App Store had taken down the face-swapping app, and Meta had stopped running ads for it, shortly after NBC’s original report. But the app remains on Google Play. Lauren Barton, who uploaded a screen recording of one of the videos to Twitter, told NBC:
“This could be used with high schoolers in public schools who are bullied. It could ruin somebody’s life. They could get in trouble at their job. And this is extremely easy to do and free. All I had to do was upload a picture of my face and I had access to 50 free templates.”
“Replace face with anyone,” the captions on some of the Meta ads say. “Enjoy yourself with AI swap face technology.” Facemega also lets users face-swap various celebrities onto preset videos; among other categories, it includes a “Hot” category that “features videos of scantily clad women and men dancing and posing.”
Even as Meta and Apple seem to have taken action to rein in Facemega, with or without the app, deepfake porn is a rapidly growing crisis. Per one streaming researcher who spoke to NBC, more deepfake porn videos were uploaded last month than in any previous month.
Face-swapping apps and other AI technologies can be wielded to create deepfake porn targeting not just celebrities, but friends, co-workers, and even casual acquaintances. Experts say that increasingly popular AI-generated imagery apps like Lensa AI have been a boon for child sexual abuse content. One researcher wrote in Wired last year that upon uploading her childhood photos to Lensa, “what resulted were fully nude photos of an adolescent and sometimes childlike face but a distinctly adult body.”
And, especially concerning, our laws are woefully unequipped to protect people. While most states have varying anti-cyber-exploitation laws to rein in “revenge porn” (nonconsensual nude images of individuals shared by former partners or harassers), the only states that explicitly prohibit nonconsensual deepfake sexual content are California, Virginia, and Texas. As apps like Facemega proliferate, likely aided by social platforms like Meta, we’re simply not prepared for the scope of the damage this could inflict.
In a January 2020 press release, Meta claimed that it was tightening its grip on manipulated content—the company specifically referenced the growing popularity of deepfakes. Meta said it would remove manipulated videos and photos that are edited in ways to intentionally deceive the average person, or in instances where an AI superimposes content onto a video.
“Our policies prohibit adult content regardless of whether it is generated by AI or not, and we have restricted this Page from advertising on our platform,” Meta said in an email.