“Deepfake porn” is on the rise. Here’s how it harms the people it purports to depict.
With rapid advances in AI, the public is increasingly aware that what you see on your screen may not be real. ChatGPT will write everything from a school essay to a silly poem. DALL-E can create images of people and places that don’t exist. Stable Diffusion or Midjourney can create a fake beer commercial—or even a pornographic video with the faces of real people who have never met.
So-called “deepfake porn” is becoming increasingly common. Deepfake creators take paid requests for porn featuring a person of the buyer’s choice, and a plethora of fake not-safe-for-work videos circulate on sites dedicated to deepfakes.
The livestreaming site Twitch recently released a statement against deepfake porn after a slew of deepfakes targeting popular female Twitch streamers began to circulate. Last month, the FBI issued a warning about “online sextortion scams,” in which scammers use content from a victim’s social media to create deepfakes and then demand payment not to share them.
Sophie Maddocks, a doctoral student in the University of Pennsylvania’s Annenberg School for Communication, studies image-based sexual abuse, like leaked nude photos and videos and AI-generated porn.
Here, Maddocks talks about the rise of deepfake porn, who is being targeted, and how governments and companies are (or are not) addressing it: