Tell Legacy Media: Center Survivors in AI Deepfake Reporting
The petition to the editors and leadership of major legacy media organizations, including The New York Times, CNN, The Washington Post, and BuzzFeed, reads:
Do more to center survivors of deepfake abuse in your coverage of the issue.
AI-generated sexual imagery and videos (which we will call deepfake abuse for short) are a rapidly growing form of image-based sexual abuse (IBSA) that overwhelmingly targets women and girls. And yet legacy media coverage, including coverage by your outlet, too often focuses on the technology, the perpetrators, and the sensational headlines instead of the survivors being harmed.
This framing causes real harm. Researchers have warned that coverage often emphasizes the novelty or spectacle of AI-generated imagery rather than its real-world impacts on survivors. The frequent use of terms like "deepfake porn" also minimizes the reality that these images are created and distributed without consent.
We urge you to improve your reporting on AI-enabled sexual abuse by:
- Centering the voices and experiences of survivors in reporting.
- Stopping the use of harmful terms like "deepfake porn" and other language that minimizes the abuse.
- Treating deepfake abuse as a serious form of gender-based violence and digital exploitation.
Responsible reporting can help shift public understanding, support survivors, and hold technology platforms accountable. We urge you to report on this crisis with care, accuracy, and respect for those most harmed.
When Olympic figure skater Alysa Liu competed in the 2026 Winter Olympics, millions watched her skate. At the same time, anonymous users on 4chan were using publicly available photos of Liu to generate fake nude images with AI – and the media ran with the story, further spreading the images and using minimizing language like "deepfake porn."
This is part of a growing crisis. AI tools are being used to create sexual deepfakes of women and girls without their consent. In fact, it's estimated that at least 90% of deepfakes online are sexual — and roughly 90% of those target women and girls.
Yet media coverage often sensationalizes these stories or uses terms like “deepfake porn,” which minimizes the reality that this is a form of image-based sexual abuse.
Media organizations have the power to shift the narrative. By centering survivor voices and reporting on the real harm caused by AI-generated sexual abuse, journalists can help hold tech companies accountable and push for change.
Sign the petition to demand responsible reporting that treats this crisis with the seriousness it deserves.