A national legislative framework targeting AI-generated explicit and nonconsensual content would enable a more comprehensive approach to addressing the causes and consequences of deepfake sexual violence.
Every time you read the news, there’s another story about AI, deepfake abuse, and the ongoing debate on how to regulate these rapidly advancing technologies. It is deeply unsettling to encounter story after story of online sexual violence driven by AI-generated deepfake technology. Women represent 99 percent of those targeted by deepfake “pornography,” which makes up 98 percent of all deepfake videos online. Urgent action is needed, and effective legislation is a critical starting point.
Thousands of women and girls have already experienced this form of gender-based violence, and the harm is exacerbated by the growing accessibility and sophistication of the technology. In 2023 alone, the volume of deepfake abuse videos surpassed the total of all previous years combined, with the number of nonconsensual videos doubling annually.
Those nonconsensual images are created and shared with the goal of humiliating and degrading the women and girls in them. The fallout is immense, and it goes beyond personal harm. The silencing effect leads to people stepping back from vital arenas like politics, journalism and public discourse. But that’s the point of this misogyny, isn’t it? It’s gender-based violence at its core.
Policy and technology solutions must center the experiences of survivors. Those who have lived experience have unique insights that can shape legislative and technology policies to combat the harms of nonconsensual content. However, these voices are often sidelined.
The Reclaim Coalition to End Online Image-Based Sexual Violence, powered by Panorama Global, is a global network that integrates survivor leadership into policy discussions worldwide. Without comprehensive legislation informed by survivors, we are all at risk, particularly the upcoming generation. The absence of adequate legal protections fails survivors, who have little to no access to justice, and exposes the broader population to a growing threat.
But legislation marks just the beginning of the solution. Genuine technological accountability, responsible AI development, and safeguarding the right to online privacy require more than a single federal law. However, enacting such legislation is a critical first step, and there is precedent that legislation can catalyze more widespread change.
Consider how the 1994 Violence Against Women Act (VAWA) resourced the gender-based violence and domestic violence fields, and how the Trafficking Victims Protection Act (TVPA) provided a foundation for coordinated anti-trafficking efforts.
A national legislative framework targeting AI-generated explicit and nonconsensual content could unlock vital resources, funding and accountability needed for headway against online sexual violence. This would enable a more comprehensive approach to addressing the multifaceted causes and consequences of deepfake sexual violence.
In the U.S., we’ve seen modest progress. Approximately 10 states have passed laws addressing nonconsensual deepfake content, with varying levels of protection.
Globally, others are ahead of the curve, with national and regional legislation already enacted across the European Union.
Despite encouraging strides, such as the progression of proposed bills like the DEFIANCE Act, the SHIELD Act and the Preventing Deepfakes of Intimate Images Act through Congress, the U.S. federal government has yet to reach a lasting, bipartisan consensus. As a result, perpetrators who create, share and distribute explicit and nonconsensual deepfake content face little to no deterrence or accountability.
While debating the merits of each proposed policy—such as creating a private right to sue in civil court versus making this conduct a criminal offense—is important, lawmakers must listen to survivors and expedite the process. Each day that passes without legislation means more women and girls are targeted. Legal protections are foundational to ensuring that women and girls are not forced out of spaces they have a right to be in.
It’s time to amplify our voices and demand action from our leaders. We must urge them to take immediate steps to protect all women and girls from the growing threat of nonconsensual deepfake content. No survivor should be left to endure this injustice alone.