If It Can Happen to Taylor Swift, It Can Happen to Any of Us

Taylor Swift was the victim of our lawless internet, where platforms can entirely evade a duty of care to their users. So were countless women before her.

Taylor Swift at M&T Bank Stadium on Jan. 28, 2024, in Baltimore, Md. (Patrick Smith / Getty Images)

A few days ago, TIME’s Person of the Year was the victim of a deepfake pornographic attack.

For many women, the story was not revelatory; it was a traumatic reminder. Their hearts beat faster, their stomachs churned as they remembered seeing their own faces plastered on deepfake pornography. Some women likely felt a little angry, because when their worlds were upended, no one said a word.

Today, the law remains silent for Swift, even while the world shouts. 

A few facts differentiate Swift’s experience from those of other women and girls. While Swift has legions of fans ready to come to her immediate defense and demand that the posts be taken down, most women do not have an army at the ready. Most women do not have the time or money for litigation, and few firms offer pro bono help for victims of online abuse. 

That said, Swift likely experienced the same nauseating feeling that many other women did when she saw her face plastered on nude bodies and virtually defiled by the public. And Swift’s lawyers will struggle to find satisfactory legal recourse. Taylor Swift was the victim of our lawless internet, where platforms can entirely evade a duty of care to their users. So were countless women before her.

While the mainstream media hasn’t been covering this subject, scholars—mostly female—have been paying close attention and have conducted groundbreaking research. Many of these women have themselves been subjected to attack since taking on this fight.

Nina Jankowicz is one of the leading scholars of state-sponsored disinformation and gendered online abuse. After being tapped by President Biden to lead the Disinformation Governance Board, Jankowicz became the target of online abuse. She received rape and death threats and was the subject of deepfake pornography. The Biden administration disbanded the board, and the far right celebrated its successful takedown of Jankowicz.

Danielle Citron, privacy expert and University of Virginia professor of law, has written extensively about privacy harms and the online abuse of women. In an article about deepfakes, Citron and Bobby Chesney, dean of Texas Law, wrote that deepfake sex videos add a new dimension to rape threats, as the threats take on a new virtual reality.

Part of the problem with deepfakes is that they stick with us, long after we click away. In her book, The Fight for Privacy: Protecting Dignity, Identity, and Love in a Digital Age, Citron explained that people are 80 percent more likely to remember a photograph than to remember text. Our eyes cannot unsee an image, whether fake or real.

Online abuse sticks with victims viscerally, and can have physiological, psychological and life-altering consequences for them. For women of color, and religious and sexual minorities, gendered violence is compounded by racist and xenophobic attacks. 

Deepfake sex videos of female journalists and politicians of color are particularly pernicious. 

If you’ve heard about a female journalist subjected to vicious online abuse, it was likely Rana Ayyub—one of India’s most recognized journalists, renowned for her reporting and now known as much for the online violence she has endured. 

In April 2018, a deepfake porn video of Ayyub went viral and was widely distributed, including on the Facebook page of the chief minister of India’s largest state. Ayyub was called a “presstitute” and an “ISIS sex slave,” and Twitter mobs called for her to be gang-raped. After the video went viral, Ayyub was doxed, meaning her personal information, including her address and phone number, was published online.

Ayyub told reporters at the Washington Post that seeing the video made her seriously ill. The experience was different from threats; it was “uniquely visceral, invasive and cruel.”

Ayyub was one of the women chronicled in the International Center for Journalists’ (ICFJ) report on female journalists who have suffered intense, prolonged and coordinated attacks. In addition to Ayyub, the ICFJ has issued reports about female journalists in Mexico, the Philippines, the U.K. and Arab States. Many of these journalists work in patriarchal, autocratic environments where the governments themselves propagate the deepfakes meant to humiliate and discredit them.

Attacks against Maria Ressa, a journalist from the Philippines, included sexist, misogynistic and explicit abuse; threats of sexual and physical violence; racist abuse; and homophobic abuse.

In the case of Ghada Oueiss, abusers circulated a doctored photo of Oueiss in a bikini, provoking an onslaught of attacks. Oueiss contacted Facebook to alert them to the disinformation and was informed that the content “did not breach the platform’s community standards.” 

What kind of “community standards” do we live by, if they cannot protect a woman from salacious disinformation that leaves her vulnerable to death threats and rape threats?

Anu Bradford wrote in a Foreign Affairs essay that Europe has an edge over America and China: While the U.S. digital empire has followed a market-driven approach, the E.U. is pursuing a rights-driven approach. Europe has decided that “the AI transformation has such disruptive potential that it cannot be left to the whims of tech companies but must instead be firmly anchored in the rule of law and democratic governance.” Bradford noted that Australia, Brazil, Canada and South Korea are modeling their laws after European regulations. 

In the United States, our hyper-fixation on the First Amendment has resulted in overprotection of abusive content and a gaping hole in place of law to protect victims. We protect companies with the iron shield of Section 230, granting them immunity for all content hosted on their platforms. We do so even though these companies have failed to protect female users, leaving them with no recourse to respond to abuse. By promoting freedom of speech at all costs, we have democratized violence and penalized women who participate in democracy.

Mary Anne Franks and Danielle Citron lead the Cyber Civil Rights Initiative to combat online abuse that threatens civil rights and liberties. They have endorsed HR 3106, the Preventing Deepfakes of Intimate Images Act, proposed by Rep. Joe Morelle (D-N.Y.). On a domestic level, this legislation is crucial: It prohibits the non-consensual disclosure of digitally altered intimate images, makes the sharing of these images a criminal offense, and creates a private right of action for victims to seek relief. Importantly, it would also make the United States a leader on the global stage, one that recognizes a duty to protect women and promote democracy.

Though it took an outrageous attack on the world’s most famous woman, eyes have been opened to this problem. Let’s hope this one sticks with us.


About

Kristin O'Donoghue is a fourth-year student at the University of Virginia. She works as a research assistant for UVA Professor of Law Danielle Citron, focusing on gendered disinformation. Her work has appeared in publications based in Charlottesville, Va., and New York City.