It’s Tough Being a Woman Online. Section 230 Makes It Even Harder.

An estimated 85 percent of women and girls globally have experienced some form of online harassment and abuse (Photo Illustration by Rafael Henrique/SOPA Images/LightRocket via Getty Images)

On Jan. 26, 2022, Matthew Hardy, a 30-year-old man from Northwich, England, was sentenced to nine years in prison for the online harassment of at least 62 women over the course of 11 years. 

Andrea Yuile, one of his earliest victims, said Hardy started his online harassment campaign against her while she was in high school. During this time, Hardy’s then-anonymous messages provided Yuile with tidbits of gossip, including information about her boyfriend cheating on her—sending Yuile into a panic and ultimately ripping apart her friendships. 

Hardy’s most vicious attack came after Yuile’s mother died; he sent her a message alleging her mother had been unfaithful to her father, and threatened to reveal this to her father while he was grieving. Eventually, after other girls from her school revealed they were receiving similar messages, Yuile compared notes with them and identified the perpetrator as Matthew Hardy. She went straight to the police with her findings.

Hardy’s behavior is not unusual: About 85 percent of women and girls globally have experienced some form of online harassment and abuse. Black women are 84 percent more likely to encounter online hate than white women. 

Men, too, experience online violence. But they are more likely to experience name-calling and physical attacks, a 2017 Pew Research Center survey found, while women online are more likely to experience attacks that are sexual in nature—much like Yuile and her fellow victims.

Hardy’s victims chose to report their abuser and seek prosecution from law enforcement. But too many other victims of online harm still await accountability from the tech platform companies that allow this type of harassment to take place. 

Gonzalez v. Google 

A series of coordinated terrorist attacks in Paris in 2015, carried out by gunmen and suicide bombers, resulted in the death of 23-year-old U.S. citizen Nohemi Gonzalez, who was studying abroad in France, as well as the deaths of 129 others. ISIS claimed responsibility for the attacks.

The next year, Gonzalez’s family sued Google’s YouTube and other tech companies, arguing that since the platforms’ algorithms suggested the content that radicalized the perpetrators of the attack, the companies were complicit in Gonzalez’s death. These companies should be responsible for keeping terrorist content off their sites, the lawsuit argued.

Last month the U.S. Supreme Court heard oral arguments in the case, Gonzalez v. Google LLC—“the first Supreme Court case to consider the scope of Section 230 of the Communications Decency Act, which immunizes websites from legal liability for content provided by their users,” according to the ACLU.

Jose Hernandez and Beatriz Gonzalez, stepfather and mother of Nohemi Gonzalez, who died in a terrorist attack in Paris in 2015, outside of the U.S. Supreme Court following oral arguments in Gonzalez v. Google on Feb. 21, 2023—a landmark case about whether technology companies should be liable for harmful content their algorithms promote. (Drew Angerer / Getty Images)

To be sure, there is no evidence that the terrorists who claimed responsibility for the 2015 Paris attacks watched YouTube prior to the attack. However, researchers have found that algorithmic recommendation systems can radicalize vulnerable people. Consider that 70 percent of the content viewed on YouTube is suggested by its algorithm, and the Gonzalez family’s argument becomes clearer. 

Section 230 and the Social Media Business Model

Google vehemently denies responsibility for radicalizing the perpetrators, and ultimately for the results of the attacks. The company’s lawyers argue YouTube and other platforms have the legal right to host videos displaying extremist content because they are protected by Section 230.

The two clauses that shield tech companies from liability over user-generated content are:

  • 230(c)(1), which protects platforms from legal liability for harmful content posted on their sites by third parties, and
  • 230(c)(2), which protects platforms from liability for good-faith efforts to restrict or remove objectionable content—without requiring that they remove anything. 

Combine these legal protections with the fact that social media platforms make money by selling advertisers the ability to target niche online communities, and all user-generated content becomes a commodity—including content that harms women. This is why Gonzalez v. Google is so important to women and nonbinary femmes. 

Beyond 230: Creating Feminist Futures and Protecting the Ballot

During oral arguments in Gonzalez v. Google, Justice Clarence Thomas suggested the YouTube algorithm could not be blamed for the recommendation of ISIS videos because it was simply trying to predict what the user wanted to watch.

Justice Elena Kagan followed up by asking Eric Schnapper, the lawyer representing the Gonzalez family, whether he thinks Facebook feeds, Twitter feeds and search engines should lose Section 230 protections because they make recommendations. Schnapper responded that all of those should be open to lawsuits if they recommend harmful speech. Kagan then suggested that position may be too extreme for the Court.

Modifying Section 230 has bipartisan support. Even so, our collective technological fate rests in the hands of judicial appointees—who, in this case, do not seem inclined to make sweeping changes to the existing tech laws, Section 230 among them, that allow the online harassment of women across the globe.

The only way to ensure positive feminist futures is by controlling who gets into the White House and Senate—the two bodies with the power to make federal judicial appointments. And the only way to influence who sits in these institutions is by protecting the Black vote. After all, Black women are the most reliable segment of the U.S. electorate, and they overwhelmingly support social change candidates. In 2008, 96 percent of Black women voted for Barack Obama, America’s first Black president. In 2016, 94 percent of Black women supported Hillary Clinton, vying to be America’s first woman president. We cannot solely rely on white women to deliver this because of their history of voting for regressive candidates: In 2016, 47 percent of white women voted for Trump; in 2020, 53 percent of white women voted for Trump—an increase of six points.

Then, once progressive elected officials take office, we have to keep the pressure on them to prioritize online safety by demanding they appoint federal judges with a proven track record of placing the public good ahead of the profitability of tech companies—so that when cases like Gonzalez v. Google come to the bench, there will be a cohort of jurists who consider the constitutional and social implications of advanced technological systems in society. The only way to protect American women from online harassment is to make sure our judicial appointees believe online safety is a human right. 

Up next:

U.S. democracy is at a dangerous inflection point—from the demise of abortion rights, to a lack of pay equity and parental leave, to skyrocketing maternal mortality, and attacks on trans health. Left unchecked, these crises will lead to wider gaps in political participation and representation. For 50 years, Ms. has been forging feminist journalism—reporting, rebelling and truth-telling from the front lines, championing the Equal Rights Amendment, and centering the stories of those most impacted. With all that’s at stake for equality, we are redoubling our commitment for the next 50 years. In turn, we need your help. Support Ms. today with a donation—any amount that is meaningful to you. For as little as $5 each month, you’ll receive the print magazine along with our e-newsletters, action alerts, and invitations to Ms. Studios events and podcasts. We are grateful for your loyalty and ferocity.


Mutale Nkonde is tech columnist for Ms. and founder of AI for the People, a nonprofit that seeks to use popular culture to increase support for policies to reduce algorithmic bias, and an unabashed Black feminist. Learn more about her work here. Follow her @mutalenkonde on Twitter and @mutalenkonde2 on Instagram.