Google Jigsaw Introduces New Tool to Protect High-Risk Online Users

Nearly 40 percent of women have personally been targeted by online violence. A new tool from Google think tank Jigsaw helps high-risk internet users catalog and report harassment and abusive behavior online. (Wikimedia Commons)

A new tool from Jigsaw, a unit inside Google that explores threats to societies and builds technology in response, aims to make the internet a safer place for marginalized, high-risk users: women and people of color. Called Harassment Manager, the free, open-source web app identifies abusive or harmful language and allows users to mute or block senders, as well as document the harassment.

From threats of doxxing to ever-increasing cybersecurity threats worldwide, the internet can be a terrifying place. Women and people of color are particularly susceptible to abuse and harmful language: Activists, journalists and everyday users alike are subject to digital abuse and threats, anonymous or attributed, from all over the world.

In a 2021 study, Jigsaw and the Economist Intelligence Unit surveyed women from 51 of the most online-populated countries to better understand the harassment women were experiencing online. The report found that 38 percent of women surveyed had experienced personal harassment online, and 85 percent of those surveyed reported having witnessed violence against women online. And while online harassment is incredibly prevalent, it is also largely underreported: The same report found that only one in four women reported the harassment to the platform on which it occurred, and 78 percent said they were often unaware of options to report this harmful behavior.

(“Measuring the Prevalence of Online Violence Against Women” study, via the Economist and Jigsaw)

Google Jigsaw’s head of partnerships and business development, Patricia Georgiou, shared the thought process behind creating Harassment Manager and why it matters.


Ramona Flores: In layman’s terms, what does the Harassment Manager do? Who does it aim to protect?

Patricia Georgiou: Harassment Manager is an open source tool built by Jigsaw to help women public figures—such as journalists, activists, and politicians—deal with online harassment. These women face a disproportionate amount of attacks on social media platforms because of who they are and what they say or do. So we built a tool to help them manage the harassment they receive online, starting with Twitter: with the help of our technology, they can delete or hide toxic language directed at them, block users, and document their experience.

Flores: Why is this tool needed, and what gap in resources does it fill? 

Georgiou: According to research from Jigsaw and the Economist Intelligence Unit, 85 percent of women have witnessed harassment or other forms of online violence. Nearly 40 percent of women have personally been targeted by online violence. These numbers are even higher for women whose work is public-facing, like journalists or human rights defenders, and for minority groups. 

Currently, targets of online abuse cannot feasibly sort through a high volume of toxic comments—and we heard from users in our research that there was a need for a tool to support this. Harassment Manager enables these targeted women to take action and manage their Twitter feed. And by removing this toxic content, it also helps reduce the “chilling effect” that such attacks would otherwise have on everyone else.

Flores: What is the importance of documenting harassment online? 

Georgiou: Many of the journalists and activists we interviewed in our research said, “The first thing I do when I get harassed online is document it.” Having this evidence helps them get support from their employers, their networks, the platforms on which they had the experience, and law enforcement. But taking manual screenshots of every negative comment can be time-consuming and emotionally draining, so Harassment Manager helps them do that in a more automated way.

Documenting these experiences can also reveal larger trends, such as the increasing harassment of groups or communities like journalists or minorities.


Flores: When considering access and distribution, what processes and thoughts were behind making the code open source? 

Georgiou: We are open sourcing the technology so that anyone can use it and adapt it to the specific needs of their communities. For example, the Thomson Reuters Foundation will be launching this tool for their network of journalists around the world, and other news organizations are interested in using it too. But the needs of each community might differ based on regions, activities, the platforms that they use, etc. So open-sourcing the code provides that flexibility—you can adjust the features as you see fit. And it also allows for scale—anyone in the world can pick up the code for free.


Flores: What are your plans to expand to platforms beyond Twitter? 

Georgiou: The Twitter version of this tool is just the first step. Now that the code for Harassment Manager is open sourced, our hope is that other ecosystem players can adopt it, adapt it, expand it. It could be expanded for cross-platform use, for example—a lot of the journalists and activists that we talked to mentioned that when they’re targeted by organized or “coordinated” campaigns of online attacks, these attacks span multiple social platforms simultaneously.


Flores: Why should people be invested in the success of tools that record and address hateful behavior online? 

Georgiou: Toxicity and online harassment marginalize important voices, particularly those of political figures, journalists, and activists. According to research by the IWMF, 70 percent of women journalists receive threats and harassment online, and more than 40 percent have stopped reporting a story as a result. This can translate to real-world violence: We have heard stories of journalists who were assassinated because they uncovered a story of government corruption, for example.

So in addition to the significant impact on individual women, this harassment can have negative effects on our democracies if important sources of information and different points of view are silenced by intimidation.

A tool like Harassment Manager can help reduce this online toxicity and its chilling effect.


Flores: What is something you’d like to share with the Ms. audience? 

Georgiou: Building this tool is a part of Jigsaw’s broader mission to protect democracies through technology. We have other ongoing efforts to combat online violence against women—like training thousands of women on digital safety tools, or supporting other organizations who focus on this issue. We also have other tools that can help reduce online toxicity and cyber attacks. We hope our work inspires more technology, civil society and political responses to uphold human rights around the world.


About

Ramona Flores is an editorial fellow with Ms. and is completing her undergraduate studies at Smith College, with a double major in government and the study of women and gender. Her academic focuses include Marxist feminism, transnational collective organizing and queer history. Her writing covers internet subcultures, reproductive care advocacy and queer theory. She hails from Austin, Texas.