Facebook Still Has Far to Go

When Ijeoma Oluo, Editor-at-Large of The Establishment, shared screenshots of the hateful messages she was receiving on Facebook, she considered it an act of protest against her harassers as well as the social network itself, which had failed to properly handle the harassment or equip her with the resources to block and report the users responsible.

In an ironic but unsurprising turn of events, Facebook then suspended her account.

In a Medium post, Oluo details her experiences with Facebook’s failure to properly handle racist hate speech—and its twisted, contradictory response.

My facebook page is infested with racist hate and violent threats from people who are so angry that I would be nervous to be surrounded by them.

So after getting absolutely no help from facebook whatsoever, I started posting screenshots of the comments and messages I was getting. The ones you are seeing above, and more. If you send me a message saying that you hope I get hit by a bus, or pushed off the Grand Canyon, and facebook absolutely refuses to hold you accountable, the least you deserve is for people to see the hate you are spreading.

And finally, facebook decided to take action. What did they do? Did they suspend any of the people who threatened me? No. Did they take down Twitchy’s post that was sending hundreds of hate-filled commenters my way? No.

They suspended me for three days for posting screenshots of the abuse they have refused to do anything about.

In this case, the harassment stemmed from a tweet. While on a road trip through the American West, Oluo posted a tweet joking about feeling unnerved at a Cracker Barrel full of white people—a restaurant whose decor she felt romanticized an era characterized by slavery, and that has been caught overtly mistreating black customers.

Soon after, Oluo was inundated with messages hurling racial slurs, death threats and rape threats. Dubbing this flurry of hateful responses “#crackerbarrelgate,” Oluo documented them and began reporting the users. To her relief, Twitter took down most of the threatening and racist responses she reported and locked the accounts of many of the offenders. “With the addition of the quality filter that blocks out the majority of hate from reaching my updates, it’s almost as if I don’t have hundreds of angry white people calling me a fat gorilla on there,” Oluo wrote in her Medium post.

Facebook, however, failed to remove the comments and messages that began flooding Oluo’s account as the harassment she was receiving on Twitter spilled over onto the other network. Because she was traveling and using Facebook on her phone, she also found her ability to report and block users limited.

Although Facebook updated its community standards in 2015, many users have seen little change. That’s because the problem lies not in the formal guidelines but in the way misogynistic, racist and transphobic norms are ingrained into Facebook’s version of free speech, from its “real name” policy that excluded transgender people to its repeated determinations that abuse like what Oluo experienced does not violate community standards.

“Facebook is failing people of color, just as they are failing many feminists and transgender people, in punishing them for speaking out about abuse,” Oluo wrote. “And they need to be held accountable.”

According to Oluo, Facebook has since apologized for the way it handled her situation. But this isn’t the first time Facebook has failed users who have been targeted by hate speech and harassment—particularly women and people of color—and it likely won’t be the last. The instances in which Facebook hasn’t removed reported content on the grounds that it didn’t violate community standards—from animal abuse to “rape joke” pages to violent racist threats like those Oluo received—are well-documented. Activists, feminists and others attempting to counter hate speech or organize on Facebook have also been blocked from their own accounts time and time again.

Putting an end to the toxic racism and sexism that permeates digital spaces is a long process, and it isn’t a battle that will be won overnight. But Facebook could get a strong head start by taking hateful and threatening posts and messages seriously instead of penalizing the users who call them out.

About

Maddie Kim is a former Editorial Intern at Ms. studying English and creative writing at Stanford. Her poetry and prose have been recognized by the Norman Mailer Center, Princeton University, Sierra Nevada Review and Adroit Prizes. She is a prose reader for The Adroit Journal. When she’s not writing, she likes tap dancing and taking blurry photos of her dogs. You can find her on Instagram and Twitter.