TikTok Bans Misogyny, the Misgendering and Deadnaming of Trans Users, and the Promotion of Conversion Therapy


TikTok, a social media platform with over 1 billion monthly users, is taking new steps to protect LGBTQ and women users from harassment and hate speech.

The most recent update of its Community Guidelines bans any content found to contain misogyny, as well as the misgendering and deadnaming of transgender creators. Released Tuesday, the new guidelines also clarify that any content promoting or supporting so-called conversion therapy violates the rules, as does content promoting disordered eating.

“Though these ideologies have long been prohibited on TikTok, we’ve heard from creators and civil society organizations that it’s important to be explicit in our Community Guidelines,” Cormac Keenan, TikTok’s head of trust and safety, wrote in a blog post about the changes. “On top of this, we hope our recent feature enabling people to add their pronouns will encourage respectful and inclusive dialogue on our platform.”

The decision to ban anti-trans content is of particular note, as an October report from Media Matters showed a deep connection between transphobia and other kinds of far-right extremism—which researchers found was then being reinforced by TikTok’s algorithm. “Exclusive interaction with anti-trans content spurred TikTok to recommend misogynistic content, racist and white supremacist content, anti-vaccine videos, antisemitic content, ableist narratives, conspiracy theories, hate symbols, and videos including general calls to violence,” wrote researchers Olivia Little and Abbie Richards. The recent change from TikTok will no doubt lessen the probability of users falling down what Little and Richards call “far-right rabbit holes.”

LGBTQ advocacy organizations like GLAAD applauded the change to the guidelines. “When anti-transgender actions like misgendering or deadnaming, or the promotion of so-called ‘conversion therapy,’ occur on platforms like TikTok, they create an unsafe environment for LGBTQ people online and too often lead to real world harm,” said Sarah Kate Ellis, president and CEO of GLAAD. “TikTok’s move to expressly prohibit this harmful content in its Community Guidelines and to adopt recommendations made in GLAAD’s 2021 Social Media Safety Index raises the standard for LGBTQ safety online and sends a message that other platforms which claim to prioritize LGBTQ safety should follow suit with substantive actions like these.”

The past decade has seen a rise in trans and gender non-conforming visibility and representation. Even so, GLAAD’s Social Media Safety Index called the problem of anti-LGBTQ hate speech and misinformation “a public health and safety issue.” More anti-trans bills were introduced in state legislatures in 2021 than in any previous year on record, and trans advocates project that 2022 could bring even more discriminatory legislation.


Trans creators make up no small portion of the app’s user base: popular trans influencers such as Ve’ondre Mitchell have garnered over 300 million likes on their content, part of a move toward radical authenticity as a method of political and social resistance. This shift in TikTok’s priorities marks a positive push toward normalizing trans people and stigmatizing transphobia and homophobia. But it is just the first of many necessary steps to protect trans people, both on and offline.

Ramona Flores is an editorial fellow with Ms. and is completing her undergraduate studies at Smith College, with a double major in government and the study of women and gender. Her academic focuses include Marxist feminism, transnational collective organizing and queer history. Her writing covers internet subcultures, reproductive care advocacy and queer theory. She hails from Austin, Texas.