
Blog | 01 August 2023

Understanding Digital Misogynoir: New report from Glitch

by Emily Barr

The online abuse charity Glitch has released a report on digital misogynoir, shining a light on how misogynoir shows up in online spaces, why tech companies are not doing enough to tackle it, and what we can all do to dismantle it. But first, let’s get back to basics. What is digital misogynoir?

The term ‘misogynoir’ was coined by Black feminist writer Moya Bailey in 2010 to describe the specific type of hatred directed at Black women. Whilst conversations around discrimination often treat gender and race separately, misogynoir highlights how these identities intersect, producing abusive language and hatred rooted in specific tropes about Black women.

Digital misogynoir is the continued, unchecked, and often violent dehumanisation of Black women on social media, as well as through other forms such as algorithmic discrimination. Black women have been raising the alarm about the specific ways in which they are abused online since the 1990s. Glitch’s new report is the first study to examine digital misogynoir across multiple platforms – here’s what they found.

The report looked at over 200,000 highly toxic posts about women across social media and found that posts about women are significantly more toxic than the average social media post. Black women received a disproportionate amount of abuse and were most likely to be racialised through dehumanising language and stereotypes – the most prevalent trope being of ‘the angry Black woman’.

Despite the overwhelming amount of negativity, researchers found that Black online communities create spaces to honour and uplift Black women while also combatting abuse. These spaces serve as supportive environments that foster growth and empowerment.

The hateful rhetoric and insidious jargon of misogynoir are creeping from alternative platforms such as 4chan to more mainstream platforms, emphasising that tech companies need to take greater steps to target the problem. Currently, technology companies are failing to adequately address misogynoir, choosing to scale up their operations rather than create safe spaces with proper content moderation.

Black women deserve joy-centred safety, both online and offline, rather than merely surviving violence. Tech companies should carry out risk assessments for new and existing features, release transparency reports on content moderation data so that they can be held to account for failing to keep Black women safe, and create more comprehensive policies defining misogynoir so that content moderation upholds their anti-discrimination rules.

Whilst Glitch’s report highlights the type of abuse Black women face, much of the iceberg of hate lies hidden beneath the murky waterline of the internet. Researchers did not have access to private groups, and tech companies have not provided data on content moderation, meaning we don’t yet know the full extent of digital misogynoir or the true scale of tech companies’ inaction.

So, what can we do as digital citizens to fight misogynoir?

1. Understand the harmful effects of misogynoir, including the racist/sexist tropes that underpin it.

2. Challenge misogynoir by vocally supporting Black women online. Take care of yourself and others by setting digital boundaries.

3. Follow and listen to Black women, especially amplifying those whose identities intersect in different ways.

4. Demand better from tech companies! Check out Glitch’s work and get involved.

Seyi Akiwowo, Founder and CEO of Glitch, hopes that this evidence will reignite compassion and action against digital misogynoir, encouraging us to be active digital citizens and to mobilise to hold tech companies accountable. Because Black women deserve to feel safe online, and we need to do more to make that happen.

Check out the full report here: Glitch Misogynoir Report