November 26, 2021 - 4:30pm

This week Vice World News ran a piece entitled “TikTok Has an Incel Problem”, documenting the platform’s apparent failure to remove instances of hate speech connected with incels, a community I’ve spent the past several years studying. The author writes:

…there’s concern the platform isn’t doing enough to counter misogynistic comments and hate speech, some of which is associated with groups with a history of real-world violence against women, such as incels.
- Sophia Smith Galer

The piece centred on an August report from Ciaran O’Connor at the Institute for Strategic Dialogue (ISD) examining hate speech on TikTok, which revealed “numerous examples” of misogyny, including videos like “3 worst types of cockblockers” and “Do you think makeup is a form of lying?” It also showed videos featuring men giving aggressive dating advice or disparaging women’s educational achievements.

It goes on to assert that TikTok must be aware of this “incel problem,” but has failed to either acknowledge it or remove all offensive content.

The headline is misleading; much of the content described in the piece is rude and misogynistic, but unaffiliated with any male supremacist ideology. Where an ideology is involved, it is unrelated to incels, a distinction the author herself makes early in the piece when defining the #redpill hashtag, which she explains is used in “red-pill adjacent spaces” by Men Going Their Own Way (MGTOW), another group within the manosphere, and is allowed on TikTok. This kind of taxonomy may sound like splitting hairs: an excuse for differentiating groups of toxic misogynists who should all be censored. But even if that is one’s position, it should not be done under the guise of preventing violent extremism.

Unlike incels, red-pilled communities like MGTOW or Pick-Up Artists (PUA) have never been connected with real-world violence, a fact likely understood by researcher Ciaran O’Connor. Still, he calls for clarification:

Regarding why some incel-adjacent lexicon is allowed, and why others aren’t, it would be encouraging if TikTok were to share publicly, or in a limited capacity with researchers/media to help us understand, what definitions it uses for terms that have a clear link to hateful ideologies. […] Essentially — greater transparency.
- Ciaran O’Connor, Institute for Strategic Dialogue

It is somewhat ironic that “greater transparency” is expected of the platform when researchers of hate and extremism work so gleefully under the cover of night. Just a couple of weeks earlier, another piece from Vice News, a short film titled “Hunting Down Incel Extremists”, featured author and researcher Julia Ebner, also of ISD, describing her method of going “undercover” to infiltrate incel spaces. Ethical considerations aside, I question the results obtained from such an endeavour. In my own research I’ve found that “incel spaces” are filled with bravado and ironic humour. By contrast, incels have been quite candid in their conversations with me, without any pretence.

This cloak-and-dagger approach to incels only erodes trust between researchers and a community already leery of their motives, and spreads panic among a public already living in fear. Experts are meant to be the measured, rational voice in a public discourse prone to inflammation, not to feed right into it.

Journalists, for their part, are meant to judiciously consider which stories warrant the time and expense, and how to present them effectively and responsibly. Perhaps those days are over. But so much research and reportage relies on a massive dragnet that cannot distinguish jokes in bad taste from genuinely violent extremist sentiment. Beyond the sheer waste of it all, this has significant implications: if we obsessively focus on trivial and ambiguous speech offences in an effort to signal our virtue, we lose sight of the serious harms that face us.

Naama Kates is a writer, producer, and creator of the “Incel” podcast.