AD: Welcome to Right Rising, a podcast from the Center for the Analysis of the Radical Right. I'm your host, Augusta Dell'Omo. Today I'm joined by Dr. Julia deCook, an Assistant Professor in the School of Communication at Loyola University Chicago. She's here with us today to talk about what exactly incel culture is and how we should understand its place in society. Julia, thanks for being here today. JD: Yeah, thanks, Augusta. I'm really glad to be on the podcast. AD: So I wanted to start off with a big, broad question. When scholars, the media, and policymakers use this term "incel culture," what does that actually mean? JD: I think it's become a little bit complicated, especially with how prominent incels have been in the media lately. I think it's a term that's almost lost its meaning, and it's become kind of a stand-in for referring to anyone who's misogynistic, whether they identify themselves as incels or not. And I think that can actually complicate matters when it comes to researching these groups, because there are actual incel communities online, and men who espouse misogynistic views online aren't necessarily incels. And so I think that can become a little bit tricky, right? It's almost like that whole thing: all incels are misogynists, but not all misogynists are necessarily incels. AD: Right. JD: The term stands for "involuntary celibate," and it came out of a broader internet subculture that actually started in the 90s, with Alana's Involuntary Celibacy Project. So it was started by a woman, and it was more of a Lonely Hearts Club kind of situation. But then it morphed and evolved into the incels that we've come to know today. They were really popular on a 4chan message board, and they had a pretty prominent community on Reddit, which got shut down. But they're everywhere, right?
They do have formal communities: there is the incels forum, there are incel communities scattered here and there, they have YouTube channels and everything else. But it's become convoluted, I think, because anytime anyone's misogynistic, automatically they're labeled an incel. But that's not necessarily what it is. And so it's becoming a little bit muddy in terms of what incel culture is. AD: Right. And so I think it's important, like you said, to narrow down this wider, more expansive definition of incel to this particular, smaller group that you see on message boards, that was on Reddit. So what do incels, in that very narrow sense, actually want? What are the kinds of things that they're talking about? JD: Well, I think a lot of people have come to know incels by their entitlement to women, right? They say things like, "I as a man deserve to have a girlfriend or wife, I deserve to have sex, I'm entitled to women's bodies," and everything else, which isn't something that's unique to incels. It's something that a lot of men believe, incel or not. But incels take it a step further, right? Their misogyny is outrageous in certain ways. When you look at incel forums, they advocate for violence against women, they advocate for legislation that would force women to date incels. On their forums they say things like, oh, if women want to make sure that there are no more incel attacks, then they should sleep with us, things like that. And so they express their misogyny in a much grosser, more explicit way than a lot of men in this society who hold some of these views, right. But the important thing to point out is that incels get these beliefs and ideas from somewhere.
And so, when you look at our culture and our media, and at how much media and society push romantic and sexual success as something that people should aspire to, it's not hard to see why incels may become extremely bitter. But the thing is, they take that anger and direct it towards women, saying that women are the cause of their misery, rather than thinking about the larger societal structures that may be making them miserable, or the fact that they might just be huge jerks who women don't want to be around. So it's the "nice guy" fallacy, right? AD: Right. And where is this actually happening? You've talked a lot about message boards. The stereotype a lot of people have of the incel is that he's just sitting at home on his laptop, writing these horrific misogynistic rants on message boards. Is that an accurate depiction of what incel culture is doing on the internet? Are they organizing? Are they sharing content? Are they forming connections with other organizations? JD: I wouldn't say that they're forming connections with other organizations. And I think that's what makes incels not only difficult to define, but also difficult to study, right? They're not going to go off and start an incel rights nonprofit. They're not as formalized as, say, the men's rights movement, which has been getting more attention in the news recently because of the Roy Den Hollander shooting and suicide. The men's rights movement actually has organizations; they have lawyers who identify themselves as men's rights lawyers and stuff like that. But movements like the incels, even the incels themselves deny that they're a social movement.
They're not an organization, and I think they find something powerful in being so loosely connected and networked that it's hard to pin them down, and hard to pin down who might be an incel as a result. So even for the biggest incels forum, it's hard to ascertain how many registered users are actually real users. And that's always been a problem in online communities, right? Just because it has, say, 10,000 registered users, only about 1,000 of them are going to be active. And within that 10,000, or even that 1,000, some of them might just be seeking to gain access to the community, right? Or they're just morbidly curious and want to lurk. And so it's complicated to do research on strictly online movements like the incels because of this. AD: I think that last point is so critical, especially because a lot of these groups want to appear more powerful than they are, right? It's a good thing for them to show, oh, we have 10,000 members. But then, as you said, when you actually start looking at who is active on these pages, it's a smaller community than it maybe appears. And I think this draws into a point that you brought up earlier about de-platforming. Many policymakers have called to de-platform far-right groups, and there's been a lot of discussion on Twitter about misinformation and abusive rhetoric. In your research, how effective has de-platforming been? And can you walk us through some examples where websites have tried to de-platform these extreme right organizations? JD: Yeah, definitely. So my dissertation actually looked at how three groups in the "manosphere" responded to being de-platformed.
And how they reacted to censorship and bans, even just the threat of censorship and bans, and how that forced them to innovate in ways that they probably wouldn't have if they weren't under that kind of duress. And so, in the case of incels, since we're already talking about them, incels were one of the groups that I studied for my dissertation. And it's really interesting: incels always knew that they were being watched. They always knew that they had lurkers in their midst who weren't actually members of the community. They knew that they attracted a lot of very negative attention. There was a subreddit called "Incel Tears" that literally just existed to point out how awful the incel community was. And so incels knew that they were being watched on Reddit, for instance. I actually started from Reddit, because 4chan is anonymous and very loosely structured; there's no formal incels board on 4chan. So I started from Reddit, and the incels subreddit was hyper-aware of the fact that at any moment they could be removed, because there were actual petitions and movements and subreddits that tried to pressure the Reddit administration to ban subreddits like incels, among other hate groups that had subreddits, like "The Donald" and "Alt-Right" and other communities that we've become very familiar with. But in the case of incels, they knew that they were under threat. And so their tactic was to go private. Subreddits can go private, where you have to be invited to join. But even that didn't save them from the ban hammer, right. Eventually, they were just completely banned, but they had done nothing to back up their content, like other communities have.
So for instance, "The Red Pill," and even "The Donald" to give a more recent example, made mirror sites off the Reddit platform, where a lot of users migrated before the subreddit was banned. So in the case of "The Donald," the subreddit was banned not because of its content, but because it was inactive, and I think that got missed a lot in some of the coverage of it. But "The Donald's" users knew that something was coming, and the subreddit had been inactive since, I think, last year. But you can go find the mirror site of "The Donald" subreddit, and it looks exactly the same. It's kind of eerie, but Reddit's source code is open, and so they basically took that source code and recreated their subreddit on a separate domain, so they weren't beholden to Reddit's rules anymore. And in the case of incels, they managed to connect with one another because, I think, they were communicating with each other via private messages and so on. Because just because a subreddit gets banned doesn't mean that an account gets banned; users don't necessarily get banned after a subreddit does, and so they can stay in communication with each other. And one of the moderators of the incels subreddit started the incels forum that's probably the biggest one on the internet, and a lot of them migrated there. And they've been a lot more intentional since then about backing up their content, even creating a wiki. They tried to start Discord channels here and there, but they got banned from there, so there is technically no official Discord channel, even though they keep trying to make one. But it's really interesting how much more intentional they've been in archiving and preserving their content, because they know that at any time it can all go away.
And so I guess that's what I mean when I say they innovate in ways that they wouldn't have to if they weren't under any kind of threat, because they also watch what happened to other groups like them. They definitely keep their finger on the pulse of what's happening. So in the case of "The Red Pill" on Reddit, when incels got banned, they freaked out a little bit, because they thought they might be next. And they've been quarantined, I think, for the past two years, but have still not been banned. And so there is something political about these choices, too. So it's interesting to see how things are evolving online. You know, David Duke just got banned from Twitter recently. AD: Yeah, yeah. JD: So it's been interesting how platforms have been responding. But the thing is, they tend to respond to what they feel is the largest public pressure. Because, for instance, on Reddit, MGTOW (Men Going Their Own Way) is still active, men's rights is still active, "The Red Pill" is still active. The Red Pill might be quarantined, but people can still access it, and even worse, the fact that it's quarantined made it more popular, and more people joined the subreddit. AD: I think you really clearly captured the problems with de-platforming: that in some cases, it justifies in these groups' minds the paranoia they feel, that there's this great oppressive force on these platforms. And like you said, it forces them to innovate and become savvier on the internet at evading these kinds of controls. So if, as you pointed out, there are all these different ways that groups can get around it, even on sites like Twitter or Reddit that have said they have de-platformed some of these extreme groups, then who is the de-platforming for, if it's not actually getting rid of these groups?
JD: I think that's a really politically charged question, in a way. So, de-platforming is effective, but I think it's limited. A lot of people ask, "Oh, so you're saying that de-platforming doesn't work at all?" And that's not necessarily it. It does accomplish a few things, right? It takes these groups off of the major mainstream platforms, and for propaganda purposes it's really crucial to have as wide an audience as possible. But it's so easy to evade moderation by just using certain acronyms, or not putting the words in the title, and stuff like that. And even if people take down the content, it always reemerges, because people are saving it locally to their computers. In the case of YouTube, Elliot Rodger's YouTube channel was finally removed, years after his attack in 2014, after what happened in Toronto in 2018. But people had saved his videos; you can still find full compilations of every single one of Elliot Rodger's videos on YouTube today. Is it hard to find? A little bit. But is it still there? Absolutely. And so I think it's complicated. So who is the de-platforming for? I think, on some level, we do have to realize that these social media companies are massive corporations that have stakeholders, and who risk losing advertisers if they don't act and remove some of these more nefarious and awful communities or members. But that doesn't mean that all of them get removed, right? They remove the people who are the most visible and who are the biggest celebrities. But they don't necessarily remove the everyday people who believe in this stuff, right. And, of course, those people can still get banned from platforms.
But as some people pointed out with Twitter's recent announcement that it would remove QAnon content, they're evading that like you wouldn't believe. There's still a ton of QAnon content on Twitter. And so banning hashtags, for instance, doesn't work; we saw that in the case of Instagram. A few years ago they tried to ban pro-eating disorder and pro-anorexia communities. They just completely evaded that. So sure, when you search #anorexia on Instagram, it won't show you anything. But they just decided to start using different hashtags. And every single time the platform catches on to a new hashtag, they create another one. It's constant, right. And so I think that's the same thing that's happening with QAnon, the same thing that's happening with far-right groups, the same thing that's happening with the manosphere. They're just trying, I think, to stay one step ahead of moderators and the platforms. And in some ways, they're doing a better job, because they're using these platforms exactly how they were designed to be used. AD: And I think that last point is absolutely critical. One thing that you and I have talked about, but that I think is lost in wider discussions about where incel culture flourishes, is that there's kind of a boogeyman of 4chan and 8chan and the dark web, as these spaces that are impossible to access, when a lot of far-right people originally radicalized through YouTube, and how the algorithm works just pushes you further and further into suggested videos with this misogynistic, racist, far-right content. As you said, the very construct of the platforms is designed to pull you deeper and deeper into these far-right silos. JD: Oh, absolutely. And there's been research that shows that the more outrageous a channel is, the more followers and clicks and engagement it gets.
And so the algorithm will push their content, because it's getting the most engagement. It's not necessarily looking at the minute details of what's being said; it's looking at how many people are clicking on this, how many people are commenting on this. And if more and more people are engaging with it, then it deems it popular, right? Algorithms on these platforms don't have quality filters, as much as people want to believe they do. They just push whatever is popular. That's what they reward; that's their metric. And unfortunately, what we're seeing is that this type of content is popular. AD: And the idea of popularity, I want to stick on that point for a second, because in some ways the focus on incels obscures the wider problem of what's driving racism and misogyny in, we'll just say, American society more broadly. Can you talk to us a little bit about how incels see themselves as separate from mainstream culture, when in many ways they thrive off an existing ecosystem of misogyny in American mainstream culture? JD: Yeah, absolutely. I think incels want to believe that they're separate from mainstream culture, because their perception of mainstream culture is very different, right? To them, mainstream culture is feminism and anti-racism initiatives and pushing for more people of color and women of color in media and stuff like that. They see this as mainstream culture. And I think that really highlights the different interpretations of reality that are happening here, too. Because if you asked anybody who's a feminist or race scholar, or even a far-right scholar, they could tell you that those things are absolutely not mainstream in our culture.
And so I think that's where the huge discrepancy lies: between the incel worldview, as well as the MGTOW and men's rights worldview, and what a lot of people feel reality actually is. So for them, they see advocating for more women characters in video games, or advocating for more people of color and better representations of people of color in media, as a direct affront to what they believe the world should be like. And I think they are also angry because there's this kind of mythos in their heads of the privileges that masculinity is supposed to bring them, and they feel they've been denied those privileges, right. And so their entitlement stems from that, in a lot of ways. And it's interesting, too, because, as I said earlier in the podcast, and we've talked about this before, they get these ideas from somewhere. They get these ideas from the very culture that they live in, the media that they consume, and the people that they grow up around. Men and women are socialized into patriarchy, right? Even the most well-meaning parents who try to avoid raising their young sons with a patriarchal mindset find that their sons end up learning patriarchy from their peers. It's so deeply ingrained that we may not even realize what's happening before it's too late, right? That's something that bell hooks wrote about in depth in The Will to Change: Men, Masculinity, and Love. But it's so complicated for incels, because they truly believe that they're persecuted because they're not conventionally attractive, or they just don't have what they believe are the qualities to successfully have a sexual or romantic relationship, or they have very ridiculous standards for the type of women they feel they're entitled to.
So they want young women, for one; they want young women who are virgins, who have all these physical qualities and characteristics that are kind of impossible unless you're a cartoon, pretty much. But it's really interesting, because this animosity and anger towards women, and this desire for a very pure, virginal woman, is also an extension of hetero-patriarchal culture, right? And so that extends to the feeling that they need to punish society, not just women, because they see society as the problem: feminism gave women too many rights. That's a really big thing in the men's rights movement, right. And they're defying the natural order of things, and if things just went back to the way they were, quote unquote, then both men and women would be much happier. And so they constantly talk about degeneracy and how society is no longer pure, and that includes things like partying and premarital sex and drug and alcohol use and everything else. And so the incel worldview is warped, pretty much, because they see themselves as not just morally and intellectually superior to women, but to everybody: normies, etc., right. And so I think understanding that is really crucial to unpacking their ideology a lot more. AD: Well, I think that explains a lot of the problems that you articulated about de-platforming and reducing the influence of these groups, both online and in our American patriarchal society: these ideas, as you said, are pulling from somewhere. So that creates difficulty when you see politicians throwing out words like de-platforming and de-radicalization as if they are solutions, when in actuality they're trying to solve a problem that has already happened, right?
The person has already been radicalized; the content is already on the internet. They're not really preventative in any way. JD: I think so. And it's so complicated, right? Because people say, well, we should leave these groups open and accessible, because then how are we going to keep tabs on them? And I'm not someone who advocates against de-platforming. I think de-platforming is effective, but like I said, I think it's limited. And I think we also need to really confront why these beliefs and ideologies exist in the first place, because online life is just a mirror of offline life. These aren't separate spheres of existence; they're the same. Our online world just reflects our offline world. And, in fact, the online world can afford some things that aren't possible in our offline world, right? People feel safer, perhaps, expressing these kinds of ideas and beliefs there, and incels and men's rights users constantly say stuff like, "Oh, I can't talk to anybody in my face-to-face life about this," or "nobody understands me." And so they find community with like-minded others online. All of these affordances of connection and community and everything else that extend to marginalized communities in the offline world also extend to these far-right and extremist groups. And I think we need to be aware of that. So de-platforming, I would say, is probably a good first step. But de-platforming isn't going to get rid of these beliefs and ideologies in our societies and cultures. It merely removes them from our online spaces. AD: Julia, I think that is a great note to end on. So I just wanted to ask, where can our listeners hear more from you? Do you have anything coming out? Are you on social media?
JD: Yeah, I'm on social media. I also have a website; if people google me, they can find me. My Twitter handle is @julesopolis, that's J-U-L-E-S-O-P-O-L-I-S, I think. And I'm currently in the process of revising my dissertation into a book, which I'm hoping to submit in the next two months, so you probably won't see it until next year. I recently published something in M/C Journal about the political aesthetics of trolling, and I've written a few things for more popular media outlets about conspiracy theories and related topics. I've had some pieces published recently in Open Democracy, Rantt Media, and the American Ethnologist blog. AD: Awesome. Julia, thank you so much for being here with us today. JD: Yeah, thank you so much. This was great. AD: This has been another episode of Right Rising. Thank you, and we'll see you next time.