Augusta Dell'Omo: Welcome to Right Rising, a podcast from the Center for Analysis of the Radical Right. I'm your host, Augusta Dell'Omo. Today I'm rejoined by some former guests: Ashton Kingdon, who is a PhD candidate in Web Science and now an instructor in criminology at the University of Southampton, and Dr. Ashley Mattheis. They're here with us today to talk about the ethics of researching the far right. Ashton and Ashley, thank you both for being here.
Ashton Kingdon: Thanks so much for having me. Really excited to be back on.
Ashley Mattheis: Thank you, Augusta. I'm really happy to be here.
AD: So I wanted to start off with a big question, and it's something that we haven't really talked about explicitly on Right Rising before: when we're researching the far right, why do research ethics matter? And as you both are answering this question, I'd love it if you each could talk about some of the ethical pitfalls that researchers of the far right can fall into.
AM: So I'll take that one, Augusta. There are multiple pieces of this, and Ashton will talk about some of it as well. But traditional research ethics in academia are a little bit confounded by several aspects of researching the extreme right, particularly for those of us who research online and interact in online spaces about our research. So initially, some of the issues that have been discussed by researchers, broadly speaking, are things like: how do you apply traditional research ethics that focus on subject safety, subject security, and protecting subjects in an environment where the subjects you're researching are potentially dangerous, criminal, or violent? It just works a little bit differently. So there's a strand of research, or discussion, that is looking at those types of things. There's also a strand, and something that Ash and I both focus on, that looks more at researcher safety online. That might be in how you do your own research: what are your ethics in how you practice your research? It would also include things like looking at distressing or disturbing materials, like how is your mental health and well-being, which I think Ashton might discuss a little bit, and how we share our research as well. Many of us who work on extremism and the radical right do public-facing work. We work with stakeholders, we share our research online, because we want those stakeholders and the public to be able to see it and operationalize it. And that often can put researchers in difficult positions, especially marginalized researchers: women researchers, researchers of color, LGBTQ researchers are particular targets in online spaces, by these groups and by other groups online generally speaking. So it's important to think about the ethics of research, like how it's enacted, but also researcher safety, as well as subject safety, and how you'll navigate those things, in particular because there are different rules in every country and many locales about how that should work. So having discussions about it with other researchers, and finding out how people are doing that work, and whether they're doing it at an institutional level or an individual level, can help us come to agreement about best practices and can make things easier for different researchers as they move forward with their projects.
AK: Yeah, just to add to that, I agree completely with everything that Ashley's said.
For me, the reason research ethics is so important comes down to the components that I look at: mental health, and keeping researchers safe, particularly when they're looking at online communities. So I spent eight months inside Islamic State networks before I moved to looking at white supremacy online. Cybersecurity was something that was really important in terms of me being able to protect myself and also the institution where I was conducting the research. And I feel that at the institutional level, not enough focus is placed on keeping students safe when researching these communities. So that was really important for me. And then there are the mental health components that come along with it. I specifically look only at imagery, so I'm looking at videos, looking at horrendous imagery, all day, every day, for years. And it's not that there's a lack of institutional support from the beginning, because obviously you go through IRB or ethical review and everything's contained within that. It's about picking apart those layers and seeing, right, what happens if someone doesn't get along with their supervisor and doesn't feel comfortable telling their supervisor that they're suffering mentally? It's about the different stages as well, and making sure that people are aware of that support continuously throughout the process, not something that you're just signposted to at the beginning. So those are the things that I'm really passionate about, particularly now that I have so many students. I do this lecture every year in criminology called Undercover in ISIS, where I talk about my research, and you have students lined up at the end saying, I want to do this, I want to do that, I want to look at the alt-right, I want to look at ISIS, Al Qaeda. And you think there need to be clear structures in place so that people can do this research and are not left to figure it out as they go along, like I have. And I think that's an important part of why Ashley and I have brought our experiences together, working in different communities, with the sort of shortcomings that we have encountered, so we can make it better for the future researchers who want to look at this. And this is important. One of the things we talked about, that Ashley mentioned in the beginning, is this kind of need for you to promote yourselves and your careers. And there's no advice about how to navigate through that, right? What do you do if you get trolled? What do you do if you are abused by these people? What do you do if you get doxxed? For all of these sorts of questions, we think there needs to be some sort of institutional response that researchers are aware of from the outset, when they start the research, not three years down the line when they're in despair and don't know who to turn to. And I think that's what we're really trying to push for.
AM: Yeah, and I would just add to that, the structures of IRB that Ashton is talking about are in the UK, and they work in a specific way. My structures of IRB, based on my research in the US: I was trained at an R1, where much of the IRB focus was on medical research and human subjects research. But I do textual and visual rhetorical analysis of propaganda in my extremist work, and so my research didn't count as research for IRB.
So some of the support mechanisms that may have been available in Ashton's case were not necessarily available in mine at the institutional level. So another aspect of this that we wanted to encapsulate was a discussion of: what does institutional support look like now, and how should it look? Because even though there are supports, as Ashton describes, they may not be tailored to what researchers need; they may be tailored to what an institution needs in terms of its own compliance, because traditional research doesn't have some of the components that online research and research on bad actors has. And then in other localities around the world, there is no structural IRB at all; it works very differently everywhere. So it's really important that we actively think about this and actively try to engage scholars to find out what is needed, what we think the best ethical ways to move forward with this are, and then how we can interact with institutions at a more structural layer to ensure that people have access to these things from very early on. Because it may tie to things like people leaving the research, or not ever doing the research if they don't have the supports, or being harmed in some way. It may also translate to research that is less ethically conducted, which I know institutions have a concern with, just because there aren't necessarily guidelines for all of this work. So it's really important, and there are a bunch of groups and different people starting to talk about aspects of this, which is great.
AD: I really appreciate you both talking about not just the institutional component, but how the institutional component really impacts us personally as researchers. One of the things that I experienced was being one of the only people in my department focusing on the far right and far-right violence. When I initially started this intellectual journey, it was not as contemporarily relevant as it has become, and now there's a lot of demand for our research, but not a lot of support for the researchers who are actually doing it. It creates this really strange tension that we're forced to navigate: how do we advocate for ourselves when the institution wants to promote us in a certain way? And I wanted you both to talk a little bit about your new initiative that you're starting at CARR, this new research unit that's focusing on these ethical questions of studying the far right. You've both touched on it a little bit, but I'd love for you to go a little more into why you've launched the unit, what kinds of questions you think the unit is going to be really focusing on, and the issues that you'll be responding to.
AK: So our work came together by chance, actually. So we both - oh, you're in it as well - we all took part in the far-right ethics workshop, which took place, was it last summer now?
AD: I think so. Time has lost all meaning.
AK: Yeah, I have no idea what's going on with the years. Yeah, it's funny.
AM: It was the first one that they did.
AK: Yeah, so we all took part in that far-right ethics workshop. And I had already contributed to an edited collection on cybercrime research. The editor had seen that I'd just done a presentation on Facebook, on Twitter, sorry, and said, oh, would you mind contributing a chapter? And I said, actually, I really liked the work that Ashley did in the workshop as well.
And I thought our work would fit well together in terms of merging both of our ideas. So that's how we initially started. We wrote a chapter, and then held an event through CARR with Antonia Vaughan, who was one of the organizers of the workshop, and also Elizabeth Pearson, who is also working on the ethics of research. Does she mainly do online work, Ashley?
AM: Some, but she also does in-person interviews as well.
AK: Yeah, so the ethics of researching violent extremism and things like that.
AM: Don't forget Alex DiBranco from the IRMS was on as well.
AK: Oh yes, Alex DiBranco from the IRMS was on. And that was the kind of springboard for us. It was really successful, and a lot of people reached out and said they thought it was a really important area and that they were really pleased people are going to be looking at it. So that's how our initial collaboration started. What I'm really pressing for with the unit, and what was really important to me, is that, as Ashley was saying, every country is super different in the way that they handle ethical review, and some countries don't have it at all. In the UK, you are required to have ethical clearance to do my sort of research, where you're online in nefarious spaces. In Ashley's country, the US, you're not necessarily required to do that. So it was about having a pipeline of support for people who are trying to do this research and might not have the ethical support or advice when they're starting out on projects. And even when you do have it, it doesn't necessarily translate in real life the way it does on paper when you apply. So for example, yes, there's access to counseling services through your institution, but the wait time for those, particularly at the moment, could be six months, it could be longer. So the ability to actually access them, and whether or not they can help you, is another thing. That's why our recommendations were about having money available through the Center for Analysis of the Radical Right that researchers can access if they need it in an emergency situation, for example. So having that available is really important, having the resources available to help people, and then also having older, more established professionals and scholars, who have been in the business a lot longer, able to provide advice to the new doctoral fellows coming up. So as for having this pipeline, I'm sure Ashley can explain that with more clarity than I have. But yeah, that was what was really important to me: making sure that students are not only safe in terms of their cybersecurity, but also protected more broadly. Because, and I think we've all discussed this a lot, there's the mental health strain that comes from being an academic anyway, and then you combine that with the sort of stressful research that we all look at, and you're setting yourself up for disaster. I think that if you don't have support mechanisms in place, a lot of people can suffer in silence. Ultimately, that's what we're trying to get away from, by building these really supportive communities that people can feel comfortable reaching out to if they don't have that support from their institutions.
AM: So thanks for that, Ashton. I agree with most of that. My particular focus was on this question: in the lead-up to how we came together, the question of institutional responsibility was the framework from which I was working.
And in particular, because I don't really have that, or didn't really have that, beyond certain individual faculty members who knew my work and worked in a similar area. Not unlike you, Augusta, I was one of the only people in my department for a number of years. Early on, I was basically, you know, treated as if I might be wearing a tinfoil hat a little bit about this. And then it became really, really relevant at a very specific point in time. But I just didn't have the same collegial support; I did have supportive people, but not the kind of structures that more disciplinarily bounded work tends to have. And then at some point, I went to notify the legal office at my university that I was doing more public work and it might be a problem. I was at a public university where people can, you know, find out by law where your classes are and different types of things. And I was at an institution in the US South, and there was just the potential for, you know, being doxxed, or having someone show up on campus. That danger was really something my institution had never thought about at all, and it had no policy or anything in place. So for me, one of the goals in doing this unit at CARR was to start to try and build out what the institutional practices should be. CARR is a different kind of institution than a traditional university, but it is still an institution. How can we think about supporting scholars? And how can we come together in that way, as well as, as we're building and producing this knowledge, seeing what works and what doesn't work and hearing from our scholars, how can we build this pipeline where more senior scholars, early career researchers, and doctoral fellows are working together in such a way that we build this knowledge base into the future? Because CARR has a unique position wherein it has scholars from all over the world, coming from so many different universities, so we can use it as a train-the-trainer kind of environment, where we get people really encompassed in this discussion and in frameworks that are the leading-edge research coming from other great places like VOX-Pol, the far-right ethics workshop and their community, and Data & Society, where we bring all this together and start to actually apply it in an institutional frame in ways that can be supportive, and give people tools to take back to their own specific university or specific environments, to then advocate for those if they need them there. So how can we build that network? Those are parts of my interest in doing this, and certainly CARR's strong support for it, and its willingness, though we don't have huge pots of money, to put money behind it. And things like helping with crisis mental health, and helping with researcher safety through VPNs and tools like that, which you can't necessarily get your own program to pay for but that you might need. You might not want to use the university VPN to go research a bunch of terrorists, so having your own would be helpful, but you may not be able to afford that regularly. So we wanted to make sure that people had access to those basics.
AD: I really appreciate the point that both of you brought up about the institutionalization of some of these things, right? Just the very emphasis in the US on the subjects that are being researched versus the researchers themselves.
And in this case, with studying the far right, the power dynamic, particularly if you're online or you do interviews, can often be extremely difficult for researchers to navigate. So I'm very excited about having this ethics unit be a space where we can bring a lot of the work that's been occurring in different pockets, with scholars who've been studying the far right all over the world, into kind of a centralized way that we can start to approach this. And I wanted to ask you both, maybe for our listeners who are not as familiar with some of these issues: what kinds of attacks do researchers of the far right experience? And I'm using researcher here broadly defined; journalists experience these kinds of attacks, and political figures who speak out about the far right often experience these kinds of attacks as well. So what kinds of attacks are we talking about, and what systems are currently in place that are designed to protect people but are maybe insufficient? And obviously, Ash and Ashley are working in two different contexts, the US and the UK, so I'm expecting that there'll be some differences between them.
AK: I think Ashley should probably kick that one off, because she's got a personal story.
AM: Um, I mean, I might as well, and let me do the standard thing: I'm going to mitigate and say that comparatively, relative to what some researchers experience, mine was fairly mild. But really, mild or not, it's not okay at all. There are quite a few online tactics. One is what people refer to as pile-ons, which is basically a form of networked harassment, where a scholar, or their work, will be identified as problematic in some way to some group. They'll share that through their networks, and then people in that network will attack them on different social media platforms. It's very hard to deal with when that happens; it's incessant for at least a few days at a time. In my case, for like two years. It was three or four days in the beginning, and then, every six months for two years, someone would find the thing that triggered them and it would go off again, not as bad as the first time, but it would just never die. I called it the zombie attack; it just would never die. And that's one form. Another is threats. People receive public threats, DM threats, mailed threats, like to your office. I've gotten email threats based on quotes I've given to articles that were published online. And then people may also be doxxed, so their information, or the information of people they're close to, friends, family members, is published online, and people are sent to essentially attack or harass them for their position. And I would say there's not a lot in place to support any kind of response to that at all. And there hasn't been a big change. I mean, this was being discussed pretty openly by journalists back in 2014, when Amanda Hess put out an article called "The Next Civil Rights Issue" about women being pushed offline through this kind of really violent trolling. But that's continued on. And it's not just women; it's also scholars of color, and LGBTQ scholars, and people researching these groups. It's a way of silencing researchers and preventing them from speaking out. And it does impact you; it does make you think about everything you post and whether it's worth posting, because you don't always have support.
So in some cases, you have a couple of options: you can either respond to it, or not respond to it. A lot of times people will tell you to get offline, which is not great, because if you have to be online for your job, you can't really get offline. And then it depends on whether the wider community will come and protect you. So in some cases, being vocal when someone is harassing you helps, because other people will come and essentially stand up for you online. But they don't always do that, and there's no institutional framework for such a thing. And I know in the US only a handful of universities actually even have policies about how to deal with this, or institutional recognition of it at all. In some cases, well, there was a very interesting article published in the Chronicle last year about a woman who was experiencing a far-right attack based on her research. She went to her leadership, and the leadership very publicly and punitively attacked her for being attacked. So it's not always the case that you're supported through it either. It depends on the politics and all the particulars. So one of the questions is, how do we address this? And that's a place where we came back to the institutional layer and said, you know, at IRB review, at responsible conduct of research trainings, at things like that, the issue of researcher safety online needs to be included, not in a way that would preclude researchers from doing the work, but in a way that requires institutions to have guidelines and a plan and resources for people, places for them to reach out in emergency situations. If you're going through that and you need to talk to a counselor, you can't wait six months, right? So is there someone you can talk to who does emergency counseling for people? Because it's not even just researchers of the far right or researchers of extremism who experience this; anyone doing research and sharing it online may have this happen to them. So yeah, that's pretty much how I've seen that working. I don't know, Ashton, what are some other things you've experienced? Because I know your experiences are slightly different from mine.
AK: Yeah, I think the main things are the trolling, from not just the far right but other academics as well, which I think is one of the things you're very underprepared for, particularly as a young scholar. There have been situations where people I've really looked up to and admired have said horrible things about my work, or about me, online. And I think that when you do this research, there are so many ethical considerations that we think about that are more sensitive when you're researching these types of areas. So for example, how soon after an event should you publish something? If there's an attack and you have data that you want to get out: how soon is it acceptable for me to publish this? Should I even publish it at all? I work on propaganda - should I even put those images out there again, for anyone else to see, even though I know that I need to, because that's the whole point of my work, decoding these images? And, like Ashley was saying, when you do get a pile-on, it's not just you; it's your friends, it's your family who have to sit there and watch you be upset, sometimes for days, because some people are being horrible to you.
And when there's a pile-on, right, there's all of this information that is not given to you from the very beginning. And this was the kind of argument Ashley and I were making: no, you need to be warned about this; this needs to be something PhD researchers are told. If you're going to put yourself online, not only do you need to be safe, not only do you need to know that this is something that could happen, but also, what are we going to do about it if it does happen? And that is something that is missing in a lot of universities. I know that we've had events before, and I know that Liz Pearson always brings this up, about the smaller universities that don't have the money or the resources to be able to offer the kind of support that the bigger, larger universities can. There needs to be something that's kind of universal, and that's something that's important to us. But I think mainly it's just warning your students about this. And what happens to those students who don't think they can turn to their supervisors or to other people in their school? What responsibility do your advisors have to protect you in these situations? And I'm sure it's similar for a lot of people listening: you've kind of learned about this as you've gone along on the journey and dealt with it when it's happened. Luckily, I've got a lot of friends in the field who have helped me through those situations, and there are a lot of people I've reached out to for advice and help. I have the most incredible supervisory team. And I think that if you don't have that option, it can be a very distressing and lonely environment to be in, and learning how to navigate that is something that needs to have more attention brought to it before you start the work, rather than as an afterthought when you go through the situation.
AM: I mean, to add on to that, I agree with that completely. I also think you can have stellar advising, you can even have very good collegiality, but many of these campaigns are also intended to shame people. So when I say silencing, it's silencing through shame. And so people may not even reach out to very good advisors or good resources on personal things, for multiple reasons: one, they feel ashamed to; two, they might be afraid that their research will get shut down. If people want to take care of their students in particular, if they want to take care of you and make sure you're out of harm's way, the fastest way to do that might be to say, stop working on that. So there are just some counterintuitive things, because there's no policy, because there are no guidelines, because there's not even really a robust discussion of this. People end up dealing with it however they best can in the moment. And the way responses are applied is not equivalent, nor the same; it's very individualistic, in a not good sense. So you don't know what will happen, so why risk it, in some cases? I think all of that combines to make this a bit of a hidden problem. And the tendency that I often see online, what people will often say, and they mean it to be helpful, is: well, if they're responding to you like that, it means you're doing good work, right? It means you're getting to them.
So you're doing the right thing. Or, you know, it's just part of doing business in this space; you just have to essentially toughen up and handle it. And I get those responses, but they're also not particularly useful. Those kinds of discourses do float around, and they might keep people from talking about it, because they think they're just somehow not personally tough enough to take it. So it's very heartening to have this discussion across multiple vectors, with people's different perspectives and different experiences, to be able to see how to move forward and what people might need.
AD: I really appreciate this discussion that we're having about individual response versus institutional responsibility. One of the big struggles that I have faced, and I think many researchers can relate to this, is the sense that it really depends on our personal communities. I come from a similar situation to Ashton, where I think everyone around me has been very supportive, but that is very dependent on who is around you and which individuals are supporting you. It's very ad hoc; it's you trying to figure it out. And the lack of institutionalized or bureaucratized systems to protect researchers and to give them a sort of formal process for what to do if they experience an attack, or how to actually get that support, the absence of that, I think, not only limits our ability to respond, but also puts people in that headspace that you mentioned, Ashley, thinking, oh, why can't I just handle this? Everybody's telling me it means that I'm doing something right. It creates a sense of blame, and I think it creates more personal responsibility for something that should be seen as a larger systemic or institutionalized responsibility: how do we care for the people who are doing research in these often very violent spaces? So with the time that we have left, I'd like to talk a little bit about your research ethics unit: what areas you'll be working on, what specific projects you have coming, and what we can look forward to seeing coming out of this unit over the next year.
AK: Yes, okay. Thank you, Augusta. I'm glad that people are interested in the ethics unit. It is something that we're very passionate about. So obviously, we had our launch event with the wonderful Andrew, Eke, and John, which was on their FRETS framework. That was about including ethical review concerns within our unit, particularly if people are going to come to us for advice on their own ethical reviews. Now, unfortunately, there's been a bit of a pause on events because I am coming to the end of my dissertation, so I am on a bit of a hiatus at the moment. But we have got some exciting events planned. One of them, because I was on Right Rising talking about it before, will be on artificial intelligence: ethical and explainable AI, algorithmic bias, and all of that. So I'm really excited to organize that. We've also got one planned on former far-right extremists, and also one on decolonization that I think Ashley might talk about a bit more. So yeah, we've got some exciting events planned, and some workshops in the pipeline. Basically, we want to frame the research unit around what our researchers and fellows who join would also like help with, maybe a particular area of training. So that's going to be really important for us, and there may be some collaborations with other units. Yeah, I've got lots of things planned.
Very excited. And Ashley, I don't know if you want to add anything.
AM: Yeah, I definitely think the research unit has a dual focus. One is for people who are interested in researching this and putting ethical research considerations into practice. In the event that there are CARR fellows at any level who would like ethical review of their research, maybe they can't get it locally, or maybe they would just like a cross-check, because I know in some cases you might not have specialists in your area doing your IRB review; if there's a need for that, we've said we would be happy to convene and facilitate ethical reviews, hopefully as a way to help train folks. So if people want to participate in that, or think about that, please join the unit. That's one of the purview areas that we have. The other is these larger discussions and research into what research ethics are in this area. So definitely looking forward to the events. The decolonizing event is a little bit about decolonizing research ethics itself, with some great speakers that we're really, really looking forward to hearing, on what it is to do this work, particularly from a scholar of color's perspective and a non-Western scholar's perspective, and how that interacts. We think that's really important, and I think that's going to be one of our next upcoming events, so I'm looking forward to that one. But mostly, you know, we want to be flexible about what the fellows of CARR need from this discussion, as Ashton said, so we're very happy to adapt and create programming around what the fellow base would like to have.
AD: Fantastic. Well, Ashley and Ashton, thank you both so much for coming on the podcast.
AM: Thank you. It was really fun, Augusta.
AK: Thanks so much for having us. Really excited. And yeah, it's always fun being on the podcast with you, Augusta.
AD: It's always great having both of you on, and I think I've really benefited from having this discussion. I'm sure our listeners are going to really enjoy hearing this today and thinking about these issues more critically. So thank you both for joining us. This has been another episode of Right Rising. We will see you all next time.