Katherine Druckman (0s): Hi everyone. Thanks for joining us again today for Reality 2.0. I am Katherine Druckman, and Doc Searls is here today. I'm quite excited to welcome a particularly awesome guest: Evan Greer is joining us. Evan is the deputy director at Fight for the Future; please go check them out. She's also one of my favorite people to follow on Twitter right now, at @evan_greer, because she's always quick to offer a perspective that I personally believe needs to be heard. Today we're going to talk about digital rights and a few other things, and hopefully really get into the weeds about current events and tech.

Katherine Druckman (46s): But before we get started, I just wanted to add another quick plug for our newsletter. You can sign up on our website at reality2cast.com (that's the number two in the URL) and subscribe there. We send out short but hopefully useful emails with a handful of links and a bit about what we're interested in at the moment. So with that, let's get into it. What a time to be alive.

Doc Searls (1m 11s): And doing something? Yeah.

Katherine Druckman (1m 14s): It's been a pretty wild couple of weeks, right?

Evan Greer (1m 19s): I honestly feel for folks who don't work in activism, because I can at least tell myself that doomscrolling is part of my job right now. If it weren't, I would just be doing a really bad job at my job.

Doc Searls: So what is the opposite of doomscrolling, I wonder? If you're doomscrolling optimistically, is it doomscrolling anymore? Is it looking for hope, or finding hope? I think that's where Katherine is coming from, and it's what we hope to get from you, since you're there on the front lines. Given my antiquity, I was very involved in the sixties in civil rights and other things like that. We may have been better off without social media and the rest of it, but I feel some of the same energy going on right now as we had fifty years ago.

Evan Greer (2m 10s): Absolutely. And I think it's crucial to grapple with the double-edged sword that networked technology and the internet play in that, right? We've seen both extremes just over the last few months. Over the summer, in the midst of the pandemic, we saw these unprecedented-in-scale uprisings for racial justice, calling out systemic violence and racist violence from police that has been a problem in this country for decades. That was largely spurred on by social media, by websites and platforms hosting videos of that abuse and documenting that corruption, which helped people organize.

Evan Greer (2m 53s): And then of course we saw the other extreme just last week, with an explicitly white supremacist attack on our democracy and on the Capitol that was also organized on social media, with the benefit of this same technology. So I think it's pointing us to this: digital rights no longer just means "save the internet" and wave the flag. It means having the conversations we need to have, and making the decisions we need to make as a society, to make sure that technology is largely a force for empowerment and upliftment rather than tyranny and greed.
Evan Greer (3m 37s): And I think we're at a watershed moment, where the policies we fight for and the decisions we make will determine that future.

Doc Searls (3m 44s): So I have a question. When you say "we," I automatically think, well, there's our cohort, which is generally blue, academic, coastal, and involved in some way. But can that be a larger crowd? Can that be a larger group? I'm especially thinking of what's happening right now with the Republican party, where it's getting a little bit split between, for example, conservatives who may actually believe in climate change, or who actually want to drum the white supremacy out of the party and go back to its roots with Lincoln and all that.

Evan Greer (4m 27s): Yeah. I think with that cohort, as you described it, Doc, we're facing a moment where we're recognizing the failure of not having it be larger than it has been, in terms of those who have their hands on the levers of power, both within companies building technology and in shaping the policies that govern it. And I think one of the most important things for us to do in this moment, as we determine what policies are needed and what technology is needed, is to listen to the communities who have been most affected and most harmed by uncareful policies around technology, and by the uncareful or thoughtless deployment and use of technology.

Evan Greer (5m 21s): A clear example being the last major change we made to Section 230, which I know we're going to get into in a little bit. One thing that every lawmaker should be doing right now, before considering further changes, is listening to sex workers and LGBTQ online creators, who were disproportionately harmed by that misguided but well-intentioned effort. And I think we're in a moment where we're going to see a lot of misguided, well-intentioned efforts. It's not that we shouldn't do anything; it's that we should get it right.

Evan Greer (6m 3s): And I think that's an important distinction to make. This isn't just laissez-faire, hands off everything. It's: let's be thoughtful in what we do, to make sure that we're not going to do more harm than good.

Katherine Druckman (6m 22s): That's an interesting thing you say, laissez-faire, hands off everything. We might have a significant segment of our audience with a little bit of that techno-libertarian sort of perspective. It's not so much a progressive approach or a proactive approach; it's more of a defensive posture: just please stay out of our business and we'll all be okay. And I wondered what you think about that. Is that stance going to help or not? Do you feel like we're in a position where you really need to be a little bit more proactive in your activism?

Evan Greer (7m 6s): Sure. Yeah. I want to draw a distinction here, because I think this is something that's going to come up again and again in the coming years, right?
Like, I totally share that kind of libertarian bent and skepticism of centralized institutions of power, whether they be governments or monopolies, or even in some cases nonprofit institutions or religious institutions, or many others. Centralized institutions concentrate power in ways that can always be weaponized or used to do harm. And that's why I think where we have broad agreement is in the importance of creating decentralized technology, and decentralized methods of communication and decision-making as a society, that take the power out of the hands of the few and put it back where it should be: in the hands of the many.

Evan Greer (8m 5s): But in terms of what policy is needed: it's funny, because the word "regulation," for myself included, is something I tend to get skeptical of, since a lot of government regulation of technology tends, as I just mentioned, to do a lot of harm. That said, there are some concrete things. A good example would be net neutrality, right? We did hear, mostly from telecom shills but also from some legitimately concerned folks on the right, "Oh, well, this is a burdensome government regulation." And I would argue that basic rules saying that internet service providers shouldn't abuse their gatekeeper power in ways that are unfair and harm people...

Evan Greer (8m 50s): ...I guess you can call that a government regulation, but it's really, in many ways, preventing what is essentially a monopoly from abusing that monopoly power and regulating your internet at the corporate level. And similarly, the way I think about policy approaches: something like a strong federal data privacy bill. Doc, you mentioned correctly the shortcomings of GDPR, but a bill that focuses primarily on data collection, rather than using a failed consent or opt-in model, could actually do tremendous good. Sure, that is a government policy, but it's also one that's clearly about consumer protection, protection of civil rights, and preventing harm.

Evan Greer (9m 44s): And I think that's very different from, for example, anything that gets the government involved in dictating platforms' content moderation policies, or anything that gets the government involved in dictating other types of practices that clearly don't fall into the category of something blatantly harmful and anti-competitive.

Doc Searls (10m 10s): Sure. Yeah. Since you mentioned that, it's interesting to me, and interesting whether Fight for the Future is working on this in, say, Europe, because Europe is kind of becoming the world's regulator outside of China. The internet seems to be splitting in some ways, or at least splitting a couple of layers up. I mean, every time we see a cookie notice, that's the GDPR, and we're here in the U.S., right?
And now here in California we're seeing California regulations with the CCPA, which basically said, at least in the first iteration before we had the latest proposition approved, that, jeez, all your data horses are out of the barn...

Doc Searls (10m 58s): ...so you can at least tell the companies what not to do with your data now that they've got it. And that strikes me as just horribly ill-informed and ill-conceived. And I'm wondering, since you're at the front lines of this, how do you work with Europe and California and the activists who are taking on things like this latest proposition, or who could have stopped yet another proposition from happening here? We have a low threshold for kind of populist lawmaking here in California. How do you manage that? What are you looking for, and how do you make it work? Because I don't know; I can only guess.

Evan Greer (11m 41s): Yeah. Again, I think we should always be thinking strategically as well as ideologically, right? We've had many successful social movements here in the U.S., but one persistent mistake that we sometimes make is leading with ideology without thinking about strategy as well. So it's a matter of always looking for where we can build the most consensus to get something done that benefits the most people, or that mitigates the most harm or does the most good, in the few-and-far-between cases where we can actually get a positive piece of policy passed, versus always fending off the SOPA/PIPAs, the net neutrality repeals, and the Patriot Act 2.0s of the world.

Evan Greer (12m 40s): That said, on privacy specifically, I think we should take a note from what I see as the incredibly inspiring uprisings that happened over the course of the summer, and this thought about abolition versus reform. How do we apply abolitionist thinking to, for example, surveillance capitalism? I think it takes us in very different directions. GDPR, and a lot of the regulatory frameworks that have been proposed around things like facial recognition, are very solidly in the direction of reform. They accept that corporations and governments are going to harvest our data in enormous ways and use it for all kinds of purposes.

Evan Greer (13m 32s): So, the thinking goes, let's set the rules of the road for how they should be able to use it once they collect it. That's the broad framework: how should they ask you first, what should they tell you about what they're doing, and should there be some limits or bright lines? Now think about applying an abolitionist framework to the way we deal with data and surveillance capitalism. Obviously we're not going to get complete abolition. We're not going to turn off all data collection in the world tomorrow, nor would we want to, right?
Because there are many uses of data that are positive, and many platforms and technologies that have a net positive impact on the world and on our lives.

Evan Greer (14m 19s): But I think it's about looking for the outliers: the types of data collection that just should not be happening in the first place. Facial recognition, biometric collection, I think is a clear example, where there's a growing consensus that using biometric surveillance, or even biometric collection for convenience, for entry and exit to buildings and things like that, comes with such tremendous potential for harm that any benefits are just not worth it. It's a technology where it's worthwhile to consider an actual prohibition, certainly at the government level, and I think an outright prohibition on many corporate uses as well. Then there's a whole host of other technologies that we need to think about and talk about.

Evan Greer (15m 8s): But I think it's about looking for places where we can just draw a red line and say: we need to stop doing this. There's no reason that kids who are home from school doing remote learning should be subjected to these invasive proctoring apps that are more or less indistinguishable from stalkerware. Let's ban those, right? So: look for where we can find consensus and prohibit specific harms, and beyond that, allow the technology to flourish. And as you said, Doc, look for the technological solutions. I mean, we're having this big conversation about platform power and platform moderation, and meanwhile Signal just got 40 million new downloads, right?

Evan Greer (15m 51s): Because people learned about what WhatsApp has been doing with their data. I mean, that's incredible, right? It's a good reminder that policy is not the only way to solve things, and that some problems do kind of run their course as the technology changes and evolves, certainly creating new problems in the process, right?

Doc Searls (16m 17s): So let's zoom in for a moment on facial recognition, because that's one in which I'm involved. We advise at least one company on this topic, with, I might add, a degree of futility, because there's market demand for them to do the ethical kind, you might say, but they're stepping across the boundaries, or risk stepping across the boundaries, into what may or may not be ethical, because it's a gray area. And right now, for example, in doing the detective work and the prosecutorial work around this insurrection at the Capitol...

Doc Searls (17m 4s): ...as I understand it, I've read that use of Clearview searches has gone through the roof. And Clearview is the company that has already assembled a monstrous database of faces, trained in part on ones that I put in the public domain without thinking through where that would go, because I have 77,000 photos. Most of them are not of people, but some of them are up on the web and identified, and Clearview is using that. And boy, how do you draw bright lines in here? Not just around what gets used and what doesn't, but what laws do you want to make against it?
Doc Searls (17m 45s): How can we turn this back into what wiretapping was back in the day? You need a court order, right? My own belief about facial recognition is that the only entities that should recognize a face are people and their pets. We have short-term memory and a kind of tacit knowing of other people's faces, rather than explicit knowing, for a reason. And that reason is a lot of what civilization is based on, right? We do forget. We don't know exactly. We recognize without being able to describe, and that's part of being human. How do we preserve that as we migrate from the analog physical world, where we all walk around as mortals with boundaries, to the digital world, where everything is knowable in an explicit way?

Evan Greer (18m 37s): Yeah. I think that's a really important thing to focus on, Doc, because it illuminates the fundamental harm of using artificial intelligence broadly for surveillance: it almost perfects a piece of humanity that shouldn't be perfected. And we're clearly not perfecting it, right? Current facial recognition algorithms exhibit systemic racial bias and gender bias. But those algorithms can and will improve, partly by training their data sets on a more diverse set of photos, obtained probably in similarly unscrupulous or unintentional ways.

Evan Greer (19m 20s): That said, I think it's a mistake if we only focus on the algorithmic bias issues, rather than also focusing on the actual application of the technology and the institutions it's being integrated with. So: we accept that policing in the United States has been racially discriminatory for certainly more than a century. And then we layer technology like facial recognition on top of that policing, even if it's used perfectly, even if they have the best set of guidelines, and the algorithms are tested and not biased.

Evan Greer (20m 2s): Essentially what that technology does is speed up and automate a policing process that we know is already discriminatory, which means we're speeding up, automating, and expanding the influence of that existing discrimination, unless we're taking steps to actually address it. So I think that's so important. And an argument I often make is historical, too. Throughout history, privacy, as in freedom, meaning spaces that are free from not just government surveillance but also social or corporate or religious surveillance, has been fundamental to almost every major evolution of social mores, of values, in human society.

Evan Greer (20m 49s): Right? If the U.S. government had had something like ubiquitous facial recognition software, something like Clearview, fifty years ago, when the LGBTQ rights movement was first forming and homosexuality and the distribution of homosexual materials were still criminalized in many states in the U.S., our movement might not have formed. And we might not have made so many gains in basic human rights for a large segment of the population over the last few decades. So we tend to think of privacy in terms of what we have to hide. I think of it as how we create space to test whether our laws and social mores are just, so that we can continue to evolve as a human society.
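To make Evan's point about systemic bias concrete, here is a minimal, hypothetical sketch (in Python) of the kind of disaggregated audit that facial recognition evaluations perform: the same matcher is scored separately per demographic group, because a single aggregate accuracy number can hide a skewed false match rate. All group names and records below are invented for illustration.

```python
from collections import defaultdict

# Hypothetical evaluation records: (group, predicted_match, is_true_match).
# In a real audit these would come from a labeled benchmark dataset.
results = [
    ("group_a", False, False), ("group_a", False, False), ("group_a", True, True),
    ("group_b", True,  False), ("group_b", False, False), ("group_b", True, True),
]

def false_match_rate(records):
    """Share of genuinely non-matching pairs wrongly declared a match."""
    non_matches = [r for r in records if not r[2]]
    if not non_matches:
        return 0.0
    return sum(1 for r in non_matches if r[1]) / len(non_matches)

# Group the records, then score each group separately.
by_group = defaultdict(list)
for record in results:
    by_group[record[0]].append(record)

for group in sorted(by_group):
    print(f"{group}: false match rate = {false_match_rate(by_group[group]):.2f}")
```

With these invented numbers, the overall error rate looks modest, but the per-group view shows group_b absorbing all of the false matches, which is exactly the system-level signal an aggregate score would bury.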
Evan Greer (21m 34s): And I think that's my big fear around technologies like facial recognition and other forms of automated, ubiquitous surveillance: they'll make those spaces smaller and smaller, so that we have less and less opportunity to evolve and better ourselves as a civilization.

Katherine Druckman (21m 56s): That's interesting. I think this is probably a good spot to bring up something that I think about a lot lately, and that's: how do you counter what I generally call anti-encryption propaganda? Terrorism, for example, has always been a label that's thrown around to strip people of privacy, and with the recent domestic terrorism, I worry that anti-encryption movements are going to gain momentum. Meanwhile, on the other side, you have all these QAnoners who for some reason are obsessed with pedophilia. I have no idea what that's about, but I can't tell you how many memes I've seen going around where they say: well, if you can take down Parler, why is there still child pornography?

Katherine Druckman (22m 38s): And you just kind of want to go: that's not how any of this works. I just wonder how current events are shaping things that I think are potentially a little bit scary, like the EARN IT bill, and I can't remember what the other one's called. LAED, is that right?

Evan Greer (22m 56s): Lawful Access to Encrypted Data. Yeah.

Katherine Druckman (22m 60s): Right, exactly. So I wonder if you could speak to that a little bit. How do you counter turning certain events into what is, frankly, I think, propaganda against encryption technology?

Evan Greer (23m 14s): Yeah. I spoke to the Washington Post about this just the other day, and the good thing about encryption, and this is partly why for me it's almost a comforting area to debate in, is that it's almost like climate change, where the science is there. With encryption, the math is there. And so at a certain point, as with climate change, most reasonable people who are able to assess the math and know what's going on are going to agree with the basic premise that there simply is not a way to weaken encryption for only the good guys, or only the bad guys... sorry, I always get this metaphor wrong.

Evan Greer (24m 0s): Yeah. There simply is not a good way to weaken or backdoor encryption in a way that only helps the good guys and doesn't help the bad guys, however you define them. And you're absolutely right that we will see renewed attacks on encryption as social media platforms engage in more aggressive moderation, whether you think that's right or wrong. And clearly we need to be having these conversations, right? Clearly we need to have robust debates about what type of moderation should happen and where. And I think the crucial question is "where," right?

Evan Greer (24m 45s): And we can get into talking about different places in the technical stack: Amazon Web Services, the hosting level, CDNs, app stores.
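Evan's "the math is there" point can be made concrete with a short sketch. Below is a minimal end-to-end encryption example using the pyca/cryptography library: both parties derive the same key through a Diffie-Hellman exchange, and decryption is one symmetric operation available to anyone holding that key. There is no separate "lawful access" step in the math to selectively weaken; any backdoor would have to be bolted on for everyone. This is an illustrative sketch, not a description of any particular messenger's protocol.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a keypair; only public keys ever cross the wire.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Both sides compute the same shared secret from their own private key
# and the other side's public key (X25519 Diffie-Hellman).
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared

# Derive a symmetric key from the shared secret.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"demo").derive(alice_shared)

# Encrypt and decrypt. The math offers exactly one way in: the key.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"meet at noon", None)
print(AESGCM(key).decrypt(nonce, ciphertext, None))  # b'meet at noon'
```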
And I think it is good to talk about that. But going back to encryption: we don't even have to make those kinds of philosophical arguments or convince people of the value. We can just explain to them that weakening encryption, banning or attempting to ban encryption, or criminalizing the distribution of strong end-to-end encrypted messaging will actually make people less safe, not more safe. Encryption protects our airports and our hospitals and our water treatment facilities and our power plants.

Evan Greer (25m 28s): And when you remind people of that, they're like: oh, okay, that doesn't sound like a very good idea, to poke holes in that. So it's comforting to me to feel like we've been through this before. We've been through it with Apple versus the FBI; it comes up every now and then, and generally speaking it doesn't go very far, because there is really strong consensus, even among total national security hawks (save the ones that, as you said, are spreading propaganda for a very specific purpose), that it's a bad idea. And my hope is that we can maintain that. That said, I am concerned, and I have seen, and engaged in conversation with, even folks on the left who are calling for Apple and Google to ban Telegram from the app store.

Evan Greer (26m 16s): And it's important to remind folks, and I see this especially from predominantly white progressives in the U.S., who tend to think only about how a policy might impact folks here in the U.S. and completely ignore the global perspective, that Telegram is an app used by hundreds of millions of people who are not neo-Nazis: people who are talking about movies, or organizing clubs for marginalized folks in really repressive countries, or journalists talking to their sources, et cetera. So I think it's (a) reminding people of the global perspective, (b) reminding them that you can't ban math, you can just criminalize it and make it harder to access for vulnerable people...

Evan Greer (27m 6s): ...and then trying to refocus people on the real debate that we should be having, which is, again: let's talk about moderation. Let's really listen and have a good conversation about it, but let's get beyond hyper-focusing on these individual acts of moderation or lack of moderation, and start talking about the systemic problems, both with technology and the ones that came before technology. And one last thing I'll say here, and then I'll turn it back to you. I think part of our knee-jerk reaction to blame the internet, or blame technology, or blame encrypted messaging, or blame social media, is part of our collective unwillingness to admit that violent white supremacy and far-right ideologies and harmful disinformation and all the rest of it have been part of who we are as a country since this nation's inception.

Evan Greer (28m 6s): We hear that persistent refrain: "This is not who we are. This is not America." This is part of who we are. And the sooner we admit that, and figure out what that means for what we need to do,
the sooner we can get past this kind of back and forth, these conversations about technology that are really just scraping the surface, rather than trying to really get at the root of where the harms are coming from and what we can do about them.

Katherine Druckman (28m 36s): Yeah, I like that. It's so true. Technology certainly didn't invent extremism, whether it's religious or political or anything else. It goes back a lot longer than Telegram, for example.

Doc Searls (28m 52s): Here's a parenthetical question that you might be able to answer, because I haven't been able to when my friends on the right, or the near right, ask me: what is meant by "systemic" there? Their question is: what is the system? "I don't see it," they say, even though I see the results of it. And I'm wondering if you have a good answer for that. I mean, I think I've got an answer, but yeah.

Evan Greer (29m 21s): You know, I think "systemic" to me means looking at things at scale, right? You can always find an individual example to prove your point. And that's part of why math exists, why our ability to analyze at scale exists. So, for example, if we look at the criminal justice system, it simply is a fact that Black men are disproportionately incarcerated in the United States, which incarcerates more people than any other country in the world. Them's facts. You can argue with them all you want; that is a symptom of systemic discrimination.

Evan Greer (30m 6s): You can point me to any number of hundreds of examples of poor white people who end up in prison, or of interactions between law enforcement and individuals that went well, or of individual law enforcement officers who mean well and genuinely do care about protecting people's rights. The fact is, at a systemic level, there is copious evidence of systemic discrimination, which takes the form of disproportionate levels of violence, over-policing of certain communities, different sentencing for the same crimes, et cetera. You could do the same with the gender wage gap, right?

Evan Greer (30m 48s): You can find as many individual examples as you want of high-powered women making tons of money, Sheryl Sandberg, et cetera. That doesn't change the fact that women as a class still make, I forget what the latest number is, but not nearly as much on the dollar as their male counterparts. To me, those are examples of systems at play. And to take it back to social media: think about how many hearings we've had in Congress over the last couple of months on this topic, and they're almost all Ted Cruz yelling at Jack, or Democrats yelling at Mark Zuckerberg, or whoever.

Evan Greer (31m 31s): And they're basically: all right, why did you take down this specific piece of content but not this specific piece of content? Why did you censor my brother's friend's dog's Instagram account but not the Ayatollah of Iran? That's basically the level of the conversation. And folks on the left, I think, have been very hyper-focused on Donald Trump and his accounts, right?
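A toy illustration of Evan's "looking at things at scale" framing: individual counterexamples barely move per-capita rates, and the rates are where a systemic pattern shows up. All numbers below are invented for illustration; they are not real statistics.

```python
# Invented numbers for illustration only; not real statistics.
populations  = {"group_a": 1_000_000, "group_b": 1_000_000}
incarcerated = {"group_a": 2_500,     "group_b": 12_500}

# Per-capita rates make groups of different sizes comparable.
rates = {g: incarcerated[g] / populations[g] * 100_000 for g in populations}
for group, rate in rates.items():
    print(f"{group}: {rate:.0f} incarcerated per 100,000")

# The system-level signal is the ratio between group rates. A handful of
# individual counterexamples in either group leaves it essentially unchanged.
print(f"disparity ratio: {rates['group_b'] / rates['group_a']:.1f}x")
```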
And, you know, "Twitter should ban Trump." This was before the situation last week, where I think we could have a valid discussion about Trump's speech in that moment crossing even the First Amendment's line, the kind of "fire in a crowded theater" line. But that said, I just think we focus overall on those individual moderation decisions rather than zooming out and looking at things like algorithmic amplification that's maximized for engagement: the core driver that is turning the volume up on everything and putting every social movement on steroids, in ways that are profoundly good.

Evan Greer (32m 34s): And I think it has actually opened the Overton window in ways that are necessary, but also in ways that have done really serious harm. And so I think looking at things like scale, centralization, transparency, due process for moderation, micro-targeting, data harvesting, filter bubbles: those are systemic ways of looking at social media, and at both the good and the harm that come with it. The individual way is the constant back and forth over whether this piece of content should be up or down, yes or no, which just ends up missing the point. We're never going to fix the harms we're seeing by constantly calling for more content to come down.

Evan Greer (33m 18s): And we're also not going to fix them by just saying everything's fine, leave it alone. I think we need to get surgical and start figuring out which business practices, particularly at the largest companies, are doing harm. And then, how do we create more competition in the system, so that users have more options? And are there clever technical things that can be done, like policies mandating interoperability, or rather adversarial interoperability, between platforms over a certain size? Or just ways that we can create more decentralized alternatives to the system we have now? A lot of the conversations we're having right now would be basically moot points in a world with a less centralized social media market, for example.

Evan Greer (34m 14s): So...

Doc Searls (34m 16s): I have to say that I love how you put that. Obviously there's a system there if we see a result such as disproportionate incarceration and so forth, but also, easy answers like "let's ban big tech" or something like that are not going to be adequate. It's more complicated than that; it's deeper than that. And it's good in some ways that we have exposed things like white supremacy as something that's more than typical.

Doc Searls (34m 56s): So I just want to say amen to everything you just said there. Yeah. Do we want to go to Section 230 as a thing that's front and center right now? Is it really in danger, Evan?

Evan Greer (35m 11s): You know, I think it is. And I think this is one of the most important fights for our generation, one that will determine the future of the internet and have profound implications for the future of freedom of expression and human rights. And I think it also has serious broader legal implications.
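Before the Section 230 thread continues, a brief aside on Evan's earlier point about "algorithmic amplification that's maximized for engagement." Here is a toy ranking sketch: the ranker sees only a predicted engagement score, with no notion of whether a post is constructive or inflammatory. Every field and number below is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float  # hypothetical score from a click/share model
    timestamp: int               # larger means more recent

posts = [
    Post("measured policy analysis", 0.02, 100),
    Post("inflammatory hot take",    0.31,  90),
    Post("cute dog photo",           0.12,  95),
]

# A chronological feed orders by recency; an engagement-optimized feed
# orders by the model's score, whatever kind of content earns that score.
chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)
engagement_ranked = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

print([p.text for p in chronological])      # analysis, dog photo, hot take
print([p.text for p in engagement_ranked])  # hot take, dog photo, analysis
```

The systemic question Evan raises lives in that one sort key: change what the feed optimizes for, and the volume knob on every kind of content moves with it.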
And really, I think at a certain point we have to admit we're not just talking about Section 230; we're often talking about the First Amendment. Mike Masnick over at Techdirt has written persistently, bless his heart, to remind people that many times the problems they think they have with Section 230 are actually problems they have with the First Amendment.

Evan Greer (35m 54s): And they have to start to grapple with that. Just say it: if you're basically saying you don't like the First Amendment, then come out and say it. For example, we've heard from conservatives, or maybe not real conservatives, but there's been a consistent narrative on the far right, pushed primarily by Donald Trump and his enablers over the last number of years, that Section 230 is what lets platforms censor people and take down their content. In fact, Section 230 is really just what makes it easier for those platforms to do that without the constant fear of excessive but largely baseless litigation. In the end, it's actually the First Amendment that allows platforms to not host speech that they don't want to host.

Evan Greer (36m 43s): It's the same reason I don't have to host a band in my basement if I don't like their politics. Platforms have a First Amendment right to not associate themselves with speech they don't want to associate with; Section 230 just makes it functional for them to do that at scale, in a very litigious society. So I think that's important to say first and foremost. But in terms of the risks, Doc: I don't think we're in danger of a full repeal of Section 230, the way President Trump called for, and Mitch McConnell put his name to, in the attempt to repeal Section 230 during the NDAA negotiations and later the COVID stimulus bill.

Evan Greer (37m 31s): Though I think it's worth acknowledging how dangerous it is that the leader of the Republican party is on the record in support of a full repeal of Section 230, which would be completely outrageous and throw the internet into chaos. But what I do think we're in danger of is seeing carve-outs to Section 230 that seem reasonable on their face but do tremendous harm. I mentioned SESTA/FOSTA earlier in the show, which was the last major change to Section 230. It was billed as legislation to prevent sex trafficking, and it ended up doing nothing to prevent actual trafficking and criminal, non-consensual abuse. Instead, it led to the mass de-platforming of consensual sex workers and others, who found themselves in more danger without the internet as a way to control their own work.

Evan Greer (38m 31s): It also led to platforms like Craigslist and Tumblr shutting down categories of content, with Craigslist shutting down its entire personals section, because the law was so poorly crafted that it created liability for an enormous amount of activity. I think the biggest thing I'm worried about coming out of the events of last week is that we would see a misguided bill that tries to create a SESTA/FOSTA-style carve-out in Section 230 for something like speech promoting domestic terrorism. That sounds perfectly reasonable, right?
Why should platforms be shielded from liability if they're allowing people to call for terrorism?

Evan Greer (39m 13s): But what we know about the way this would actually play out in practice is that the lawyers at companies like Facebook and Twitter and others are going to tell them: this is really broad; any number of types of speech could be considered to fall under this category; you need to disallow enormous amounts of speech, or else have no moderation whatsoever, to avoid liability. So, generally speaking, there are more thoughtful and less thoughtful ways we could talk about making changes to 230. But at this point, in this environment, with this Congress, I think it's crucial that we recognize that if we open the door to Section 230 changes, what comes out the other side is almost certain to do more harm than good.

Evan Greer (40m 3s): And I think we just need to keep that door closed for now, and try to point lawmakers toward approaches like antitrust enforcement, like strong federal data privacy legislation, and like taking aim at specific business practices. One thing I say to lawmakers all the time is: Section 230 is not the only lever you have. If you don't like the way a big tech company is operating, write a bill that says XYZ business practice is illegal, instead of writing a bill that says XYZ business practice is no longer protected from liability. As soon as the enforcement mechanism becomes "you can get sued," that inherently benefits the largest companies, which have the most money for the most lawyers.

Evan Greer (40m 45s): And so legislation that's intended to hold accountable, or rein in the power of, companies like Facebook and Google could quickly end up crushing or destroying the Reddits and Wikipedias and Etsys of the world. And I think it's also notable that Section 230 changes would disproportionately impact platforms that have a more decentralized model. If you're Facebook, you do moderation in two ways: you either have AI do it, or you have human moderators, contractors paid by Facebook, do it. So if, for example, there were a change to 230 to mandate certain types of transparency reporting, that would be pretty easy for Facebook to comply with. But if you're Reddit, and you rely on thousands of volunteer moderators, and people can create their own community forums and set rules the way they want, or if you're Wikipedia, and you rely on tens of thousands of volunteer editors to create a resource that does a tremendous amount of public good,

Evan Greer (41m 45s): it's much, much harder to comply in a situation where each of those individual volunteers could create liability for the platform with each of their individual moderation decisions. So, yeah, I'll leave it there, but I think that's my biggest point: whatever your problem is with the internet, whether you think platforms are being too aggressive in moderation and removing too much speech, or you think they're not being aggressive enough and should have gotten their act together sooner to address some of the harms we're seeing, either way, ripping up, carving out, blowing up, or repealing Section 230 is not going to fix the problem.
And in many cases it'll actually amplify the problem and solidify the power of the largest companies, while making sure that no one else ever comes along who can compete with them and maybe bring a better vision for what the future of the internet should be like.

Doc Searls (42m 38s): Wow, that was a great answer. That's fantastic. These are brilliant soliloquies, and they stand alone. Just another sort of parenthetical question, and I hope a good one: who do you talk to? I'm just curious, and you might not even be able to say. Who do you go to, to say: turn the heat down, don't do anything right now on 230, there are other things to deal with, wait till things settle down? Do you go to both sides of the aisle? Give me a sense of how you go about your activism on that.

Evan Greer (43m 12s): Yeah. I think there are a number of different things that are needed right now. And to answer your question at a broad level: this is one of those areas where there's actually a lot of cross-partisan agreement. Folks on the right in groups like TechFreedom, and even some of the Koch brothers-funded tech policy groups, and the Cato Institute and Reason magazine, and many other folks over on the right, totally do understand how ripping up Section 230 would actually lead to more censorship of conservative speech, not less, and would fundamentally violate the First Amendment rights of those companies.

Evan Greer (43m 58s): So this is an area, like government surveillance, where there's actually a lot of agreement across the aisle among the grassroots. And with lawmakers, it's an issue that's just so challenging, right? Katherine, you mentioned the EARN IT Act, and that's the perfect example of a bill that's almost designed to be impossible to oppose, because on the face of it, it claims to be addressing a harm that's so horrific. What lawmaker would want to run against someone who can say, "Well, they voted against the bill to protect children," right?

Evan Greer (44m 39s): When in fact that bill wouldn't do anything to address those harms, and really it's just politicians posturing on, and kind of exploiting, a real problem to push for something, basically just to get headlines. And so when I talk to lawmakers about this, I urge two things. One: there is a piece of legislation that exists, introduced by Ro Khanna and Elizabeth Warren, the SAFE SEX Workers Study Act. Basically, it's a study bill that would require Congress to investigate the impact of SESTA/FOSTA, which, again, was the last major change that we made to Section 230.

Evan Greer (45m 19s): And it's a lot easier to go to a lawmaker with that. It's maybe hard to say, "Hey, I want you to come out with a statement that Section 230 is sacrosanct and you'll never support changes to it." It's another thing to go to them and say, "Hey, don't you think Congress should do its due diligence and investigate whether the last time we changed this law did more harm than good, before we rush into making any more changes?" That's a pretty easy ask, or at least it should be.
And so my hope is that we can get more lawmakers there. Maybe they're not going to come out and say we can never touch this law, and frankly, maybe no law should ever be sacrosanct, right? Maybe we should always be open to the possibility of changes. But I think, at the very least, before Congress makes any major changes to Section 230, they should pass the SAFE SEX Workers Study Act to investigate, and to make sure we don't make the same mistakes that we've made in the past.

Evan Greer (46m 11s): And they should hold some hearings about the human rights, civil liberties, and freedom of expression implications of changes to Section 230 before legislating further. And I think it's not an unreasonable thing that we could get a number of lawmakers on both sides of the aisle, and you only need a handful in the Senate, saying: all right, let's talk about this, but I'm not voting for any changes to Section 230 until we've held some hearings, listened to some experts, and really made the conversation smarter. So that's the direction that I've been pushing in on the Hill. And then at the grassroots level, again, it's about breaking this down. You thought net neutrality was complicated to explain to people? Section 230 is two levels above that, right?

Evan Greer (46m 57s): And so I think it's our job, as Fight for the Future, and this is one of the things we see as our role in this space, to break this down and explain to the broadest range of internet users possible: hey, you like being able to post your memes, your photos, your opinions, your jokes on the internet? This is the law that makes that possible. If lawmakers get rid of this law, the things that you like about the internet are going to go away, and it's not going to fix any of the things you don't like about the internet. So we've got a petition going, and we'll be working with all kinds of online creators, YouTube and Twitch streamers, the folks that make up the fabric of the internet, who have always been our core constituency, the army that we mobilize when bad legislation comes along.

Evan Greer (47m 47s): We did it for SOPA/PIPA. We did it for the repeal of net neutrality. And I think we can do it again for Section 230.

Katherine Druckman (47m 57s): Well, but you need to do that, right? One of the things in the answer you just gave reminded me: I like to refer to all of these bills as boogeyman bills, because you can slip a lot of digitally harmful stuff through if you just veil it in some really scary boogeyman cause, right? It's either terrorism or pedophilia, something so totally indefensible that nobody can be seen opposing it. How could you oppose that? What kind of a person are you, right? But there's a lot of harm that can be done. And some of the same people who are pushing to "protect the children" over here by screwing over the internet are also the people who were defending, like, caging kids at the border.
Katherine Druckman (48m 38s): So, anyway, I find it pretty hypocritical, and somebody has to point that out, and you have to point it out to the mainstream, because otherwise... I don't know.

Evan Greer (48m 48s): Yeah. I think it's important, and this kind of goes back to the point I made at the very beginning: social movements sometimes get bogged down in an ideology and then ignore strategy. And I think activist messaging can also get bogged down in trying to recruit people through ideology, and we can forget about emotion, right? People are scared. People are really scared right now, and legitimately so. And so we do just have to acknowledge that, meet people where they're at, and recognize: I know you're scared. I know that this is frightening. I know that these things are horrible. And it's not okay to just say, "Let's not do anything about it." But we need to get it right.

Evan Greer (49m 39s): Just doing something so we can say we did doesn't actually address the harm. And one pitfall that folks in tech sometimes have is that, because we have such a more in-depth understanding of how everything fits together, we can dismiss people's concerns because they express them incorrectly, right? Because they're wrong on the substance of Section 230, or they're wrong on how the algorithm actually works, or they're wrong on what Facebook is or isn't doing with their data. And sometimes we can then forget: well, this person is concerned, and they're scared, and they're angry. Why? What's the real root of that?

Evan Greer (50m 20s): And what's the truth embedded in that, even if they're not expressing it exactly right, instead of just dismissing their concern? And I think if we can have a conversation like that, one that acknowledges where people are coming from, it can inform not just thoughtful policymaking but how we mobilize, how we speak to people at a time when people have a lot going on. We're in the middle of a pandemic; people are trying to figure out how to pay their rent. And so I think it's about making sure that we're grounding our tech policy messaging and decisions in that lived experience, in reality, rather than in theoreticals or slippery-slope arguments or philosophy, even though, of course, all of that informs our thinking on it.

Evan Greer (51m 14s): But I think when we talk to people about it, we have to remember to present these as lived concerns and immediate concerns, not just theoretical concerns.

Doc Searls (51m 25s): Wow, that's great. I think you've summarized everything for us pretty nicely. It's all wrapped up. Well, thanks.

Evan Greer (51m 34s): Thanks, y'all, so much for having me on. This was great, and I'd be happy to come back anytime. And Doc, I'll come out and visit you in Santa Barbara when all is said and done. I used to play music at the university there all the time when I was a musician.

Doc Searls (51m 47s): Fantastic. Well, yeah, that'd be great.
I think the plague is going to end at some point, and it'd be great to have you here.

Evan Greer (51m 57s): Cool. All right. Well, thanks to you both.

Doc Searls (52m 1s): Thanks, you too. I'm kind of fangirling, so I really appreciate it.