00:13 Katherine: Hi everyone, welcome to the Reality 2.0 podcast. I am Katherine Druckman, and joining me as always is Doc Searls. And today we have a special guest, Bruce Schneier. He is a well-known security expert and a prolific author of many wonderful and easily digestible books about security, which I highly recommend. One of the reasons we wanted Bruce on the show is that there's a lot of talk right now about contact tracing apps, as if they're some great savior that's going to rescue us from a pandemic, and we thought Bruce would be a pretty great person to ask about that. And then, Doc, I think you had some other ideas, so I'll let you take it from there. 00:54 Doc: Well, let's go down that rabbit hole as far as it goes, 'cause I think that's a huge one and it's very relevant right now. 01:03 Bruce: Sure. How do you want to do this? 01:08 Katherine: I can start off a little bit. I think there is a sense that if only the tech giants would save us from this with these wonderful innovative apps, then it would all just magically go away. I get that impression from people: we have to have these contact tracing apps. And I think you've brought up in some writing, and others have too, that there are several problems with that. One is that it won't be effective, and two, given that it's not effective, how could it possibly be worth all the risks associated with it? People could take advantage of it, it could be manipulated, there are privacy concerns. And I thought maybe you might... 01:50 Bruce: Privacy concerns. We're already carrying cell phones in our pockets, which are pretty much the best surveillance device mankind has ever invented. I don't see anything much additional. My complaint about contact tracing is that it's not effective, and you're right, we all want an app to solve this. An app solves so many things. It would be great if an app would solve this as well.
And tech companies are stepping up, but I just don't see it working. This is actually a basic security issue: whenever we have some sort of authentication system, we have to worry about two types of errors, false positives and false negatives. With an ATM, a false positive is someone else registering as me and withdrawing money from my account. A false negative is me not being able to get my money out of my account. Those errors are different, but they're both important. So when we think about contact tracing, you can think about false positives and false negatives. We have some definition of a contact; let's say it's less than six feet, for more than 10 minutes. The false positive rate is the percentage of contacts that don't result in viral transmission. And you get those for a bunch of reasons. One, the location and proximity systems aren't accurate enough. This is GPS, this is Bluetooth. They both have error rates. I play Pokemon Go; I see GPS error rates all the time. Bluetooth isn't really designed for this, so it's being hacked to make this work. The second source of error is that the app won't be aware of any extenuating circumstances. I could be less than six feet from someone for more than eight hours because we're both sleeping in an apartment building, separated by a wall. The app won't know there's a glass partition. It's really bad at the z-axis; we could be on different floors. And the third is that not every contact is a transmission. We know a lot more about how this disease works now, and it isn't just a matter of standing next to somebody for 10 minutes. Are you speaking with each other? Are you wearing a mask? Are you indoors? Are you outdoors? What is the airflow? 04:11 Doc: Yeah, right, and what have you touched? That's one of the biggest ways transmission happens. 04:14 Bruce: Right, all those things. The raw data of how close you are just doesn't capture it.
So there'd be a lot of contacts that don't involve transmissions. Then back to false negatives: that's when the app doesn't register a contact and you get sick. There are a lot of reasons for that. Again, one, errors in location or proximity. Two, a lot of people won't have the app; even Singapore only had a 20% adoption rate before they abandoned their app. And third, not every transmission is the result of that precisely defined contact. You've probably all watched those animations of a sneeze transmitting way further than six feet, over partitions in a grocery store. So I have this app with these two error rates, and my problem is it doesn't give me any useful information. I go out, I come back, and the app beeps. Does that mean I have the disease? It doesn't. I go out, I come back, it doesn't beep. Does that mean I don't have the disease? It doesn't. Should I quarantine myself? I don't know. I can't get tested, so the app hasn't told me anything useful. And my worry is that if we release an app with all of these errors, in a few days people will be tweeting, "The app doesn't work. I got sick and it didn't tell me. It told me I'd get sick and I didn't." And suddenly people don't trust the app. Trust is the most important thing we have in this pandemic, and the fact that it's so low is making the response harder. We can't afford to make it worse. So I think there are too many sources of error for the app to be reliable. 06:02 Doc: Bruce, two cases I remember early on in favor of contact tracing were China and Korea. The China case is one where infinite surveillance and minimal privacy were features rather than bugs in their system. In the Korean case, the app was kind of this fun thing where everybody's sort of keeping track of everything and reporting on everything. And I've hardly heard of either of those since then. I'm wondering how well they actually worked, or if they did.
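Bruce's point about the two error rates can be made concrete with Bayes' theorem: even an app with respectable error rates tells you very little when only a small fraction of flagged contacts actually transmit the virus. A quick sketch in Python; all the rates here are made-up illustrative numbers, not real app statistics:

```python
def p_sick_given_beep(sensitivity, false_positive_rate, transmission_rate):
    """P(infected | app beeps), computed with Bayes' theorem.

    sensitivity: fraction of real transmissions the app flags
    false_positive_rate: fraction of harmless contacts the app flags
    transmission_rate: fraction of defined "contacts" that actually transmit
    """
    true_alarms = sensitivity * transmission_rate
    false_alarms = false_positive_rate * (1 - transmission_rate)
    return true_alarms / (true_alarms + false_alarms)

# Hypothetical numbers: the app catches 80% of real transmissions,
# falsely flags 10% of harmless contacts, and only 1% of "contacts"
# (< 6 feet, > 10 minutes) actually transmit the virus.
print(f"{p_sick_given_beep(0.8, 0.1, 0.01):.1%}")  # about 7.5%: a beep is weak evidence
```

With these assumed numbers a beep means roughly a 7% chance of infection, which is exactly the "should I quarantine? I don't know" problem Bruce describes.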
So Korea did a great job with manual contact tracing, and contact tracing is important, but you actually need to talk to people. It's a human trust issue; it's not something you do with an app. There are lots of ways to design the app. The Apple-Google system preserves privacy to a great degree; the Singapore app did less so. And we can build different privacy protections into the app. I think that's important, but I wouldn't lose a lot of sleep over it. These are extraordinary times, and making these extraordinary trade-offs is reasonable, as long as we can back off from them after the pandemic is over. My worry is more that I'm not getting the value that's been promised. And, well, I can understand that that little explanation I gave is a pretty complicated one, and for most people it will be, "Well, this app doesn't work, I can't trust it." And then we're much worse off. Really, where I want tech to solve this: I need ubiquitous, cheap, fast, accurate testing. That's what I need. Give me that and we can do a lot of things. Without that, I really can't do anything. I think I agree there. I am concerned with one thing that you mentioned, that it's fine as long as we put it away when we're done, and I'm a little bit concerned that we won't. 08:03 Bruce: Google already knows every place you go, where you live, where you work, when you wake up, who you sleep with. The phone is already doing that level of surveillance. So a little bit extra for contact tracing, who cares? 08:18 Katherine: Perhaps, except that it perhaps correlates health data. 08:22 Bruce: Health data is a separate issue. That's a very different sort of question. And really, what we're talking about is, I think, the defining issue of this century, and that is data in the group interest versus data in the self-interest.
And so right now, I use Google Maps. Google Maps gets me where I'm going faster because it knows traffic patterns, because everybody using Google Maps is under surveillance. That's a trade-off. There is group value in putting all of our car driving data in a database in real time, yet it's very personal. These contact tracing apps are another trade-off: there's value in knowing where the contacts are, yet it's invasive. And you're talking about yet a third issue. I think there's real value for humanity in putting all of our medical data in one big database and letting researchers at it. I bet the benefits can't even be comprehended at this point. Yeah, yikes, yeah. How do we balance that? Tech is the same thing: give me all your data and you get free email and free web search and free this, free that. How do we balance these? It's going to be different in each application. For driving data, I can make this up, right? Your data is only good for 10 minutes, we only need one out of every 10 cars, and we can anonymize it. Done, solved. For medical data, you can't do that, so we're going to put it in a very well-regulated database, require you to submit your research questions and queries in advance, and use differential privacy to give you results. Very different answer, but problem solved. Now, this question I think is important for us as a society, because there is group value in all of these issues, yet the data is very personal. And we haven't really made these decisions consciously. We've kind of allowed corporations to do whatever they want. But deliberately deciding when the data's value in the group interest trumps its value in the self-interest and when it doesn't, and how to balance the two, I think we'll be doing that for the next couple of decades.
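The differential privacy Bruce mentions for a regulated medical database can be sketched in a few lines: the database answers aggregate queries with calibrated Laplace noise, so no single patient's record measurably changes the result. This is a toy illustration; the query, data, and epsilon value are invented for the example, and real deployments need far more care:

```python
import math
import random

def private_count(records, predicate, epsilon):
    """Answer a counting query with Laplace noise of scale 1/epsilon.

    A count has sensitivity 1 (adding or removing one record changes
    it by at most 1), so this noise level gives epsilon-differential
    privacy for this single query.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) by inverse-CDF from Uniform(-0.5, 0.5).
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

# A researcher asks "how many patients are over 60?" and gets a noisy answer.
ages = [23, 67, 45, 71, 80, 34, 62, 19]
print(private_count(ages, lambda age: age > 60, epsilon=0.5))
```

The design choice matches what Bruce describes: queries are submitted to the database, only noisy aggregates come back, and smaller epsilon means more noise and stronger privacy.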
10:47 Doc: Yeah, this may be too general, but my perspective on all of this, Bruce, to some degree, is that the older I get, the earlier it seems. I feel like balancing all the things you were talking about is hard in part because we were suddenly graced with all these digital capabilities. It's kind of like when humans first found fire or the wheel: anything could be done with it, and after a while they figured out what's the right thing to do and what's the wrong thing to do, and what you do under different circumstances. But we don't have that yet. For privacy, we invented clothing and shelter. Those are privacy technologies, and we have norms for respecting them with other people. But here online, with Google, mostly Google, but also Facebook and other large companies, we've made, I think to some degree consciously, Faustian bargains: we use their things, and that allows them to know a great deal about us, for purposes that we don't entirely see and even they don't entirely see. And we haven't worked out the norms for that yet, or the laws that would follow the norms. Today is the second anniversary of the enforcement of the GDPR. And there we have a regulatory approach whose most obvious outcome, in many ways, is these endless gauntlets we have to face in front of every website, to consent to exactly what the GDPR was meant to prevent. 12:22 Bruce: This is an example of the law being hacked. In fact, all of these systems are designed not to comply with the law, but to not comply with the law in a way that is compliant with the law. 12:35 Doc: Yeah, it's like they're all about obeying the letter but not the spirit of the law. If you look up "GDPR compliance" you get about 200 million results, and all of them are from firms that are going to sell, on a B2B basis, to companies.
Ways to subvert and avoid, to totally hack the GDPR. That gets us to a related subject that you and I have talked about before, and I'll share it with the audience. I'm very intrigued by what Bruce has been thinking out loud among his cohort, which I'm in: the notion that hacking, which is dear, I think, to our listenership here, is in fact almost part of the way humans work. It's the way that we change, invent, and improve systems: all systems can be hacked, and all systems are improved, if they do improve, to some degree by hacking. And I've started thinking that hacking is basically how we get along in the world on a more or less constant basis. So I'm wondering if you could unpack that a little bit, Bruce, 'cause I'm sort of riffing off stuff you said that's just ricocheting around in my skull. 13:44 Bruce: So this is something I've been thinking about for a while. At the RSA Conference this year I gave a talk about it in very preliminary stages, and it's really evolved a lot since then. I'm trying to extend the notion of hacking to broader social systems. The example I used at the RSA Conference is the tax code, which is really interesting. The tax code is code. It's an algorithm: you give it a bunch of inputs, your financial history, and you have an output, which is how much money you owe. And like any algorithm, it's going to have bugs, it's going to have vulnerabilities. We call those vulnerabilities tax loopholes. It has exploits; those are tax-avoidance strategies. It has black-hat hackers; we call them tax attorneys, whose job it is to find exploitable vulnerabilities for their customers. We can patch the tax code; it's called passing new laws. You can map the entire hacking metaphor onto the tax code, and if you extend it further, you can use it when you think about the market economy, or our political system, how we pass laws, how we choose our leaders.
We can talk about hacking democracy, and you see these articles with headlines like "Facebook hacks our attention," or that in the '80s a director at NASA hacked the government to get missions done faster and better, or that in the Obama administration people hacked the bureaucracy to get things done. We kind of have an intuition of what this means, and I'd like to formalize that somewhat, and to think about hacking these systems both as a method of subverting their intent, like you mentioned with the GDPR, but also as a method of improving them and letting those systems evolve. This is a human thing, and as we move into this world of fast-paced tech change and fast-paced social change, I think this notion of hacking will become more important. 16:08 Katherine: So actually, there's a nice dovetail here between the hacking and the contact tracing, because you linked to an article from the Brookings Institution that goes into some concerns I think are interesting about the consequences of contact tracing: its inaccuracy, but also its ability to be manipulated. In particular, they talk about the possibility of people being labeled as pariahs, blocked from entrance to activities, public spaces and whatnot, falsely, which might impact more vulnerable communities disproportionately. But I think it's an interesting dovetail with the hacking conversation, because inevitably, if those things came about, if I am a person who is negatively impacted by this, won't I just find a way to hack the system? I'll figure out a way to falsify my information in order to get access to whatever I want. But maybe only a select few would be able to do that, and then the vulnerable population gets left behind. It's an interesting issue, I think, that they raised. Do you have any thoughts about that? 17:18 Bruce: Fake driver's licenses. So I don't think it'll be contact tracing where this happens.
There is talk about some form of immunity passport, something on your phone, a card, a wristband, something that says "I've had the disease, you can let me into the restaurant, you can let me into the concert, into the club." And that creates a have and have-not tier. We have that already: driver's licenses and drinking. There's an enormous amount of effort spent trying to fake driver's licenses when you're underage. So I think we know how to do this pretty well. We're not gonna be perfect, but my guess is we'll be pretty good. I think we have to decide as a society: do we want that kind of tiering, that kind of people allowed and people not? That feels like a very social question. 18:10 Katherine: That's pretty scary, but, I don't know, I think it depends on the consequences. 18:15 Bruce: Depends on the consequences, depends on the venue, depends on what we're doing. I think we can get the tech to mostly work for that, because driver's licenses as proof-of-age documents, which is really primarily what they are these days, work pretty well, and we seem to manage that. 18:36 Doc: It's interesting, there's a hacking discipline, you might say, around SSI, which could stand for self-sovereign identity. That's where it started; it could stand for a lot of other things. The whole idea is that pretty much everything we do when we identify ourselves in particular ways involves what they're calling verifiable claims or verifiable credentials, and in that abstract way, "immunity passports" is a terrible name for what we're talking about here. It's basically just that you wanna be able to make a verifiable claim that you are immune, and have that fact be useful to you.
But the social side of this is, we don't want a caste system that's completely manifest out in the world, where there are the privileged who have survived this infection and the un-privileged who haven't. Other pandemics will come along, and I suppose this is a way of preparing for them, but do we wanna go through the world turned inside out, with a portfolio of health characteristics that may have social implications? 19:49 Bruce: When I used to travel, I had a yellow document, and I forget where I got it, some health organization in the United States, which lists all of my immunizations. And there are countries where, on demand, I have to show that I've gotten the yellow fever shot. So we have examples of this now. And in fact, if you got through the disease, you do have something that someone else doesn't have, and if you personally have gone through the disease, I actually might be more likely to invite you over. So yes, it's going to divide into haves and have-nots, but those aren't artificial divisions. And the question is, as a society, is this useful? Does a restaurant wanna say "COVID survivors only"? I don't know the answer, but it's not an obvious answer. 20:47 Katherine: No. Nothing is obvious anymore. 20:47 Doc: Well, yeah, that's true too. Part of the problem with that is that there are too many conditionalities involved. It gets back to your criticism of contact tracing in the first place: there are so many possible considerations, so many contingencies, conditionalities and the rest of it, that it's impossible for the thing to work effectively at all. Quite aside from whether or not it might be accurate. 21:08 Bruce: It will work okay. So now the question is, how good is the antibody test, and then how good is the security? 'Cause maybe my pass is a false negative and I wanna get into the club.
There are two ways that fails. One is I have the disease and I get in, and the second is I'm immune and I'm not allowed in. Both of those are real failures, and we have to look at the efficacy of our testing system, of our credentialing system, of our credential verification system. It's the same thing we have for driver's licenses. It proves that I'm 18, or I guess 21. So what are the breeder documents from which the government knows my true age? How unforgeable is the piece of plastic I'm given? And how much is the door guard at the bar paying attention? It's the same thing in a different application. 22:13 Doc: So you use the term "we" a number of times, and we all do that: we in the largest sense, we among the three of us at the moment. Who are those "we"? I think there are a couple of groups. One is hackers themselves; that's a lot of our listeners here, and there are an awful lot of people listening to this who are in a position to actually do something technically. But there are also lawmakers. And I'm haunted by something that Michael Powell, the former FCC chairman, said a long time ago. He had just recently resigned as FCC chairman, and we were talking about Internet neutrality, a small group of us with him, and he said, "Well, here's the thing: I've met with almost everybody in Congress, and I can tell you almost none of them knows two things. One of them is technology and the other is economics. Good luck." And I worry about that. I think that's part of what happened with the GDPR. I think sometimes a new law protects yesterday from last Thursday. I think the GDPR protects 2015 from 2012, in ways that will be with us probably for the next 20 or 30 years. So I'm wondering, who is the "we," and how do we make this happen? There may not be an actual answer, but I wonder, in your mind, when you say we, or we as a society, who's that? 23:37 Bruce: Yeah, I do mean us as a society. And you bring up...
I think an important point, in our field and pretty much writ large, is that the people we entrust to regulate our systems, our tech, don't understand it. Now, there's gotta be an answer to this. Already the people in Congress don't understand what they're regulating, and whether it's pharmaceuticals or airplane safety or trade issues, they're forever passing laws in areas that are not in their expertise. And so we have systems for this: it's members of their staff who know things, it's people who come in and talk to them, and tech shouldn't be different. I think it is different primarily because of a generational gap, which will close itself in 20 years. I mean, it's annoying to be patient, but I actually think that as the older people start dying off, the younger ones will be much more intuitive about tech. But we are a very technological society, and a lot of our laws are more complex than the expertise of the people debating them and voting on them. So we need a general solution. It can't just be for privacy or data or even IT; it's got to include things like GMOs and robotics and the future of work and monetary policy and everything. But I think this is a big issue: how do we get the tech expertise into the minds of the people who need it? I thought about this a lot; I wrote about it in my last book. When I think about the way it should work, I think about Picard's ready room. Remember, on Star Trek, he calls people into his ready room and asks them what's going on. They all speak tech at him, and he doesn't debate it, he doesn't deny it, he doesn't demand different science. He says, "Okay, I get it, here's what we're gonna do." I want that to happen. I want tech-informed policy, with policymakers accepting the tech and not fighting it and not denying it. We don't have that right now. I think we could; it's not beyond the realm of possibility. We would have tech inform politics, not the other way around.
26:11 Doc: So I'm seeing a gate between tech expertise and lawmaking that you've seen a lot more of than I have. You get the call from the senator's office: a staffer asks, "Can you please talk to our staff?" It's only happened to me a couple of times, but in both cases, when it was over, I did not have the sense that it made any difference at all. But you must get many, many more calls than that, and I'm wondering if you have a sense of what gets through and how, and if that can be hacked. 'Cause that's where I'm going: if this isn't working quite right, maybe there's a hack here on the system as it now stands. 26:54 Bruce: I want there to be. So this is another thing I've been thinking about a lot, and it's public interest technology: how we steer technology toward the public interest. Now, we have examples of this working. There is a program called TechCongress that puts technologists on legislative staffs in Washington. There are maybe a dozen of them right now, and they're doing an enormous amount of good. 27:18 Doc: We should put in a link. "TechCongress," is that what it's called? 27:22 Bruce: Yeah, I'll send you the information. It is a great program. We need lots of this. We both know people who have been chief technology officers at the Federal Trade Commission. So there are ways we're trying to make this work. You're right, I've done a lot of briefings for staffers, and some of it's good and some of it's not. As technologists, we tend not to understand how policy works. It's confusing, it's annoying, it seems irrational. That's okay, because policymakers feel the same way about us, and this is really a matter of trying to bridge very different worlds. And this is not new. In 1959, C.P. Snow wrote a great essay called "The Two Cultures" about the same problem. But back then it was kind of okay that they were separate, because they didn't interact that much. It was the space program, it was the nuclear program.
It wasn't the device that's in your pocket sending you phone calls. But he had that same complaint about these two cultures that talk past each other, and we really need to figure out how to bridge this gap. It's Washington versus Silicon Valley, you'll hear it talked about, and both sides don't trust the other side; both sides think the other side is lying and disingenuous. Think of the "going dark" debate, right, about backdoors and iPhones. That is the exact same problem: policymakers not understanding the tech, and techies not understanding the policy. The solution is going to be where both sides understand each other and build policy that takes the tech into account. I'm a big fan of giving the FBI actual digital forensic capabilities. The reason they want into your iPhone is that they don't understand how much data is already available that can help them solve crimes. They don't have the digital forensics; all they have is "open the phone." And if we want phones to be secure, and there are a lot of security reasons why we do, we can't ignore that the FBI needs to be able to solve crimes. So how can we both build security into our devices and give law enforcement the tech tools to do digital forensics? You kinda have to do both. 30:05 Doc: Do you have an answer for that? 30:07 Katherine: Yeah, how do those coexist? 30:09 Doc: Do you want the backdoor, or maybe it's not a backdoor, maybe it's some other contraption? What is that? 30:13 Bruce: Backdoors and contraptions are all dumb. The problem is fundamentally that law enforcement has a myopic view of the issue. They see the phone as a source of evidence, when in fact the phone turns out to be a piece of critical national infrastructure, because it's in the pocket of every single lawmaker and government official and judge and police officer and CEO and nuclear power plant operator and election official. Having these things insecure would be crazy!
Simply because they are so important to national security. Attorney General Barr makes the point in one of his speeches that no, we don't want backdoors in critical tech, we just want them in consumer tech. But consumer tech is critical tech; there's no difference anymore. So we have to make our systems as secure as possible. That's clear for national security. But I know there's so much surveillance data being collected by corporations, by systems that already exist, that could be used for crime solving. This phone knows everywhere I've been, because that's how it rings when someone wants to call me. It's part of the system. So what I want is to build up the FBI's capability in digital forensics, like in fingerprinting, like in tire tracks, like in DNA testing, like any of their other tech capabilities, so they don't need to get into the phone; they can do other things. We saw the FBI hacking in Pensacola: they eventually hacked into the phone. They didn't have to make it insecure. There are lots of things we can do if we can get the FBI to abandon "make all phones insecure." That's a non-starter. I think there's a lot we can do, but it has to start from the tech realities of what a backdoor means, how it works, when it's good, when it isn't. And there's a lot of talking past each other; even when I give lectures on this, we're not really talking to each other. 32:36 Katherine: So is our personal device really that secure to begin with? Some of them may be, I guess, if I have the latest model. And yet they can hack into it anyway. Is it all that secure? Maybe that's a whole other conversation, but I don't know. We've talked in the past about how a cell phone is the most intimate thing you have; it knows everything about you and knows you better than you know yourself. I think you even said in one of your books, or was that about search...
It's the next best thing to having access to your brain, which is frightening. 33:09 Bruce: It is basically true, right? Google knows what kind of porn everybody likes, and that's the way it is. Are they secure? They're okay. I think the best way to tell is to look at the market for zero-days. There is now a robust market for zero-days in operating systems, both fixed and mobile, and how much a company will pay for an exploitable iPhone zero-day, it's like a million, a couple of million dollars. That says something. They're not perfect, there are insecurities, we know that. We can design secure systems, we're doing our best; they're good, they get better all the time, but we have to make them as good as possible. 33:52 Katherine: I guess what I'm getting at is, is the FBI really that good? Or is the phone just not that secure? 33:58 Bruce: It's not that the FBI is good, but that the FBI has budget, and they go to third parties that build exploit systems for law enforcement. So it's not that the FBI has the expertise; this has been the tech transfer of the past 30 years. The government doesn't have the expertise anymore; they buy it. 34:24 Katherine: Yeah, they hire some Israelis, right? 34:27 Bruce: Maybe, in some cases Israeli companies, 'cause the laws and morals are a little looser there, but it's also companies in Germany and Italy and the UK and the US that are building these tools. And we're probably okay with them if they're used properly, and we're not okay with them when the government of Kazakhstan uses them to spy on dissidents. But that is another question, about international trafficking in arms, and whether these tools are restricted items and how we should restrict them. Do we want the government of Saudi Arabia to be able to buy an exploit that allows them to drop a Trojan on Jeff Bezos's phone, or whoever else? 35:13 Katherine: Right, that's actually a really good point.
I've thought for a while that the cell phone, again, being so personal and with you everywhere, is such a really good, well, bad, depending on your perspective, opportunity for a vulnerability. If you really wanna get at somebody, take over their phone. 35:31 Bruce: True, and you will learn everything. 35:33 Katherine: Or will you learn everything? Can you in fact even manipulate what's on the phone to tell your own narrative, or use it for phishing, or whatever it is? 35:43 Bruce: People are too lazy to do that. I'm not gonna have a phone which I seed with a fake persona; that's way too much work. I'm going to search what I search and call who I call and message who I message and email who I email and go where I go. 36:02 Katherine: I mean more in terms of accessing files... 36:07 Bruce: Yeah, maybe. If you were a national intelligence organization, you'd probably have a program for giving phones fake back stories. You've lost the ability to give agents fake histories, because, like, why weren't they on Zango when they were seven? I'm sure they're working on new ways to do something similar, but that is very much beyond the reach of most of us. 36:41 Katherine: Sure, sure, not a consumer capability. 36:44 Doc: We could continue on this, but there's something that interests me in this moment. In the midst of the pandemic we're in right now, it's like a reset button got pushed, or the giant three-dimensional chess board that was reality got turned over, and it's a whole different bunch of games now. And I like to look downstream at consequences of consequences and where these go. There's one particular area I wanted to ask you about, a little bit security-related in the sense that you created the term "security theater," which became a meme in the world, and you also defended it; you meant it. You have a wonderful way of looking at both sides of everything. But it's travel.
You and I both travel a lot and live in multiple places. I first met you in Minneapolis, where you lived at the time, at MinneWebCon, which is a thing; we both spoke there, one after the other. But we've both been in Boston; I live in New York and Santa Barbara, and I'm in Santa Barbara now. I have a million and a half miles on United alone, and I seriously wonder if I'm ever going to travel again. I really wonder this. And I'm wondering if you're looking downstream with that and seeing not just where that goes in a general sense, but how the systems are going to have to hack themselves, and have to be hacked, in order to work in a future where none of the companies in that business now are going to be in exactly the same business, because the whole thing got sphinctered down to almost nothing for a couple of months. 38:24 Bruce: Yeah, it's going to be more than a couple of months. I don't see an easy way out of this; I've been thinking about that a lot. I think we don't know. I think we are in the midst of a system reset that we've seen three other times in this country's history: the American Revolution, the American Civil War, and the Great Depression slash World War II. I think things are going to change in ways we cannot predict, and we are learning the effects of an economic system that squeezes all redundancy out of itself in the name of profit. It turns out redundancy is very valuable for security, and if you build for-profit hospitals for normal times, you don't have any excess capacity for abnormal times. Travel, restaurants, a lot of things I used to do all the time... I traveled about 280,000 miles a year; my average speed was 36 miles an hour. And that's all gone away; some of that will come back. I think airplanes are safer because of the way air is circulated... We have a much more sophisticated view of the disease right now. A couple of months ago it was things like less than six feet for more than ten minutes; now it's about the number of particles in the air.
And the way the air flows, and how long you're in an environment, and some kind of threshold level. We're much more sophisticated, and I can easily imagine airplanes, which already do an amazing job of air circulation, ramping that up even more, back to the levels when people were smoking on planes and they needed to really filter the air. Airports will change. I'm reading essays about restaurants in Hong Kong with plastic partitions between the tables, which sounds awful, but maybe that's the future. Things are going to be really different. But yes, I'm in Minneapolis now, thinking about how do I get back to Cambridge for the fall Harvard semester, assuming there is going to be one, because dorm rooms are basically cruise ships that don't move. And wondering how any of this is going to work. I taught a class that was 60 people in a room; you could fit 20 people in that room under the six-feet rule, but how do they get in and out? There are just so many things that weren't thought through, and a lot of this depends on how fast we can get a vaccine and how long the vaccine lasts for. 41:04 Doc: Yeah, but is an inoculation a durable thing or not? There's no way to know. 41:08 Bruce: There are a lot of unknowns, but really... Unemployment is a crazy number. In the United States in 2008, we basically sacrificed the middle class to the millionaires after that crisis. This crisis, it looks like we sacrifice the millionaires to the billionaires, and that worries me. I think the people who are going to be left holding the bag are the landlords... 41:34 Doc: I know a number of those, and we're one of those too. 41:40 Bruce: There's no more unsympathetic group than the landlords. Nobody wants to be nice to the landlords... A lot of the bailouts are going to the billionaires and the companies; they're not going to the unemployed, and that's not sustainable. So things are going to change to an enormous degree. I don't know how...
We're really seeing the results of neoliberalism, of letting the market figure out what to do, because the market isn't designed for this sort of thing. This is a collective action problem, and markets don't solve collective action problems. Now, that's fine most of the time, but we do have collective action problems that need solving once in a while. 42:32 Doc: Yeah, or the solution is simply, "all of you over there, you go die," and that's fine; that's a collateral casualty of the market doing what it does. 42:46 Bruce: There are going to be a lot of people dying. Our goals here in flattening the curve are two-fold. One is to make sure that when you get sick, there's a hospital bed for you if you need it. That doesn't change the number of people getting sick, but it may change the death rate, because of better care. And then, if we delay people getting sick, we know more about treatment, and we do already; we know ventilators were over-prescribed and actually did harm in many circumstances. So we know a lot more about how to treat someone who's sick, and if we delay long enough, there will actually be preventive measures, for one, a vaccine. So you have to look at that. Now, we can't remain closed for two years; that's just not possible. There's going to be a point where we say, "Okay, you're going to get sick anyway, we can't stop it, we're just going to try to keep the rate down to a low boil, so that there's a hospital bed there for you." It depends a lot on the models of how the disease spreads, which we're still learning. The hard part about this is that yes, it's about risk, but there's an enormous amount of uncertainty, and we actually don't know the data to calculate the risk, and that makes this doubly hard. 44:16 Doc: Wow. 44:18 Katherine: Well, I'm depressed now. 44:25 Doc: Well, it's what we're doing here.
I mean, I'm in the demographic where, depending on what numbers you're looking at, the Russian roulette I'm playing is with one bullet in a ten-chamber gun. 44:43 Bruce: That depends a lot on other factors. Who actually dies? It's racially biased. Access to healthcare, not just now but throughout your life, matters a lot: whether you have high blood pressure, whether you are obese, whether you have lots of comorbidity factors. The death rate in Germany is a lot lower than it is in the United States; in ten years we'll probably understand why, but there's going to be something about the lifestyle, the healthcare, the environment that made a difference. Smokers have a better survival rate... and it's annoying everybody, but they do... 45:29 Katherine: What's interesting is, I feel like at the beginning of this, or earlier in this, I was a little bit more optimistic, not about the approach to the disease, but about the opportunity for human ingenuity and redefining how we live our lives. And I feel like, with time, I'm getting less optimistic about it. 45:51 Bruce: I think we're doing a lot, and we will... We can't imagine going to arena rock concerts anytime in the next decade, and that might be true, but there will be other things that we'll do. I think we will figure it out. 46:05 Doc: It's the end of the mosh pit. 46:09 Katherine: Everyone has their own comfort level, right? So different groups of people will adapt, I think, in different ways. 46:14 Bruce: Yeah, but be careful: the problem is, this is not like skydiving, where I know my risk and I'll decide whether I do it or not. The question is, will you take other people with you when you go skydiving? You infecting yourself affects others, and that makes it hard. If it just affected you, we could treat it like any kind of high-risk personal activity. But if you decide to open up your barbershop...
You can infect 100 people, and you could spark a huge spike in the disease, and that would harm lots of people. That's why this is not just a question about freedom and liberty and doing what you want; it's really a society deciding. 47:00 Katherine: So, not to take it too long, as we always do, but I think you might be a really great person to address this problem: there seems to be a political divide in the way people digest information. People are inherently suspicious... or some people are inherently suspicious of their information sources. There are people who flat out don't believe it; they don't understand that they need to wear a mask. And I wonder if you can put a security-and-technology hat on and figure out a way to address that. Is that even possible? 47:37 Bruce: I think it's bigger than security. This is the result of decades of political division. The fact that health has been politicized, that wearing a mask is a matter of which political party you listen to, is crazy. And I wish it wasn't so... But that's the world. I don't think there's a tech solution here; we are seeing the problems of this kind of cultural and political divide, and yes, it is very harmful, not just in health policy but in other areas too. But now it's pretty obvious in health policy, and I don't think there is a tech solution. I can't figure out why there's a political party in favor of more people dying, or dying faster. 48:29 Katherine: Or presuming that certain media outlets want the economy to fail because they're advising you that certain activities are risky. I don't understand it... Maybe there's a... 48:42 Doc: I have to say, and maybe this will be helpful as we wind this down...
I'm actually very optimistic at the moment, in part because I think some things will work, and the things that work are going to win in the long run, because it's in everybody's interest that the things that work win. And it's all going to be in a tumbler; it's not going to come out exactly the way any one of us thinks. But we're going to need systems, we're going to need ways that things work. They're going to have to work politically, they're going to have to work technically, they're going to have to work emotionally. But I think we will come up with ways, as the body of knowledge around this grows, as you were saying earlier. We learn more about this all the time that we didn't know before, and all of that informs a general knowledge that, in the long run, is going to play out as: we'll lick this thing, in some ways, and we'll have a different thing when we're done with it, for preventing the next one. 49:47 Bruce: The theory that truth wins because it actually conforms to reality... yeah, I love that. "Truth wins 'cause it conforms to reality"; that is a one-liner there. If what you hold true matches reality, you're less likely to walk in front of a bus. But what we're learning is that things can diverge from reality to a surprising degree before it gets that bad. 50:19 Doc: That's true. And that's part of the social disease that is out there right now. 50:25 Katherine: I think you've nailed the ending. Thanks, Doc... Yeah, thank you so much; this was refreshing. There was actually really great information here. Thank you, it was a pleasure. Thanks a lot, Bruce, bye-bye.