Katherine Druckman (0s): Hey everyone, welcome back to Reality 2.0. I am Katherine Druckman. Doc Searls is with me as always, and Petros Koutoupis is joining us again. Today we have a special guest, Chris Bronk, who is a professor in the College of Technology at the University of Houston. We're going to get into some interesting topics related to disinformation and cybersecurity and a lot of good stuff. But before we get started, I wanted to remind everyone that we have a newsletter, so please go to our website at reality2cast.com — that's the number 2 in the URL — and sign up. You can also now get some cool swag, like, you know, you never know when you might want my face on a t-shirt.

Katherine Druckman (50s): Maybe that's weird. Anyway, you can follow that link as well. So let's get started. Chris, why don't you tell us a little bit about yourself and your interests, both academic and not?

Dr. Chris Bronk (1m 3s): Okay, so why don't we start in 1982. In 1982 I was 10 years old, and like every other red-blooded American kid in 1982, I begged my parents for an Atari 2600, because I didn't have one yet. And that was the year the price point at Sears was just right — you could get an Atari 2600 for about $200. And under the tree that year, much to my surprise, was not an Atari 2600 but an Apple II. And that began a lifelong experience in computing for me.

Dr. Chris Bronk (1m 46s): It has carried me through a whole bunch of experiences. I did a startup in the nineties, because you've got to have that ticket punched. Then I did the next logical thing after that: I was a diplomat. I was a Foreign Service officer for about, oh, I guess about five years. Then I spent some more time at the State Department as an advisor, because, you know, internet people don't belong in diplomacy. And then finally, a few years ago, I decided to become a bona fide professor. I did a PhD along the way at Syracuse on cybersecurity and the interdisciplinary nature of it — so not just the computer science, the ones and zeros, but the whole sociotechnical picture, basically.

Dr. Chris Bronk (2m 34s): And I came to UH because a friend of mine said, we're going to build a pretty decent cybersecurity program, and it sounded like fun and I didn't have to move. And a couple of years ago they gave me tenure, which I guess means that I can say things with an opinion, but not in a stupid way. And because we're on Zoom and we can see you — you appear to play bass, is that right? I do play bass. I've been playing bass since not many years after I got the Apple II, when a very inexpensive knockoff Fender Precision Bass showed up under the tree. And now I'm officially one of those doctor/lawyer jerks who has all the classic guitars but doesn't play them that much.

Dr. Chris Bronk (3m 25s): So the real practitioners in the field are like, oh, is that an original whatever? And I'm like, oh yeah, yeah, to the last screw — it's the same as it came off the line in Fullerton, California in 1976, you betcha. So be honest, be honest with me, okay: are you still disappointed?
You didn't get the Atari 2600? You know, I bought a 2600 when eBay became a thing, and it had all of the games — like, every game — and I was delighted not to have gotten the 2600. Because as long as I convinced my parents that I needed a dozen more 3M or whatever floppy disks, I mean, the hacking skills came, and I figured out how to get software — as a minor, of course.

Dr. Chris Bronk (4m 9s): And this is way outside of the statute of limitations — in fact, I don't think there was even a statute at that point to really come down on me for what I was doing. But, you know, I did get the Atari, I got all the games, and I'd put each one in, play it for five or ten minutes, and be like, you know, Skiing's kind of lame. And I remember I pulled out Adventure and put it in, and I was like, wait, I'm just a pixel walking around inside this weird maze thing that I can't really see — this is awful. So I think it was the right choice. My parents — I give them complete deference on their decision.

Dr. Chris Bronk (4m 52s): They did the right thing.

Katherine Druckman (4m 55s): Awesome. Well, thanks for that. So let's talk about why I contacted you about doing this podcast. You have an interest, academically and probably otherwise, in disinformation, and especially from a sort of cybersecurity lens. But I appreciate that you mentioned your interest in cybersecurity from a more interdisciplinary perspective, because I think that's really interesting and important. What can you tell us about the kind of stuff that you're interested in right now? And — oh, sorry.

Katherine Druckman (5m 35s): The most important thing: why is disinformation more believable?

Dr. Chris Bronk (5m 42s): Well, imagine a knob where you're establishing a ratio from zero to ten — a mixing knob going from left to right. On the right side of the stereo channel is the truth, and on the left side is BS, and you figure out how much of each you can put in and mix together to convince people of the thing. And we can call that rhetoric, right? You make statements, and, you know, they're believable, and there are narratives, and there's the study of all that stuff. But really where I got into this is during the 2016 election: I convinced a social media account or two of mine that I was a very, very activist, conservative American. Very activist, conservative.

Dr. Chris Bronk (6m 34s): And I was very surprised at the memes that I started getting, and that so many people communicated in memes — that memes were now this form of communication. In fact, at one point I told my department mates that I was only going to communicate with them in memes, and they were like, please don't do that. What's an example of a meme? A meme is a photograph or an image that has text laid over it. It sends a message. So, a very simple concept — and it has to be very simple. The image can be very powerful.

Dr. Chris Bronk (7m 14s): It can be a picture of Hitler, if you're trying to appeal to antisemitic, neo-Nazi types. It can be red, white, and blue if you want to reach out to patriotic folks, or people who think they're patriotic. It can be simple. It can be complicated.
And then the idea that's conveyed in text has to be incredibly simple. It can't be, oh, here's 150 words overlaid on this graphic, because you're talking about a little square that occupies, you know, 180 by 180 pixels or something — something fairly small. And that became, in the 2016 election, the medium by which people were immediately marketed ideas, through this combination of graphics and text.

Dr. Chris Bronk (8m 1s): So it's basically sloganeering on a picture.

Doc Searls (8m 4s): So you communicated by memes — they were reaching out to you by memes. And were all of them disinformative in some way?

Dr. Chris Bronk (8m 14s): Well, that's the interesting thing. I had an op-ed war with Facebook for a little while over the 2018 election. Because when you watch a political advertisement — let's just focus on politics first — when you see a political advertisement on television, even if your television is being streamed to you by Hulu or somebody, there's usually some sort of tagline, where the candidate says "I approve this message," and there's a tagline at the bottom of the ad. And you see the same thing in a newspaper, or hear it on the radio. And Facebook basically said, we don't need to do that. We can't do that. We can't verify any of this.

Dr. Chris Bronk (8m 54s): We can't figure out if this is political speech or not, so we're not labeling any of it as political speech. But the idea was that some of these things probably were fairly incorrect — things about the U.S.-Mexico border, things about immigrants, things about political parties. They were not always what I would consider to be on-the-money correct. But in the world of negative political advertising, there are many shades of gray of information that I would deem not entirely true.

Katherine Druckman (9m 34s): Let's unpack this idea of disinformation as it relates to security, because obviously these things have significant impact. How do you view them from the perspective of a security expert?

Dr. Chris Bronk (9m 50s): Okay, so the simple answer on that is the Democratic National Committee. What some of my colleagues and I have been doing a lot of talking and thinking about for the last couple of years is the repurposing of information that's purloined. The New York Times and other major news publications were very willing to take email that was taken out of the DNC servers and put it in their news stories — you know, about Debbie Wasserman Schultz and others who said things that didn't necessarily make Bernie Sanders look very good.

Dr. Chris Bronk (10m 31s): So it was this idea that that stuff could be stolen — which is the cyber angle, and I know a lot about that; purloining information off someone's system is something I get really well — but then you repurpose this information in a rhetorical stream to convince people of something that may or may not be entirely truthful, but fits your narrative, to make them believe a larger message.

Katherine Druckman (10m 58s): Do you feel like that sort of meme-level disinformation is as harmful as other attack vectors?

Dr. Chris Bronk (11m 7s): I see it as — memes are sort of the billboards of social media. You see them a lot.
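To make the format concrete: the meme Bronk describes — a small image with a short, high-contrast text overlay — takes only a few lines to produce. Here is a minimal sketch in Python using the Pillow imaging library; the file names are hypothetical, and the 180-by-180 size just echoes his description.

```python
from PIL import Image, ImageDraw, ImageFont

# The "little square" Bronk mentions; background.jpg is a placeholder name.
img = Image.open("background.jpg").resize((180, 180))
draw = ImageDraw.Draw(img)

try:
    # Classic bold meme look; the font file name is an assumption about
    # what is installed on your system.
    font = ImageFont.truetype("DejaVuSans-Bold.ttf", 20)
except OSError:
    font = ImageFont.load_default()

caption = "SIMPLE SLOGAN HERE"  # the message has to land in a glance
# White fill with a black outline keeps the text legible over any image.
draw.text((10, 148), caption, font=font, fill="white",
          stroke_width=2, stroke_fill="black")
img.save("meme.png")
```

The point of the sketch is how low the production bar is: anyone who can run a script can mass-produce variations on a slogan and an image.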
They float by, just like the real ones do on the freeway. What impressed me was that the political organizations that really used social media in 2016 — and I'm really pointing the finger at Brad Parscale and the Trump campaign — knew how to unpack a message and repurpose it many different ways, in many different ads that would attract different people's attention. And then they combined that. So where it isn't like a highway billboard is that they could target individuals based on their demographics.

Dr. Chris Bronk (11m 55s): I don't know if any of you have ever bought a Facebook campaign. I have; it's very instructive. I mean, I used to work in the diplomacy business and I've worked a lot with the intelligence community, and I was like, wow, this is awfully granular. I sure can get to people from company X who live in city Y really easily. I just pay money and I get their attention. Interesting. So it was the combination of simple message and highly precise targeting that really got my attention. I found it very interesting.

Katherine Druckman (12m 29s): Is social media like a security hole by design, then?

Dr. Chris Bronk (12m 33s): Well, we are talking about the concept of social cybersecurity now, which is when people insert everything from links to obvious malware downloads and stuff like that. That was a big problem on social media a few years ago; they've gotten better at it, but it's still a problem. So the click-the-link problem for your email just moves over to your browser. Fortunately, the browsers are getting better. But social media is basically used as this vehicle for presenting information without vouching for it. I mean, first off, I don't want to get into a Section 230 debate, because I don't want to go there.

Dr. Chris Bronk (13m 14s): I'm not a lawyer, and I don't want to play one on this program. But essentially, the real issue for me is that these platforms don't have to make any assertion. The Washington Post cannot run things, as either news content or advertising content, that are flagrantly untrue — or else they're labeled as advertising, and so forth. With the platforms it's this squishy "we're not responsible for it, but we are a dissemination channel." So there's a gray area there. And then the other thing we noticed going on was, NATO will have an exercise in Norway, and the Russians would do everything possible to make it look like the participants in the NATO exercise — some of the countries — were doing bad things.

Dr. Chris Bronk (14m 5s): The Russians did a bunch of stuff with the Baltic republics — Estonia, Latvia, and Lithuania — where they would try to get these stories out there that, say, a German NATO soldier went into town in Lithuania and raped a local teenager or something, and that Lithuanians should be outraged by the behavior of the soldier and kick out NATO — which, for Lithuania to do, would be absolutely nuts. So it's this idea of inserting... So I try to have a good operating definition of information, misinformation, and disinformation. Information is somewhere between data and knowledge; it's just a block of stuff.
Dr. Chris Bronk (14m 48s): Disinformation — or rather, misinformation — is "I heard it wrong," generally; something I attribute to confusion, not malice. I see a lot of misinformation on, oh, where are those COVID vaccines right now? You know, "they're at the football stadium." Really? No, they're not. Nope, that's not true. And then disinformation is when someone is actively injecting an amount of false information or narrative into discourse — on a thing that's just absolutely untrue, and they know it, and they're trying to convince you the untrue item is true for an amount of time.

Petros Koutoupis (15m 26s): So, taking a step back and going back to social media: it's kind of hard to get past the meme BS, right? Yeah. I open up my timeline — people that I like, people that I don't like — and you just see it, and you know that 99.9% of it is just misinformation. But what I've noticed recently is that Facebook is starting to flag things. What is your take on how that's being handled?

Petros Koutoupis (16m 8s): Is this a good thing? Is this a bad thing? Where do you see this going from here?

Dr. Chris Bronk (16m 15s): Well, I mean, the immediate retort of a company like Facebook — you know, being here in Texas, when you say the words "the death penalty," it runs like, oh yeah, that time Southern Methodist got banned from playing football. And our contemporary death penalty for misbehavior on social media is to be banned, to be barred. So President Trump is now barred, I guess permanently, from Facebook and Twitter, and he has to send out an email saying what he wants to say, which muzzles him in a way that is pretty fascinating. But essentially, I think these companies are realizing that if they just allow a free-for-all — untruth of all stripes basically operating on them, without being a responsible intervener — their business model can collapse.

Dr. Chris Bronk (17m 17s): I advise all of you to read their annual report — because Facebook is a publicly held corporation, they enumerate all the risks to Facebook going under every year, and it is just a litany of things that could go wrong for them that would make them the next Iomega Zip drive or, you know, Prodigy Online. They are very frightened of those things. I mean, Facebook is a 2.1-billion-account endeavor now, so it kind of seems like, oh, it's never going away, and they add all these new platforms, like Instagram — heaven forbid. And speaking of Instagram, that is a platform of memes; I mean, that's what it was built for. So I think there is a self-regulation afoot, with the idea that it should be done to avoid actual, real regulation by governments.

Doc Searls (18m 19s): So I have a theory — and this speaks to the original purpose for convening this, which was around cybersecurity. But what I'm very interested in is this misinformation, disinformation as a technique — for Russia, for any country; I wouldn't be surprised if the U.S. does its own disinformation as well — and how easily people can be made to believe this stuff.
But here's the question I have, since you've been observing Facebook. I have a theory that they can't be fixed — that they actually can't be fixed.

Doc Searls (18m 59s): If your business is advertising to people who are algorithmically nudged toward engagement, you're going to want to reinforce engagement no matter how it happens. And if you also have it set up, from the get-go, such that you have so many advertisers — and they're so hard to monitor systematically and so easy to game systematically — you just don't do it. I thought it was a really critical point you made earlier that if you're doing a political ad on TV, you have to say, you know, "my name is Joe Schmoe and I approve this message," and all that stuff.

Doc Searls (19m 41s): Whereas online, you don't have to. On top of that, you may have whole countries coming in and faking up news in order to help you out, because they don't want the other guy to win, or something like that. It's so impossible to monitor that it seems to me there's something so inherently narrow about what they do that it may collapse of its own weird weight — or regulators come in and just say, screw you. I mean, Tim Wu's in there now. I know Tim Wu; he probably wants to break them up. I don't know what you'd break them into. What's a piece of Facebook? You can break away Instagram, but that doesn't change...

Doc Searls (20m 24s): ...what's wrong with Facebook in the first place.

Dr. Chris Bronk (20m 27s): Yeah. I mean, a book that I assigned last year to my cyber great books class — which is a very lengthy book — is Shoshana Zuboff's surveillance capitalism book.

Doc Searls (20m 40s): I have a blurb on the back of it.

Dr. Chris Bronk (20m 45s): That's—

Doc Searls (20m 45s): Right, I didn't even need to read it. Is it—

Dr. Chris Bronk (20m 48s): It's a fantastic book. And she makes the point a lot of different ways, but we know that advertising model — attention being the valued metric, where you want people to pay attention to a thing for a period of time and hopefully monetize that attention in some way — is interesting. And there's a lot of research on this. There's the MIT guys who did the paper on Twitter, showing that false information spreads across Twitter more quickly than true information because it's sensational. It's like the rumor that J.Lo is dead or something. She's not dead.

Dr. Chris Bronk (21m 28s): She's fine. Although I guess she's breaking up with A-Rod, so that's over. And I think that's valid — and I'd tie it to misinformation. There's a great example: during the invasion of Iraq, I guess a rumor spread around our troops that J.Lo had died. And I always thought it was kind of amusing that a bunch of people out in the desert were worrying about that, and not the war with Iraq.
So I think you make a very solid point that this doesn't seem sustainable in some way — just having platforms that want us to spend all our time on them. And the other thing — I read a piece of research recently that said, well, Facebook wants people to argue, because people who are arguing are engaged. And it's no different than being on the street in Cairo or any of these Arab cities, where political argument happens in cafés — or it could be Western European cities too.

Dr. Chris Bronk (22m 35s): But the Arab street was always the place you went. If you're a diplomat and you need to know what's going on in Beirut, you go to some cafés, and you know pretty fast, because people argue quite noisily and emphatically. This is our street; this is where people argue. And if you're saying that you're going to monetize something, and that something is people arguing, I'm not sure that's good.

Doc Searls (23m 1s): The thing is, they've created a machine that's really designed to attract disinformation, and to reward it, and to make it part of the system. I don't think that's what they wanted to do in the first place; it's what algorithmically works out, because it becomes our street. I mean, it might as well just be called Streetbook, right? Because, at least in the Arab-street sense, people go there to have political arguments — but then they get herded, you know, by homophily; you tend to group with your own kind, into a giant amen corner of one kind or another. I guess the question I have — probably not an answerable one until it actually happens — is whether or not that's a fatal flaw, or is it just a really great design?

Doc Searls (23m 47s): And it just amplifies the tendency of humans to gossip, and that's fine — that's what we do.

Dr. Chris Bronk (23m 54s): I would suggest this. One of the sets of papers I worked on about a decade ago — we had a grad student in the cybersecurity lab at Rice, when I was there, who was very interested (I think his spouse was a Falun Gong sympathizer) in how Falun Gong search terms were censored by China. And this was when there was still a google.cn that operated in China. So that gives us a pretty strong piece of evidence of what happens when you have that censorship capability. You know, China doesn't allow Facebook, but there's that joke from Silicon Valley: there's a Chinese everything — there's your Chinese Facebook, there's Chinese Twitter. So there are all these platforms that are parallel to what we have here, but the Chinese ban ours and then create their own. I guess they never created a PRC Wikipedia, which I think is their loss, but that's another story.

Dr. Chris Bronk (24m 51s): But without that highly intrusive, straitjacketing force of a totalitarian government, I think that becomes very difficult to moderate. And then the other side of it is — I think Chinese people like to have political debate as much as any other people on the planet. And I work with lots of Chinese Americans and Chinese nationals who've come here to work, and lots of Chinese students.
And they like to argue too. But, you know, the question I would throw back — which is also unanswerable — is about our culture of political argument and the culture wars that we see. I'm in Texas, where we had this weather disaster a couple of weeks back, and our governor immediately went to, oh, everyone take off your masks.

Dr. Chris Bronk (25m 40s): That's all better. And I think that's — techies would call that a pivot, but I would call it a sideshow or a smokescreen, something to distract us. And when it was three channels of news plus the PBS guys, when I was a kid, I don't think that kind of discourse existed in the same way in this country that it does now. It still drives me nuts that the news stations are always like, check us out on Facebook. And it's like, wait — you used to own this space, and now you're just giving it to those guys. I think there are many problems, but essentially it all boils down to: what is the definition of yelling fire in a crowded theater in social media space?

Doc Searls (26m 26s): Maybe it's nothing but yelling fire — it's just different colored fire.

Katherine Druckman (26m 44s): So the other thing I wanted to make sure we got into before we have to wrap — unless, actually, Doc, did you have anything else you wanted to talk about in terms of disinformation and that sort of thing?

Doc Searls (26m 47s): No, I think we touched on a lot of it. A lot of it is just unanswerable; I think we just have to wait and see how it works out. Personally speaking, I'm just not a big believer in the regulatory answer, in part because I think most regulators don't understand tech very well, or economics. But worse, every new law protects yesterday from last Thursday and lasts for hundreds of years, you know? Section 230 — it's an interesting law, and we're still living with it. We're still living with the DMCA. If you want to know why we don't have music podcasts...

Doc Searls (27m 28s): ...well, look at copyright law as it was laid out in the nineties in the DMCA. And we don't even realize that's what was behind it. So I don't know; I don't have an easy answer. It is fascinating to me, though. Actually, what I could ask you, Chris, going back to your diplomatic thing: do you buy — I assume you do, but I don't really know — that the Russians were very active in the last election, and were actually expert at playing Facebook and playing social media?

Dr. Chris Bronk (28m 5s): Yes. So, a million years ago I got to do a study abroad at Oxford University, and it was right at the very tail end of that thing, the Soviet Union. I was one of those people who had read too many Tom Clancy books as a teenager, and John le Carré books, and other things like that. And I wanted to stick it to the Sovs, you know, and fight against what I believed was a very wrong system. And what I learned in studying the Soviets was that they were exceptionally good at this. Thomas Rid over at Georgetown has written a great book called Active Measures.
Dr. Chris Bronk (28m 48s): It's all about how the Soviets screwed with West Germany from the 1950s to the eighties, playing with their political system, which they were trying to figure out. And I think the Russians have always been extraordinarily good at disseminating propaganda — propaganda of the Communist International, from the very beginning of the Soviet regime. And then there's a second element. If you start digging into how the Russians do statecraft, or war, there's a concept of theirs called maskirovka, which is the idea that you're trying to in some way alter what is known — by measures you put out there — so that you confuse your adversary. And that can be stealing information from the Democratic National Committee and then putting it out on the WikiLeaks website. Which — oh, by the way — after U.S. banks blocked credit card transactions for WikiLeaks donations, there was a pretty good paper trail of Russian oligarch interests passing money to Assange and company to keep them operating.

Dr. Chris Bronk (30m 9s): And I don't want to get into slings and arrows on whether Assange is a good guy or a bad guy — I have a strong opinion on it, but it doesn't really matter. But these platforms — I call them radical transparency platforms — and as a diplomat, I don't believe in radical transparency, because making policy is not always a pretty thing, and not everyone probably wants to know. The real problem is that you can take pieces of that policy creation process and manipulate them into something you use to serve your agenda. I mean, the irony is — what was it — Podesta's recipe for risotto got out because his Gmail account got hacked.

Dr. Chris Bronk (30m 57s): And that's just funny. But the idea that the guy who was in the driver's seat for the Democratic — you know, the Clinton — campaign, his Gmail gets nailed and that's weaponized against the campaign? That's a big deal. And that's where the cyber intersects with the information, disinformation piece.

Katherine Druckman (31m 16s): Interesting. Yeah, okay. So the other thing I mentioned and wanted to make sure we made time for is something we talked about offline, and that is acoustic sensing and security, and certain vulnerabilities we may have in our IoT devices — things that we have around the house, our personal mobile devices, et cetera. I think it's an interesting vulnerability that people don't immediately think about.

Dr. Chris Bronk (31m 51s): Yeah. So, putting on my security hat: any one authentication factor or vehicle is bad. And like anyone else, when the version of the iPhone came out that had the facial recognition authentication, my first response was, oh, it's not going to work right — nothing like that ever works right. I worked in speech recognition in the nineties, and it would work, but it took a lot of training of computer models with lots of different audio types.
And it's like, oh, have the Japanese guy say it again so we get that more right. So to authenticate solely on voice is interesting.

Dr. Chris Bronk (32m 37s): And we just had a wonderful researcher who came through and gave a talk at my school last week about the acoustical side of this. Something coming out of your mouth that is heard by a microphone on one of your devices has different qualities than playing the exact same utterance through a loudspeaker, because a loudspeaker operates on a single axis, whereas your voice is a complicated biological machine that has all these different parameters to it. Your ear is also complicated, in a way that a speaker diaphragm is not. So the crux of security there is: how do I defeat someone replicating my voice to break into my thing? Then there's the broader issue.

Dr. Chris Bronk (33m 23s): And I think you mentioned it: the problem of these always-on devices, and the privacy ramifications, and the discoverability issues. We've already seen court cases where somebody gets murdered at home, and it's like, wait, this person has all of these beacon devices — things from manufacturers X, Y, and Z — in their house. Do they have the goods on who the murderer is? And do we want to know that? So when we start looking at the acoustical elements of security, it's a multiphase problem that covers a lot of terrain as well.

Katherine Druckman (34m 0s): So how concerned should — okay, backing up a little bit: we talk about these types of devices all the time. We talk about Ring devices — we've talked about the Amazon flying Ring device and poked a little fun at it. We've talked about wearables, and we have talked about the Alexa Echo devices, which — well, Petros may have one; Doc and I don't have them, because they're creepy.

Petros Koutoupis (34m 30s): I refuse to have those devices in my house. The only thing that's always listening is just my smartphone. That's it.

Dr. Chris Bronk (34m 38s): Yeah. Actually, I shut off my smartphone's acoustical trigger, because it just creeps me out.

Petros Koutoupis (34m 48s): And the iPhone is actually pretty good. It's Facebook — the Facebook app — that is creeping me out. But everything else on the iPhone is actually pretty decent in terms of spying.

Katherine Druckman (35m 0s): But in your opinion, how concerned should people be about having these types of devices? What steps should they be taking to make sure they're not giving up a little bit more than they realize?

Dr. Chris Bronk (35m 11s): In general, I don't believe people understand well when they are trading away their privacy. And it's interesting, because — I don't want to plug for any company, but Apple has definitely made privacy a selling point. When the San Bernardino shootings occurred, which were labeled as a terror event — I'll accept the FBI's categorization there; there was ample evidence of that — Apple said, we're not going to break the module that's going to give you the passphrase on it. And then the FBI essentially said, well, we know some Israeli dudes and they're really clever, so buzz off, have a nice day. Because there's always some clever dudes someplace who can figure something out.
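A crude way to see the earlier loudspeaker-versus-live-voice point in code: a phone- or smart-speaker-sized driver rolls off the very low frequencies a vocal tract produces, so one toy anti-replay heuristic compares low-band energy against total energy. This is a sketch of the idea in Python with NumPy — the band edge and threshold are invented for illustration and are nothing like a production anti-spoofing system.

```python
import numpy as np

def low_band_ratio(samples: np.ndarray, sample_rate: int,
                   band_hz: float = 150.0) -> float:
    """Fraction of signal power below band_hz (band edge is illustrative)."""
    power = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return float(power[freqs < band_hz].sum() / (power.sum() + 1e-12))

def looks_replayed(samples: np.ndarray, sample_rate: int) -> bool:
    # Hypothetical cutoff: close-mic live speech tends to carry more
    # very-low-frequency energy than the same utterance replayed through
    # a small loudspeaker, which physically can't reproduce it.
    return low_band_ratio(samples, sample_rate) < 0.02
```

Real voice anti-spoofing systems combine many such cues (and learned models); the single ratio here just makes the physical asymmetry Bronk describes tangible.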
Dr. Chris Bronk (35m 52s): But the issue for me is — and my model for everything is always, well, how does it work in Star Trek: The Next Generation? How did they do that? And I remember a colleague of mine had Amazon program the Picard tea prompt, with a response for the replicator, into the device in their version 1.0. And I was like, oh, that's very clever. I still don't like it, but that's clever — it appeals to me because I watched all of those shows in the 1980s or nineties, or whenever they came on. But I think we're always looking for a way to break ourselves of the user interface for the computer that we've had since the 1970s: a keyboard and some sort of pointing device. And I have colleagues who write paper after paper using dictation software, and I just can't do it.

Dr. Chris Bronk (36m 49s): It's always like, oh, why did it put a period there? It still screws up. But when it's perfect? Well, I'll want to use it. Maybe — you know, I've got tendonitis; I play bass. My tendonitis is awful. Typing, for me, is not forever, probably, nor driving. And so when we start to worry about these things and how they operate — once again, I put my cybersecurity hat back on, and it's like, you want all these devices to talk to each other in some way? Oh brother, good luck. Because while we have Apple marketing on privacy to a degree, I don't think any company can make that pitch to me unless it's a security company saying, security is a feature for us at our company.

Dr. Chris Bronk (37m 33s): And I just don't buy it. When a purveyor of something other than security products comes to me and says, oh, security is really important here, I'm like, okay, I'll take that with a grain of salt for now — but prove it to me and I'll feel better.

Katherine Druckman (37m 48s): Hmm, yeah. You know, it's funny you mention that typing won't be forever, and neither will driving. That makes me think there are so many great applications for these kinds of technology in terms of assistive use — there's an accessibility angle, for sure. But the cynic in me just wonders if we're responsible humans — whether we're responsible enough to have all of these great things exist without exploiting them in some way. You know, I took offense at that comment, because I live in the command line. I will always be typing.

Dr. Chris Bronk (38m 28s): You know, except when you can prompt the command line — when it has assistive logic for when you get to be my age and you're like, what's the Linux command that deletes this but stores a copy? How can I get SSH to work some way to do this thing I want? I don't remember. I mean, I've been away from the command line — I get back to it from time to time, but my fluency is not as good. And if there's a cognitive enhancer for me that allows me to have that capability, I'm going to want it. But I think it really does come down to: as the computer does more — I think already the smartphone, as a cognitive enabling device, is changing the way our brains are wired.
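Incidentally, the "deletes this but stores a copy" command Bronk wishes for is easy enough to sketch — real tools like trash-cli already do it properly. A minimal, hypothetical Python version just moves files into a trash directory instead of unlinking them; the directory name and helper are invented for illustration.

```python
import shutil
import time
from pathlib import Path

TRASH = Path.home() / ".my_trash"  # hypothetical location, not a standard

def soft_delete(path: str) -> Path:
    """Move a file into the trash directory instead of destroying it."""
    TRASH.mkdir(exist_ok=True)
    src = Path(path)
    # Timestamp the name so deleting "notes.txt" twice doesn't collide.
    dest = TRASH / f"{src.name}.{int(time.time())}"
    shutil.move(str(src), str(dest))
    return dest

# soft_delete("notes.txt") stores the copy in ~/.my_trash for later recovery.
```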
Dr. Chris Bronk (39m 14s): I think there's some interesting research on that as well. I'm not sure I buy all of it, but the idea that that device in your pocket is trying, kind of intuitively, to approach you and understand what you want and deliver it to you — that is happening.

Katherine Druckman (39m 35s): Yeah. I think there are certain boundaries I feel like we all need to put in place. And — not to bring up something totally not safe for work — I think there's a guy somewhere who got his penis cage hacked, which is a good example of why we need to be a little bit more concerned about the devices we're relying on in our lives and what they're capable of.

Doc Searls (40m 1s): What — hacked his penis?

Katherine Druckman (40m 4s): Oh, you didn't hear about that? Yeah, there's some kind of sex toy that puts your stuff in a cage and locks it up, and somebody hacked it and literally held his junk for ransom.

Doc Searls (40m 14s): Oh my God, that's horrible. I didn't know that was possible — but of course it's possible. What news sites are you reading, Katherine?

Dr. Chris Bronk (40m 30s): So that basically means you have to change the default password on your junk.

Doc Searls (40m 38s): You need two-factor authentication for that occasion.

Dr. Chris Bronk (40m 42s): A text message, or that thing's not coming off. Got it. Oh, man.

Katherine Druckman (40m 46s): You know, I am not a security expert, but I am interested. At the last DEF CON I wandered with a friend into the medical hacking village, where they do a lot of great work that benefits a lot of people. But at the same time, as a novice, it's terrifying. You go in, and just the thought of it is terrifying. And it kind of changes the way you think about technical devices. Anyway, sorry.

Doc Searls (41m 24s): So here's a question for you, Chris, because this is sort of where I'm coming from, and it may not be a tenable position: privacy is basically personal, and we should have control over our privacy at our end. In fact, I wrote a piece in Linux Journal about this — that if your privacy is up to others alone, you don't have any. So it doesn't matter what every company's privacy policy is. If I have to deal with every company differently, and I have a different trust metric for all of them, and they can change it at any time they want, what I really need is something on my device that prevents them from doing whatever it is I don't want them to do.

Doc Searls (42m 9s): And ideally, that's what we want. Because our phones, for example — we feel like they're an extension of us, and, you know, Marshall McLuhan says all technology extends us. I've been drinking from a cup; I have the capacity to hold water because I've got a cup, right? I can't do that with my hands, but the cup extends me. Well, our phones extend us in all kinds of ways, but they're also covered with the suction cups of the tentacles of countless companies, most of which we don't even know are there.
And they're infected with all these things that are busy following us everywhere, and it's pretty creepy, and we should have full control over that. As Petros is saying, Apple gives us a few more controls, but there's still a bag of apps, and those apps could do all kinds of stuff that probably even Apple can't monitor that well. But if we had real prophylaxis at our end — real policy control at our end, where we couldn't even put in an app that's going to harm us, or we would have enough filtering of some kind, policies on our end, where if they don't comply with that in a way we can audit, they wouldn't even get in. But almost nobody's talking about this possibility, which is amazing to me, because these are personal devices and we are people, you know?

Dr. Chris Bronk (43m 23s): They're kind of niche products. So, obviously Apple's a walled garden: what Cupertino wants to integrate, it integrates and builds. And that may be option B. I'm going to have to take Samsung's Tizen operating system off the table, and whatever the Huawei operating system is, I'm going to junk that too, although it's advancing rapidly. So that means picking on Android. I worked with a PhD student years ago — I did an article for First Monday, which is one of my favorite internet culture journals — all about adware on smartphones. On an Android phone, AdMob and the other ad libraries are there like any other software library on a computing device, but their sole purpose is to serve up ads.

Dr. Chris Bronk (44m 14s): And we were doing a research review, cataloging — we built virtual phones, and the student did all this great work, and it was like, okay, these apps use these ad libraries, these apps use those, and these libraries don't serve up ads — so what are they doing? And that was the crux of the research: you have things masquerading as advertising vehicles that are obviously not. And that's not going to go away. And we see the seams of this — once again, Apple's privacy war with the rest of the Valley, where it's saying, we're going to go to war with Facebook because we think they're manipulating our platform for their business ends, and we don't like how they do it.

Dr. Chris Bronk (44m 58s): I don't see Apple throwing Facebook off the device anytime soon, but could it happen in the future? Maybe.

Doc Searls (45m 10s): The interesting thing with Apple is that all they did was announce that they're getting rid of the ID for advertisers, the IDFA, which is an appalling thing that never should have been in there in the first place. Had they been mindful of privacy back when they invented that thing, they probably never would have put it there. But Facebook is like, oh my God, they're going to kill us, because they're taking away this thing that allows us to target.

Katherine Druckman (45m 34s): Well, I don't know — we've covered a lot, and we should do it again. But I think for now we've hopefully given our listeners some interesting stuff to think about.

Dr. Chris Bronk (45m 44s): For all the tech we have in our society, we sure don't talk about tech and society together very much.
Katherine Druckman (45m 50s): Wow. Is that too long for a title? That was great — I think that was a great final note. Thanks for that.

Dr. Chris Bronk (46m 0s): Sure. Maybe my next NSF proposal — who knows.

Katherine Druckman (46m 3s): So thank you, Chris, for joining us, and thank you, Petros, as always, and thank you to Doc. For everyone listening, I hope we have inspired you and given you a few things to think about. We always love feedback. You can reach us all over social media — just look up Reality 2.0 Podcast anywhere you are. You can find us through our website, reality2cast.com, and you can reach us via email at info@reality2cast.com as well. We love feedback and suggestions and even show ideas, so please keep in touch.