Katherine (0s): Hey everyone, welcome back to Reality 2.0. I'm Katherine Druckman, Doc Searls is with me, and we're going to talk about facial recognition — specifically, a tool that helps you find out whether your Flickr photos have been used to train AI systems. Before we get into that, though, I wanted to remind everyone that we send a newsletter, and you can sign up for it at reality2cast.com — that's with the number two — where there's a link to subscribe. Katherine (43s): We send out a short email every week. We usually include a few links about what we're talking about, so you can dig a little deeper. We promise to keep them short and/or interesting — possibly even both. Doc (58s): And it occurred to me that somebody has to have a newsletter called a "Gnus Letter" — maybe a spy one, or something like that. Just a thought I had while you were talking: the equivalent of a typo, but for my ears. A gnus letter, you know? Katherine (1m 20s): So, this tool — you've tried it out. Well, let me give everyone a little bit of background first. We came across this in a CNN article, actually, linking to a tool called exposing.ai. You plug in your Flickr username and it tells you which systems have used your photos to train AI. And I don't know about you, Doc, but — well, I didn't have as many in my account as you did. Doc (1m 53s): I've actually seen this before, but I hadn't seen it recently, and I'm glad somebody surfaced it again. Just to be fully clear, it's CNN Business, and I guess we can put the link in the show notes or in the newsletter. Katherine (2m 5s): Yeah, the link will be in the notes on the episode. Doc (2m 8s): Yeah. And the headline was "This new tool can tell you if your online photos are helping train facial recognition systems." It's specifically for Flickr photos. Now, Flickr does a really good job of not only letting you annotate your photos but letting you caption things in them — you can actually pull a little rectangle around somebody's face and say, this is Katherine Druckman, or just Katherine, or whatever it might be. They also do a good job of letting the photographer choose a Creative Commons license. Doc (2m 49s): And so I have, like, 70-some thousand photos there, and of the ones that are openly licensed, there are like 60,000 or something like that. Since 2005 I have accumulated a lot of photos there. I really like Flickr; it's attractive to photographers for a whole lot of reasons that I think are good. But there's this thing where every good tool can be used for bad purposes — and if you think training AI is a bad purpose, then here we are. So I used exposing.ai — that's what it's called — and I recommend anybody who has a Flickr account use it, just to see what happens. Doc (3m 32s): You put in your handle there — mine is docsearls, all one word — and it came back with this: 855 unique photos from the Flickr account docsearls were used in at least two image datasets related to face recognition.
The photos were uploaded between 2005 and 2014. And I think probably since then I have done many, many fewer photos of people. Doc (4m 19s): Oh, and reading a little more, it says that 850 or 851 of them are in MegaFace, and that those photos, along with 3 million-plus others, were used for facial recognition by at least seven or eight projects in 42 countries. Then it goes into more about MegaFace. Now, some of my photos are being used by Clearview — Clearview is another, far more well-known company, I think — and I think it was a smaller number in that case, like 200-some of my photos. I don't like it, and it makes me think: what's the remedy here? Doc (4m 59s): Well, the first remedy is one I have already taken, which is that I just don't upload people's faces anymore, because I know there's a risk. Even if I have their permission, you don't know how the photo is going to be used. Now, if it's a public figure — two of the photos they show here are of Joi Ito and Elena Kagan. Joi, you know, was the head of the Media Lab at MIT, and he was in the news about a year and a half ago. I won't say more, because he's a friend, and I thought he was overexposed over something that was not fun. And Elena Kagan, with Jonathan Zittrain sitting next to her. Doc (5m 41s): That was from 2008. I shot a lot of pictures of Elena Kagan when she was the Dean of Harvard Law School, suspecting that she would in fact be nominated to the Supreme Court — and in fact she was. An interesting thing happened then. I wanted to get good shots of her, and there was one especially flattering shot in particular, where she was speaking at an event, and that one was used a lot when she was in fact nominated. But it was used mostly by right-wing news organizations trying to pull her pants down, or however you might put it — on judges that's probably a bad metaphor — but to expose her sexual orientation, which she had talked about. Doc (6m 38s): Which, frankly, should not have been a factor, but they were trying to say that she lied. And the interesting thing is that they were going on all about her lying, and not a damn one of them acknowledged that I had shot the photo, which is what the license required. I had a Creative Commons license that requires attribution. Now, I don't care that much about getting the attribution; it's sort of my way of seeing whether anybody's using the photos, just seeing where they show up, which is kind of interesting to me. Katherine (7m 14s): I just looked up that photo, by the way. It's a great photo — I love the orange jacket. It's really fun, and it's on her Wikipedia article. Doc (7m 24s): Yup, it is. And it's great. Katherine (7m 27s): And I should say, you know, this is one of the — Doc (7m 29s): The thing is, it's kind of disappointing to me. As far as I know there's no way to tell for sure, but — and if this is not a true statement, it's close enough for me to yack about a little bit —
it's possible that I have more photos on Wikipedia than anybody, and I've put none of them there. I shoot lots of pictures out of airplanes — of geological formations and small towns and national parks, of coastlines and radio and TV transmitters, all sorts of stuff. I put them on Flickr and I annotate them generously, and they show up in Wikipedia because people want to illustrate things. Doc (8m 10s): All anyone needs to do is put them in Wikimedia Commons. Wikimedia Commons gives me the credit, and because Wikimedia Commons and Wikipedia are related, if you click on the image you'll find the attribution at Wikimedia Commons. So Wikimedia Commons acts as kind of a buffer, a holding place, for photos that might show up in Wikipedia. There's one I took of the very distinctive exterior of the airport in Denver — Denver International Airport, DEN is the shorthand — and of course it looks like a mountain range; it's the largest tent in the world. It's interesting, and I've shot it a lot. Doc (8m 53s): And there's one photo there that is in like 20 different Wikipedias, because a lot of them are just the same article in different languages. There's a lot of that, and I would like to do more of it. I would like to put up more pictures of people, but I don't want them training AIs. I'd like to see, maybe, if it could be done, an extension of the license — a Creative Commons license that says this is not for training AIs — but the thing is, there's nothing to stop that. It's kind of like Do Not Track was in the first place: it was a polite request. And frankly, all those licenses are polite requests. You can violate them all over the place, and few of them ever get litigated. Katherine (9m 35s): Unless you're in Canada. That was something else we brought up: Canada declared Clearview AI's activity effectively illegal — and declared it illegal, I believe, under existing law. I think any legal gray areas are going to have to be worked out; this is going to be an ongoing thing. I'm glad that Canada did in fact make such a declaration — an interpretation of their law, or a clarification if you will — and I hope that we're not too far behind, but I'm also not holding my breath. But getting back to exposing.ai — Katherine (10m 20s): I don't know about you, but I had a definite visceral feeling when I plugged my username into this app. Of course I don't have anything like the number of photos that you do, and I don't have an Elena Kagan in mine either, but I do have some interesting people. It found 28 photos, and they were used by a bunch of different entities — MegaFace is one dataset, DiveFace is another; it lists different ones. And my reaction was that it felt so icky. Not to be gross, but — Katherine (11m 5s): — I kind of felt like I was being notified that I had a communicable disease and now I need to go back and notify everyone I've been in contact with. That's kind of how it feels.
I feel like I now have to go back and tell all of these people in these photos, hey, your photo has been used to train, you know, whatever project it is — MegaFace, let's say. I feel like I owe somebody at least a basic notification. It's a very strange feeling. It definitely feels violating, but it's hard to describe how; that's the best analogy I can come up with. Doc (11m 50s): Well, it's a little bit — and I'm going to be profane about this — like that apocryphal, probably never-written song, the title of which was "You Were Only F***ing While I Was Making Love." It's a little of that. I mean, I was doing something nice for friends — here was this birthday party, there was this wedding. More commonly for me, especially with the Berkman Klein Center, I shot a huge percentage of their photos. I think they don't have a pro account anymore, meaning Flickr has probably gotten rid of a lot of their older photos. Doc (12m 33s): But if you do a search for Berkman Center — it used to be called just the Berkman Center — it says there are 246 photos there, and of the ones that got used, there were 40 or more photos. There's one of Charlie Nesson, there was one with some other people I know — not as many of mine as I thought, but they're in there. It's sad, and it annoys me. But here's the thing — one of many things. I think a number of people hearing this would say, well, what about China? Doc (13m 21s): However many billion people are in China, the government has every one of those faces, and the government is following every one of those faces. If the government wants to look up where one of those faces is right now, it can tell you exactly, within a second — supposedly, anyway; that's the report I read. And they have that whole social grading system, where if the surveillance camera sees you jaywalking, it goes against your record, and all that kind of stuff. It is one big Black Mirror episode. And there's this other line, which Scott McNealy — and I forget who else — said: forget privacy, you just don't have it. Forget it. Doc (14m 1s): And that was decades ago. Because we're on the internet now, there is no more privacy; that's kind of the new default, and we haven't worked out how privacy works here yet. My belief is that the only way we work that out is from our side. We have to have ways to say what's okay and what's not okay, and we haven't got those. As long as we're doing it only from the policy side, we're not going to get it. And we're not going to get it just because it suddenly becomes illegal to do this.
I mean, in the state of New Jersey, for example, it's illegal to use Clearview AI, but there's really nothing to stop a cop or somebody from doing it anyway. There is this particular thing — Doc (14m 47s): exposing.ai says they search something like six different databases, the six they were able to get into. Let me look — okay, here they are: VGG Face, PIPA, MegaFace, IJB-C, FaceScrub, and DiveFace. Those are the ones they can search, and that doesn't include Clearview AI. Right now I just typed in the name of another one of my Flickr accounts, called Infrastructure, and it's all about things — there are no people in it. I just wondered whether it had people in it, because if it did, they might have seen them, but nothing came up. But I also don't know whether it's because I licensed those photos to allow use with attribution, and they're using a loophole in Creative Commons in order to take those, and not taking the ones where it's all rights reserved. Or are they? I don't know. Doc (15m 40s): I mean, that's an interesting question, and I know — Katherine (15m 43s): Well, the photos I'm looking at are Creative Commons, but they're non-commercial, and I would argue that these datasets are definitely for commercial use. So they are not respecting the license in that regard. Doc (15m 56s): Yeah. And here's an interesting thing: I was advised years ago by somebody — it may have been at the Berkman Center, but it might have been somewhere else — that the license I should be using is simply attribution-only, not non-commercial, because the photos were more likely to show up in Wikipedia if they were attribution-only. So I haven't done the non-commercial thing, and I actually don't mind if some of them are used for commercial purposes. Photos I took of ice on the windows of an apartment I had in Arlington, Massachusetts ended up being the wallpaper for the 2010 Vancouver Winter Olympics. NBC used them; they contacted me and said, hey, listen, you've got these CC-licensed photos of ice crystals and we'd like to use them, but we can't attribute them every time they appear on screen. Doc (16m 58s): Is it okay with you if we run them in the credits? And I said, sure, go ahead. I didn't realize they would be the wallpaper — they were in every shot, behind every score, in everything. And again, I didn't mind; it was kind of fun, what the hell. A friend of mine said, well, didn't you at least get tickets out of that? So I said to the guy who wrote to me, hey, can I have tickets? And he said, oh yeah, come on out — but the Olympics were already underway at that point, and I was busy and couldn't go, which is too bad. I probably could have gotten something out of it. But you know, there are manners involved in this. Doc (17m 38s): What should the manners be? We gesture to each other all the time in the physical world about what's okay and what's not okay. We even have it with masks, right? I go for walks here in my neighborhood, and it's a somewhat sparsely populated neighborhood — in the sense that there aren't a lot of people out.
The houses are not that far apart; there just aren't that many people on the road. But I do go for walks, and I keep a mask in my pocket. If people are coming along I put the mask on my face, but not always, because we're 40 or 50 feet apart, and people generally nod to each other and wave and that sort of thing. Doc (18m 18s): In fact, I think there's a new ritual: people make more exaggerated gestures because you really can't read each other's faces. We're wearing masks like bank robbers would have used years ago, or like in the old westerns — the robber always has everything covered but the eyes. That's kind of what we're doing, and there are new norms and mores that come with it. Every once in a while, before I get my mask out, I'll get a scowl from somebody who has their mask on, and I feel the rebuke. It's sort of interesting. But these are little gestures that we have in the physical world; we don't have them in the online world at all. So how do we get those? Doc (18m 59s): And I think this has to be done on the individual side; there have to be ways of doing that. As long as all the agency sits on the side of systems we have to opt into and opt out of at all times, it's a completely broken way of operating. And I don't have an answer for that. At Customer Commons — customercommons.org — we have a privacy manifesto; in fact it ran in Linux Journal first. It's been updated a little since then, but basically it says this is all personal: we have to be able to cover ourselves, to have the equivalent of clothing and shelter and ways of signaling. Doc (19m 40s): But this is just abusive. Katherine (19m 43s): Yeah. And consider going back in time to the moment you took these photos. I'm looking at this, and all of the photos it's uncovering for me were taken in 2008. A lot of them were at events — some were at South by Southwest, that kind of thing. I would never have considered, at that time, that this was something I needed to worry about. Again, we posted these things in good faith, from a place of positivity, let's say. And all of these years later it feels so icky. Katherine (20m 24s): And that's the part I really struggle with: how could we have prevented this? I suppose we could have locked down our accounts, we could have deleted everything, but this was the very beginning of our awareness of facial recognition and image scraping. I don't know if we read the description earlier in this conversation, but exposing.ai has a little description under each dataset it uncovers, and MegaFace in particular is really sketchy.
I mean, it says — this is according to reports from the New York Times — that the MegaFace dataset has been downloaded thousands of times by companies and government agencies around the world, including US defense contractor Northrop Grumman; In-Q-Tel, the investment arm of the CIA; ByteDance, the parent company of the Chinese social media app TikTok; and a Chinese surveillance company — you know, I don't know what that is. Katherine (21m 29s): It also found download requests from the Danish National Police, a defence science and technology group, Edinburgh, Europol — the European law enforcement agency — Facebook, Google, Hikvision, Huawei, IARPA, Microsoft, the Military Technical Academy of Bucharest, a Russian law enforcement and security agency contractor called Stilsoft, SenseNets, SenseTime, Tencent, the Turkish police, and — who else — oh, the CEO of Clearview AI, among thousands more. It just keeps going. I feel like all of those people have just broken into my house; that's literally what it feels like. Katherine (22m 10s): It feels like there should be a way to prevent it, but once it's out, you can't get it back. I don't post photos of people's faces on the internet anymore, but we're in the minority. Most people do — they post their children, they post other people's children. How many images have you seen recently of people posting screenshots of their kids' online learning, right? Doc (22m 40s): Yeah, I know. I mean, my grandkids were online; that has all been completely scraped off at this point by their dad — how many years ago now? They're 13 and 10 now, but when they were younger we all felt a little more okay about it. Then the family made the decision: wait a minute, leave it up to them to put themselves online eventually, if they wish. And there was a fantastic Onion parody back in 2011 about Facebook — that Facebook is a CIA program and Zuckerberg himself is a CIA agent called "the Overlord." Doc (23m 24s): He managed to create this thing that became every law enforcement agency's dream come true: here's a way to get everybody to reveal everything about themselves, so when it comes to law enforcement, it makes our job easy. And that's what's happened — that is exactly what's happened. With most law enforcement, you're trying to solve a crime and you don't want to go to the trouble of getting a court order and the rest of it, and hey, look, here's this free thing, and I can find these people. This is happening right now, of course, with January 6th, the event when Congress was invaded, right? Doc (24m 6s): The FBI and other law enforcement bodies — there are a number of them involved in that — are not so much having a field day, but they have a monstrous number of photos to go through, and video to go through,
obtained from the people who were shooting this stuff live. And that's an interesting thing: the sheer number of people involved in that event who were shooting video and pictures, versus the number of reporters doing the same thing, was something like a thousand to one — meaning, wow, there's just so much more information there. But here's another place I want to go with this. Doc (24m 54s): And it's another one of the things I don't have an answer for. There was this rule in technology that says what can be done will be done, until we figure out what's wrong with it, and then we stop doing it, or try not to. We're way past that point with facial recognition right now. It's the genie that's out of the bottle, granting every request to everyone, and it's not going back in that bottle. It's just done. It's a little bit like Michael Crichton's Andromeda Strain, or ice-nine in Cat's Cradle — something that changes the world and isn't going to change back. Katherine (25m 32s): Sorry — it's kind of like COVID. Doc (25m 33s): Yeah, but at least we're getting a vaccination for COVID, and we'll keep up with it — flus mutate and the rest of it, and we sort of stay ahead of the flu with the annual flu shots and so forth. But where I'm going with this is that it may just be that one of the things that comes with living in a digital world, which we can never get rid of, is near-absolute surveillance by centralized systems — systems we don't even know about — on the one hand, and infinite wrong information, because it appeals to people's prejudices, completely out of control, on the other. Doc (26m 16s): I mean, it's one thing for the government to say to Facebook, okay, you're not going to share these pictures with anybody — you know what people look like, and you're not going to share that with anybody — and maybe they could obey that, but they can't change what they do. And what they do is amplify engagement. And what causes engagement? Well, misinformation is really great for that. So they have a giant misinformation system that works really, really well, and they can't stop it. It can't be stopped. So maybe we're in a world now where that's one of the things you're not going to be able to change. Katherine (26m 57s): An interesting point — did you read a piece in the New York Times by Kashmir Hill recently, about a woman who went on a massive revenge misinformation campaign that ruined this guy's life? It had something to do with a former employer and a family business. I'll link to the article in the description. Doc (27m 15s): It was horrifying. Katherine (27m 18s): Horrifying. It was all over the internet — you couldn't search this guy without finding just the most vile, vile stuff, and of course it was all false. Eventually it did catch up with her, but not before doing a tremendous amount of damage. And, you know, we were talking about vaccines and whatnot — where's our vaccine for this? Surely something must be done about that.
I don't know what the answer is; I'll have to do a hell of a lot more research, I suppose. But it is a disturbing thing that misinformation can catch on a lot quicker than truth. Doc (27m 58s): Yeah. I read that, and it's brilliant and troubling and horrifying, because it can happen to anybody. I've had a couple of occasions where somebody just decided I needed to be brought down in some way, and it was really disturbing. It was the kind of thing where I was finally able to talk to the person and become human to them. And it was nothing like this — it was more like, I'm just going to give this guy some shit. But it's especially bad when somebody is wrong about you. I know of one case that's actually not about a person; it's about a technology. Doc (28m 54s): The attack on it has some decent criticisms in it, but it proposes an alternative with no specifics. It's just that a decision was made by somebody: I'm going to give this tech some shit. And it's a different thing — the tech is fine, it can take it; it's not like a human being who is going to feel the feelings. But it's very, very easy to make somebody else's life miserable. And this kind of thing with facial recognition — I think part of the problem for me is that it's not human, and yet it wants to capture what is human. Doc (29m 42s): What's not human about it is that we, as human beings, don't remember specifics about faces. We have a gestalt understanding, a general understanding. It's not specific; it's tacit — that's what they call it. We recognize a face, but as for describing it — thick eyebrows or thin eyebrows, how far their mouth is from their nose, where they have dimples, whether they have a cleft chin, how long their ears are — we don't remember that stuff. It's very general, in much the same way that we don't remember how we started the sentences we're ending, or know how we're going to end the sentences we're beginning, most of the time — and yet we somehow convey meaning to each other. Doc (30m 27s): And that's a grace of being a human being. Computers want to be specific and explicit at all times. So by having computers at our disposal — as law enforcement, for example, or marketing — we can say, oh, I can follow this person all over the internet, I can learn everything about them. And we can do bad things, like what this person did in the Kashmir Hill case, just getting vengeance on a guy. And wow — I don't know how to fix that. It's not like we can get rid of computers. Katherine (31m 4s): Well, so that's another thing, as far as how to fix it: some people have a skill set that at least helps mitigate the damage. It may not fix it completely, but we can take a little more control of our digital identity. If you own your name — you own your domain name — you can do it.
That's a step. You can populate the internet with content relevant to you that you approve of, if you have those skills. And also, if you have the skills to do so, you can set up your own image gallery, password protected, and share it only with your family, so you never post pictures of your family on Facebook or whatever. But that's not doable for a lot of people. It's a hell of a lot easier to just give up and say, well, it's impossible — Katherine (31m 50s): — it's impossible to have any privacy. But that can't be the answer. I see a little bit of a parallel, actually, going back to COVID and vaccines. There's been a lot of discussion recently about the fact that people who don't have access to the internet are incredibly disadvantaged, because most of the vaccine rollout has been via online forms and online registries — you get a text message to your phone and you go fill out a form really quickly. Who is at an advantage there? Obviously people who are well connected, connected at all times, and able to drop everything to go fill out a form. That's not everybody. So it's hard to say how you solve this problem. Katherine (32m 36s): I like to say that nowadays everybody has to be a cybersecurity expert. You have to have a level of expertise just to not get totally screwed; somewhere, something is going to get hacked. And even people who are experts still get screwed. But at the same time, it's just not a reasonable goal for everybody — you can't expect every single person on the planet to have the skill set to protect themselves. So what do the rest of us do about it? Doc (33m 7s): Yeah, and I don't know. Well, I do know this: it's impossible, actually impossible, for all of us to be security experts. And even if we are security experts, what can we do? In many cases the answer is nothing — or go live in an igloo somewhere with no connection to the outside. And I'm not exaggerating here; I think you'd need some kind of Faraday cage around yourself to prevent personal information from leaking out. Doc (33m 51s): But, as Kevin Kelly put it years ago, the internet is a copy machine. You send me a photo, you've actually just copied it. You send me an email, you're actually just copying it — we both have it now. So it multiplies; everything wants to multiply. And of course memeing things and making sure they multiply all over the place is, to some degree, a sport. So, man, I don't know. I just think it's going to take years to work this stuff out. And some of it's going to be done with policy — my inner libertarian chafes at that, because, I mean, most new laws protect yesterday from last Thursday, right?
Doc (34m 41s): And right now is last Thursday, and it's a future yesterday that's going to get protected from now — and then now changes, and we'll still have those laws for another however many years. It's just bizarre. And I don't know how far to trust lawmakers either, you know. So that's another thing. Katherine (35m 0s): Whether or not the solution is regulatory, I think the scale of the solution has to reflect the scale of the problem in some way. And aside from legislative channels, I don't know what that means. It could mean something I haven't thought of, and I would love it if people listening could weigh in, because I would love some additional talking points. It's funny — we bring up a lot of really frustrating problems on this podcast, and I think that's part of it: we're thinking them through, thinking out loud together about these really complex issues that we're all increasingly facing. And not to give too much away — Katherine (35m 53s): — but I actually wanted to give a little bit of a preview, a teaser if you will, about next week. I expect we're going to have a pretty interesting episode, assuming it all comes together and we're able to record what we want to. We have some interesting stories to tell about people with the aforementioned tech skill set to protect themselves still getting into some really uncomfortable security situations. We'll talk more about that next time, but it's definitely something to think about. And I'd love it if our listeners could give us some feedback and some additional talking points, so that when we explore this a little further next week — how to protect ourselves and whatnot — Katherine (36m 44s): — yeah, if we can get some additional, maybe listener, questions, that would be great. Doc (36m 48s): That sounds good. That's a great teaser. Katherine (36m 53s): I know, it's exciting: oh my gosh, something terrible happened, and we're going to talk about it, we're going to dissect it — hopefully, anyway. So it should be good. And if not next week, it might be the following week, but I think we can pull it together by next week. So now I feel so much pressure. That'll be good. Wonderful. So are we covered for this week? I think so — unless you wanted to get into more COVID stories, like we talked about before. Doc (37m 24s): I'll give you and the listeners a very brief one, which is — I'll just put it this way: I'm over 65 but under 75, and where I live in Santa Barbara they aren't vaccinating anybody who is under 75, or who isn't in the healthcare business or something like that. But we found that Los Angeles is, and they don't really care whether you live in Los Angeles, which we don't.
So we got on a site run by something called Carbon Health and drove down to Dodger Stadium, which I had never been to, because I was a Brooklyn Dodgers fan and never forgave them for leaving Brooklyn. Doc (38m 12s): And "vast" doesn't cover how big that parking lot is — you could fit whole college campuses in that parking lot. It's enormous. So there's room for a lot of cars to snake back and forth many times before finally getting in line for the COVID shots, which is not an obvious thing. It isn't like you drive up as if you're picking up a burger and they give you a shot and you keep going. They move a line of people into a length of parking, and then they go down that line giving shots — after those people give them a verifiable claim that they are who they are — and it's remarkably efficient, considering what it is. Doc (39m 1s): But the reason I wanted to bring it up is that not only did Carbon Health, I think, do a good job, but there's an organization called CORE — C-O-R-E, which is short for something longer — and all of their workers, who are mostly volunteers, were incredibly helpful and friendly. Even though it took us three hours to go through the line, they had porta-potties, there were people guiding you to the porta-potties while your car was parked, and if you had any questions they'd come right over and answer them, and they were really nice. CORE was started by Sean Penn, the actor, and that's his job now — he's not acting very much, but he runs this thing, and it's one heck of a good organization. Doc (39m 45s): So I wanted to put in a little plug for them, because they did a wonderful job. And we just got our first shot; we'll go back for the other one. It's the Moderna shot — I'm not sure it matters, except that you have to get the same kind the second time around. Anyway, that was a good story, so I wanted to put it out there. Katherine (40m 6s): Well, you know, I followed what you posted here and there, and it looked like an adventure, for sure. I'm glad they had porta-potties; that was good thinking. It seems like in Texas, which vaccine you get depends on the facility: the big hospital hubs have the Pfizer one, and generally everybody else has Moderna, because that's the one they can store at the correct temperature. Doc (40m 28s): That's right, yeah. They both have to be stored at super low temperatures, but how low makes some difference. Katherine (40m 35s): The Pfizer one is way, way lower. Doc (40m 38s): Right, yeah — like minus 90 Celsius or something like that, though I may be off. But anyway, here's another interesting thing: my wife is so private she doesn't want me to use her name — she's identifiable as the black rectangle on all the Zoom calls. Anyway, hers hurt like hell, and her arm still hurts two days later. And mine, I hardly felt it, and I still hardly feel it. So, who knows?
It just depends on the person, I guess. Katherine (41m 17s): I've heard it's pretty common to have some arm pain from the first one. The second one, I've heard, is the one that really gets you. Doc (41m 25s): I've heard that too, so I'm sort of dreading that a little bit. I've been around medical stuff long enough that my threshold of pain is pretty high. In fact, I was told that by Dr. Wentz, who was operating on me — I won't go into it, but it was something where he couldn't use anesthesia, and I didn't flinch. He asked, did you have dentistry as a kid with no Novocaine? And I said, yes, as a matter of fact — I never had Novocaine or anything like that until I was 17, and I had a lot of dental work. Doc (42m 7s): And he said, yeah, you were taught to tolerate pain. I had no idea that that happened when I was a kid, but apparently it can be taught. Which is kind of sad, actually. Katherine (42m 18s): So does that make you feel better about the second dose? I have an elderly relative who is actually in Israel, which is an interesting story, because they are so far ahead of everybody on vaccination. Of course they have a smaller population than we do, but they're already vaccinating twenty-somethings or something like that, and they expect to have everybody vaccinated by the end of March, maybe even earlier. Anyway, he got the vaccine very early on and said it was uneventful — neither dose bothered him at all, no big deal. Doc (42m 52s): That's awesome. Wow, that's interesting. Well, you know, this is one of the interesting things that bears mentioning: everybody is different by design. And that's part of facial recognition, and vocal recognition too — we know each other by our faces and voices, even if we can't describe exactly what that is. Computerizing that costs us something as human beings. We lose something with that, and there's no way around it, I don't think, because we're not going to put that genie back. But we have lost something. Doc (43m 34s): So there we are. Katherine (43m 36s): There we are. Be careful what you post. Doc (43m 40s): Yeah, exactly — much more careful than we ever used to be. Katherine (43m 44s): So thank you, everybody who has made it this far. Please reach out to us — you can find us in various places, on Twitter and Facebook and on Mastodon, and at reality2cast.com, with the number two. You can contact us through the newsletter, or there are lots of other ways to get in touch. So please feel free, especially if you have something you'd like to contribute to the conversation about where we go from here: how do we protect ourselves, how do we make the internet a better place, let's say. So yeah, reach out, and thank you so much for listening to us work through this, and we'll see you next time.