Katherine Druckman (0s): Hey everyone. Welcome back to Reality 2.0, I'm Katherine Druckman. I think you know who I am by now. Hopefully if you don't, it's somewhere on our website, which is reality2cast.com, and I hope you'll check that out. So joining me today is Doc Searls, as always, except for only one time, which was, you know, brilliant as always. And we have some of your, I hope, returning favorite guests. One is Kyle Rankin and the other is Shawn Powers.

Katherine Druckman (42s): If you don't know who they are, then I also encourage you to go to reality2cast.com and find out. But yeah, we have a pretty short agenda today, and we really just want to talk about this Signal Cellebrite thing, because it's too juicy not to. So if you haven't seen the Signal blog post that's going around, we're going to link to it in the description here. But the short story is that Moxie Marlinspike hacked a Cellebrite device, Cellebrite being the device used to hack into people's phones, which is used by law enforcement and oppressive regimes all around the world to violate people's privacy, among other things.

Katherine Druckman (1m 27s): Anyway, these devices are controversial. So with that, let's get into it. We just watched a really fun video that we will also link to, but it is in the Signal blog post. And I think for some of us, or at least for me, it kind of, I don't know, inspires my inner techno rebel to go, yeah, this is awesome. But there's a lot more to discuss here, and I think we should just get into that. But before we do, let me remind you, please, again, go to reality2cast.com and sign up for our newsletter, which we send out hopefully every week.
Katherine Druckman (2m 13s): And we include a quick little list of links and whatever other supplementary information we think is relevant that week, and we hope that you enjoy that too. So let's get into it. What do you guys think? Kyle, Shawn, Doc?

Kyle Rankin (2m 28s): I mean, to start with, it really kind of hearkens back to old-school Moxie and the old-school hacker ethos from maybe two decades ago. This is the sort of thing you would see a lot back then, a sort of playful flipping off the establishment kind of thing. But it's rare to see that these days, because everyone in security is very buttoned down and very, you know, straightforward about it. But yeah, I mean: one, it's a very good security writeup of the vulnerabilities in technical terms; two, the snark level is just at an absolute maximum.

Kyle Rankin (3m 13s): Yeah. So it's just a fun read, because it begins with the fact that the Cellebrite device fell off the back of a truck while he was jogging one day.

Doc Searls (3m 23s): Yeah. And the fact, not a fact, a claim. But I mean, yeah, search for "it fell off a truck" and see how many results you get. Yeah.

Kyle Rankin (3m 34s): And then, before I get to the middle, it ends with implications that they might be violating Apple licensing by including libraries without Apple's permission, and apparently there has been some follow-up on that already. My understanding is they may have discontinued some Apple support for now while they figure all that out. And there's a veiled threat to maybe add what Moxie called "aesthetic" files that improve the aesthetics of Signal, and that may also, coincidentally, you know, cause Cellebrite to flag stuff. Just to go back to what Cellebrite does: if law enforcement or governments capture someone's phone, they can image it.
Kyle Rankin (4m 22s): They use this device, this software and hardware, to image the device and scan and capture all of the files and data. They recently got Signal's attention because they claimed Signal support too. And when a lot of people think about Signal, one of the things they think is, well, it's end-to-end encrypted, so no one can snoop on my conversations. And that's true, unless someone has your phone. Because if they have your phone, they're at one of the two ends, so it's not encrypted at the end.

Doc Searls (4m 53s): They physically have your phone. Right. And that's a big if, yeah, if your phone is in somebody else's hands now.

Kyle Rankin (5m 1s): Yeah. So if they physically have your phone, then they can look at your conversations and copy files off. And that's what this device does: it will copy all kinds of different files and conversations from apps, including Signal, and then parse them out so you can review them later. But it turns out that, in addition to that, when you're parsing files there are all kinds of openings for exploitation. It's incredibly difficult to parse files securely, to open up files and read the contents securely. And apparently the Cellebrite software had been using a number of libraries with many well-known and old parsing vulnerabilities.

Kyle Rankin (5m 46s): So Moxie's proof of concept was able to take advantage of some of those and get remote code execution on the Cellebrite device. And then, as a further, you know, "pics or it didn't happen," there's an excellent, highly recommended video proof of concept. Anyone who likes Hackers, the movie, as much as I do, you will love the video. Like I said, the write-up is gold, but the video really just pushes it over the edge into legendary status. That'll be with me for many years.
Katherine Druckman (6m 19s): That's pretty great. So my understanding of this is, for example, Signal could include this little handy package in their iPhone app, or iOS app. And once a Cellebrite device is compromised, it makes all previous and future reports from scanned devices unreliable, because it can modify all of the data that is collected from the phone that the Cellebrite device is imaging. And so if that data is corrupted in the process, because the Cellebrite device itself has been compromised, then everything on it, from the past and in the future, is now unreliable, which is interesting.

Kyle Rankin (7m 13s): Yeah. I mean, it comes down to this sort of chain-of-custody thing. When law enforcement is doing forensics on a device that they're trying to retrieve evidence from, in the olden days, back when we would use parallel port cables and things like that, or, you know, the old-school ATA cables to access hard drives, when the FBI, let's say, would confiscate someone's computer, there were special cables that disabled the write pin for doing imaging. And the idea was that when they had to stand up in a court of law and say, we took an image of this hard drive and here's what we found, they could defeat the argument of:

Kyle Rankin (7m 58s): well, what if an attacker, or somebody that hacked your machine, went in, what if you implanted the evidence on the drive instead of extracting the evidence from the drive? And they can say, well, we used this special cable that makes it literally impossible, the write pin is cut, so we literally cannot modify the evidence. And we took an image and we checksummed it. There's a whole process when you're doing a forensics investigation on something like this, where you take checksums of everything.
So you can demonstrate later on, after the fact, that something hasn't been modified. The problem in this case, like you're saying, is that if you have remote code execution on the Cellebrite device, then in theory you could modify the evidence that they're capturing in flight.

Kyle Rankin (8m 45s): It's almost like: detect that you're being attacked by Cellebrite, and then mess with the data that it's capturing. So it could, in theory at least, bring into question whether that evidence would hold up in court. What it does is raise doubt. But the question is still whether that's enough by itself to throw out evidence that was captured by Cellebrite. I think there's one case where someone's already tried to challenge it, and I doubt that's going to go very far, but anyway, that's what's fun.

Shawn Powers (9m 19s): I certainly would. I would be the person that would say, you know, you got my phone and you got all of my, you know, naked cat photos, but how do we know that your Cellebrite device wasn't compromised, and those are somebody else's naked cat photos that you got supposedly off of my phone? Here's all the vulnerabilities that this device has, so how can you show that it was a valid image? I mean, that's what I would want to do. Again, will it hold up? I don't know, I'm not a lawyer, but it certainly would put doubt in the minds of a jury, you know, or whatever.

Katherine Druckman (9m 58s): And from my understanding, also, this interception would be undetectable. You know, it's not something that can be verified by checksum or anything like that. But yeah, that is interesting.
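[Editor's note: the acquisition-time checksum step Kyle describes can be sketched in a few lines of Python. This is a hedged illustration, not Cellebrite's or any real forensic tool's actual procedure, and it also shows the limitation Katherine raises: the hash covers whatever was captured, so tampering during capture is invisible to it.]

```python
import hashlib

def image_digest(path: str, algo: str = "sha256") -> str:
    """Hash a captured disk image in chunks, so large files fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# At acquisition: record the digest alongside the evidence.
# Later, in court: recompute and compare, to show the image wasn't
# modified after the fact.
# Limitation: if the capture itself was tampered with in flight, the
# digest faithfully covers the already-altered data and proves nothing.
```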
Now, it's not going to help anybody in, like, an oppressive regime, because they're still gonna, you know, do the thing. Okay, so you're in X, whatever country we've decided is oppressive, and the government has taken your device. And I suppose in theory, based on this vulnerability, it would be possible, if you were super sophisticated, in a movie-inspired way, to intercept the Cellebrite device, thus, I don't know, altering whatever incriminating information there is on the device.

Shawn Powers (10m 49s): Every phone you plug into it, you know, it injects naked cat photos.

Katherine Druckman (10m 53s): Oh, those damn naked cat photos are everywhere now. I mean, a lot of people...

Kyle Rankin (10m 58s): Well, I get into a lot of forum discussions with people that are interested in phone security, and a great number of them have this kind of spy-movie, state-level-threat, duress-code kind of scheme in mind. It always sort of comes up where someone says, well, what if I'm captured by the authorities or whatever, and instead of entering my normal encryption key, I enter this other one and it erases everything instead? And that's going to be super clever. And, putting my Purism hat on for a second, I often will get: well, why doesn't Purism help me implement a duress code like that? Wouldn't that be pretty sweet?

Kyle Rankin (11m 39s): And so then I say, well, the problem is, of course, that at least many jurisdictions consider that tampering with evidence, and that's illegal. So not only can you not do that, it's illegal. And yeah, I mean, you are talking, like, spy-level stuff.
However, like you said, it would certainly be possible, if you have remote code execution on the Cellebrite device (and you happen to have one that fell off the back of a truck to test this on), to create something that, since you know your phone is connected to this device when it's happening, and it has full access to the phone because it's pulling the files off, not only erases all the evidence that it's trying to pull down, but also erases all the evidence on the phone itself.

Kyle Rankin (12m 21s): So you could create something that would, in theory at least, wipe out everything, including any potential evidence on the phone, which would be true...

Katherine Druckman (12m 31s): Spy stuff. These devices are expensive too. I mean, I don't know a number off the top of my head, but I know they're really expensive.

Kyle Rankin (12m 40s): It's a good example that security by obfuscation is not the ideal security model for, you know, something that is obviously not great software. And the only reason nobody realized it's pretty crappy is because nobody had access to it. That's not a great model. And it may also be somewhat expensive because, I don't know how much they rely on zero days to exploit some of the phones that they're able to exploit and how much they use something else, but besides the fact that if you sell something to a government you can mark it up, you know, infinite times, fair enough, especially if you're the only game in town. Beyond all of that, there's also the case that if they have implanted zero days in the Cellebrite device to exploit more recent iPhones, let's say, all of those are super expensive too.

Katherine Druckman (13m 34s): Hmm. All right. Well, I just speculate, because I wonder, you know, I wonder what truck that was that it fell off.
That's all I wonder. Like, how do you even get your hands on it? That part of the story is interesting enough, but, like, who knows? Hopefully we'll never know, because then that could be...

Doc Searls (13m 51s): Well, hopefully. We probably will never know, period. It seems to me, given that, it may be less useful for law enforcement than just for spying. I mean, I would imagine that would be the primary use for it, you know, or just for a kind of brute surveillance.

Katherine Druckman (14m 13s): Well, we talked previously about how these devices are being used to break into students' devices in schools. There was an article, I don't know, months ago, and we had an episode all about it, which I'll link to. So, I mean, there are a lot of kind of shady applications of these devices. So, you know, the security flaws aside, their use at all is problematic. I mean, I'm sure there are plenty of quote-unquote legitimate uses for the device, but it's something I'm not really comfortable with in general. I'm also not comfortable that it exists.

Kyle Rankin (14m 52s): Well, it's also sort of the trickle-down nature of these sorts of things, right? I mean, a lot of things, especially when you're talking about hacking, or about security devices like this, it always starts at a certain high government level. And unlike, let's say, actual ballistic weapons or something, which take a long time to trickle down from large-scale governments all the way down to whomever, this sort of thing moves pretty fast.
And so, like you said, in addition to that, there's also the case of things that were used at high levels within governments finding their way, like you said, down into schools potentially. Or, you know, I'm sure this would be a very useful thing to have if you were a customs agent at an airport that gets a lot of traffic, where you could confiscate someone's phone, hook it up for a minute, image it, detain them for however long that takes, then send them on their way, and then have a copy of everything that you can review later, you know?

Katherine Druckman (15m 54s): Yeah. Sketchy. You know, when people talk about phone security, everybody inevitably brings up the San Bernardino shooters, where Apple refused to unlock their device, and then they got an Israeli company to unlock it. And, you know, if you put enough resources behind something, eventually you'll get into it. Right? But that always makes me think... So we've talked about the personal nature of phones and personal electronics and mobile devices, and how it's like the next best thing to hacking into somebody's brain, and how invasive that potentially is.

Katherine Druckman (16m 36s): But my feeling about a lot of the applications of this type of device hacking by law enforcement is that if you have to rely on evidence on somebody's cell phone for something like walking in and shooting up their office, like, is that really the best? It feels to me like you're not doing your job correctly. But, you know, what do I know?

Kyle Rankin (16m 58s): I mean, I think in a lot of cases it's more useful for things like, if there's a conspiracy beyond, or...

Katherine Druckman (17m 5s): Right. If they're planning to shoot up a second or third office, I suppose. But I don't know.
Kyle Rankin (17m 11s): Yeah, no, I mean, more if there was more than one person involved at the beginning, and maybe they only captured one person, but they actually planned or plotted with multiple people. A lot of times, one of the most valuable things with phones is getting a social graph of all of the people with whom a person communicated. Right? And then if you see, oh, they made a quick phone call before the event happened and then they went and did the thing, even circumstantially, you know, that leads you to think, well, maybe I should see who they contacted.

Katherine Druckman (17m 42s): But do you need access to the device to do that? Can't you just subpoena records from the phone company? I mean, I don't know. It just seems to me, when I think about these things, and again, I know very little about how these types of investigations work in terms of what they use this sort of data for, but we haven't had smartphones that long. So how did they do their job 20 years ago? If this is so important that the rest of us, including students in schools, have to put our privacy on the line, why is it so important? That's my question. What is the benefit to society of these things existing? Does that outweigh what we're giving up?

Shawn Powers (18m 25s): Well, to answer the question about subpoenaing the phone company: for example, on the iPhone, Messages is end-to-end encrypted, you know, and Signal is end-to-end encrypted. So the phone company, and even Apple, doesn't have the content of those messages. I mean, they may know who it's going to, but they don't have the content. That's on your phone.
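[Editor's note: to make Shawn's end-to-end point concrete, here is a deliberately toy sketch. It is not real cryptography and nothing like Signal's actual Double Ratchet protocol; it just shows why a subpoena to the carrier in the middle only ever yields ciphertext, since the key lives only at the two ends.]

```python
# Toy illustration only: a one-time-pad-style XOR, NOT real cryptography.
# The point: the relay in the middle never holds the key, so subpoenaing
# the carrier yields ciphertext; plaintext exists only at the two ends.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, key))

shared_key = secrets.token_bytes(32)      # known only to the two endpoints

plaintext = b"meet at noon"
ciphertext = xor(plaintext, shared_key)   # encrypted at the sender's end

carrier_copy = ciphertext                 # all the phone company ever sees
assert xor(carrier_copy, shared_key) == plaintext   # readable only at the ends
```

But as Kyle and Doc note above, none of this helps once someone physically holds an unlocked endpoint.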
Also, I did want to point out that, you know, looking at the super awesome video, it doesn't look like this device gets around your phone being locked or, you know, encrypted. Even in their cool little mock demonstration, or whatever it was, you still had to unlock and acknowledge and trust the device.

Shawn Powers (19m 7s): So it's not like plugging your phone into this device automatically unlocks everything. So if you're at an airport and they put your phone through the thing, they're not going to quickly plug it in, get an image while it's going through the scanner, and give it back to you without you knowing, because then they'd just have an encrypted image that they can't do bupkis with. I mean, at least that's what it appears to be. I don't want to say it solves the problem of the San Bernardino shooter, but it doesn't get around the same issue they dealt with there. It's not a magic box. It's just a creepy box, right? Yeah. Like you said, it looked like it was already unlocked.

Shawn Powers (19m 48s): And customs agents aren't going to randomly burn zero days every time they have a new phone that they want to scan for 20 minutes in airport security or whatever. Right. So, besides the fact that those things are expensive, yeah, you're totally right. And along those lines, there's some new information on some of the history of the San Bernardino shooter in this new book that came out by Nicole Perlroth, called This Is How They Tell Me the World Ends, which I read recently. She reports for the New York Times. And there was some controversy about some of that, because she has some inside knowledge within the community.
Shawn Powers (20m 30s): And some of the folks didn't necessarily agree with all of her statements on things, but regardless, it's still a fascinating dive into the world of zero days and the history of zero-day exploits. And it turns out that apparently, with the San Bernardino thing, there was an Australian firm that was...

Kyle Rankin (20m 48s): The one that ultimately sold the exploit, I believe, something like that, that contacted them, that was able to hack that particular iPhone. So they didn't have to go through the courts, and they decided, yeah, we don't want to bother going through the courts and risk losing the precedent. Of course, they didn't find anything on the phone either way.

Doc Searls (21m 6s): Yeah. The phone was useless, it turned out. And one wonders how much of that there is in the world too. I look at this in a way... Did any of you see the closing episode of the series Silicon Valley? Yes. I remember how that ended. I mean, basically, you know, they went out of business because that was the right thing to do, because they basically imagined a hack that made it possible to get past all encryption, if I have that right. Meaning that all privacy goes away, or is at risk of going away, and there really is no secrecy possible online. That was sort of the implication, as well.

Doc Searls (21m 51s): And I wonder whether or not that isn't implicated in all of these discussions: that there is an end state here that says if it's digital, it's hackable, and there really is no secrecy, or ultimately no secrecy, in the digital world as we build it out. I wonder about that. I mean, I spend a lot of time thinking about that without having enough expertise to know whether I'm close to being right with my paranoia.

Shawn Powers (22m 22s): Well, it would be great news for people who want Bitcoin, if we could just decrypt the keys using, you know, a magic box.
But no, I mean, mathematically we're not there. I think vulnerabilities in how encryption is implemented are the thing that's going to be more likely to cause frustrations with data encryption, not necessarily the encryption itself. I mean, the math is pretty solid.

Kyle Rankin (22m 53s): Well, and the fact that when people try to break these sorts of things, they typically aren't necessarily trying to break Signal as much as they're trying to get access on the endpoint. You know, I'm sure there is some focus on finding flaws in the encryption algorithms that are used for things like Signal, but for the most part you see devices like this where they say, well, if we physically have the device, which we can in many cases, then we don't need to worry about cracking that. We can go the easier way to...

Shawn Powers (23m 27s): Look, here's the private key.

Kyle Rankin (23m 29s): Yeah, and here it is, everything's right here. So I think that's why you're seeing a lot of focus on going after the individual devices and finding flaws there, because then you have everything in its unencrypted state. It's sort of like, you know, those of us who have had to check security boxes for having encryption at rest on servers and things like that. You set up your encryption at rest on your server, and you check that box for whatever audit, saying that you've done it. And then you realize: okay, great, now I'm safe in case we ever decommission that server and the drives go somewhere, or someone breaks into the data center and steals my server. But if someone hacks into the running server, then the data isn't protected, because it's only encrypted at rest.

Kyle Rankin (24m 14s): So all the data's not protected in the most obvious use case, the most obvious attack. Right. And it's the same thing here.
You know, these phones in most cases have encrypted storage too, but that's only good if they're locked or off. You know, if they're unlocked, then everything's right there for the taking.

Katherine Druckman (24m 35s): But whether or not the math is solid and, you know, the technical aspect of it is solid, it doesn't go very far if governments... Like, for example, Australia passed basically an anti-encryption law that most privacy experts are deeply concerned about. So I suppose that's the concern: whether or not things are protected technically, if there are, you know, scary legal penalties unless you violate your own encryption intentionally, or however they enforce their law...

Katherine Druckman (25m 18s): Then, you know, it doesn't really matter, right, if the math is solid.

Kyle Rankin (25m 26s): Well, then what you end up with is a rehash of the crypto wars from, you know, a couple of decades ago, where you ended up taking a cryptography algorithm and publishing it on paper, and then distributing the book, because there was this legal loophole so that that encryption algorithm could be exported. Right? So you kind of get a repeat of that. But ultimately, if people try that, what would happen is, you know, you implement some sort of cryptography backdoor in some place that is required to enforce it. And most governments have shown that they're pretty good at getting hacked by other people and other governments.

Kyle Rankin (26m 6s): And when they do, it's pretty devastating in most cases. So what would happen is that whichever government implemented this sort of thing becomes the target for all the other governments: if I can hack this government that implemented this backdoor, then I can get a copy of this key.
That's not just sitting in some vault somewhere, obviously, because it's being used to decrypt things, so it's sitting in a number of vaults. And as with all the disclosures of the NSA hacking tools from, you know, a number of years ago, post-Snowden, these things eventually will get out. I mean, even the NSA wasn't able to keep all of their tools a secret. And so ultimately the encryption key for any storage will get out, and it'll be there for bad guys as well as good guys.

Shawn Powers (26m 51s): Let's be clear: if P versus NP is solved, we're all screwed forever. Just to be clear, you know, that's the whole reverse-engineering-encryption problem, which there's no solution for. But yeah, just reiterating, it's still key management. I don't want anybody listening to worry that, you know, encryption itself is now compromised. It's the meat portion of it. You know, it's the people who do or don't keep their private key private. I think that's the issue. It's not math. Dang it, the math is fine.

Kyle Rankin (27m 31s): The math has done nothing wrong, and here we are. And in general, with these things, everyone goes after the easiest way to get in, whatever that is. So, for example, in the San Bernardino case, the issue had to do with the fact that they could only try so many password attempts before the phone was locked out. But if they could try infinite password attempts, then they had a pretty good chance of getting in, because most people pick not that great of a PIN, you know, especially on a phone, entering it on a touchscreen. So it's probably going to be maybe four digits, maybe six, but not eleven.

Shawn Powers (28m 7s): Yeah. And iPhones max out at six. So yeah.

Kyle Rankin (28m 10s): Yeah, yeah.
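[Editor's note: the arithmetic behind the four-versus-eleven-digit point is simple to sketch. The guesses-per-second figure below is a made-up illustration, not anything from the episode or from any real cracking hardware, and the math only applies at all if an attacker can guess offline, sidestepping the lockout the speakers describe.]

```python
# Back-of-envelope PIN keyspace math. Only meaningful if the device can
# be imaged and guessed against offline, bypassing the 10-try lockout.
GUESSES_PER_SECOND = 1_000_000  # hypothetical rate, purely illustrative

def seconds_to_exhaust(digits: int, rate: int = GUESSES_PER_SECOND) -> float:
    """Time to try every numeric PIN of the given length at `rate` guesses/s."""
    return (10 ** digits) / rate

for digits in (4, 6, 11):
    print(f"{digits:>2}-digit PIN: {10 ** digits:>15,} combinations, "
          f"~{seconds_to_exhaust(digits):,.2f}s to exhaust")
# At this made-up rate, 4 digits fall in a hundredth of a second and
# 6 digits in one second, while 11 digits take 100,000 s (about 28 hours).
```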
And Micah Lee, around that time, made some sort of post along the lines of: if that's your threat model, then essentially you need an 11-digit PIN minimum, because otherwise the machines they have now, with the number of guesses they can try at once, will be able to get in quickly enough. Yeah, I don't have one. Apple bricks the phone at 10, right? They give you 10 tries. Yeah.

Shawn Powers (28m 39s): Unless they image it, and then you're always on try one, you know, where you can keep trying. I mean, that's the one. And so, Kyle, you hit the nail on the head: we go for the easiest exploit. I just helped teach, or create, a course for CBT Nuggets on pen testing. And, you know, everybody thinks, oh, pen testing, it's like the Hackers movie, right, where you're doing all this stuff. Generally, the easiest way is exploiting people. Right? I mean, it's being a real creepy person who, you know, tricks somebody into giving them a password or access, or, you know, shoulder surfing. Again, it's meatspace that causes the majority of problems, and that's usually the easiest thing to exploit.

Kyle Rankin (29m 22s): Yeah. Or, beyond the people, there's just other software connected to the software you want to get into that's easy to break. I gave a kind of pen-testing primer a number of years back, before a CTF tournament I was running inside of a company, and essentially taught them how to hack by pointing at a local wireless access point I happened to buy that was running really weak security. And it was very easy to modify the URL to launch a little local shell without much effort at all. And it just sort of demonstrated the point, which they then ran with and found vulnerabilities across the board.

Kyle Rankin (30m 6s): But yeah, you know, you don't have to go very far to practice those skills.
They're all sitting in devices in your home today. And for a company, your vendors often have access to your systems, and if they are not designed with security in mind... Very often the next person in the chain has access to your secure data because they need to, and if they're not secure, well, that's a great avenue to get into your thing, because, again, they need access to do whatever it is they're doing for you. And you can't always make sure that they're...

Doc Searls (30m 40s): Smart.

Kyle Rankin (30m 42s): Like the surveillance company that had an employee... Because of the way it worked, all of the camera feeds were also going to a central location that was managed by, you know, not the company that bought the service, but the company that was hosting the service. And they had an employee that was just kind of looking through all those videos when they weren't supposed to, because they worked there and happened to have access, you know. And they did it for quite a while.

Shawn Powers (31m 9s): What was the Twitter thing with the God-mode password not long ago, where all the famous accounts started tweeting the Bitcoin thing, like, send Bitcoin? I mean, like, Barack Obama tweeted it, because it was a disgruntled employee who had the God-mode password at Twitter, which is just ridiculous, that that existed.

Kyle Rankin (31m 27s): Uber had a God mode, that they maybe disabled eventually, where once you enabled it, you could just track all the cars going everywhere, all the time, and where each was going, just on a map, you know.

Shawn Powers (31m 38s): That's like a Die Hard movie, right?

Doc Searls (31m 41s): Yeah, it is. But then that also brings up, you know, the God-mode experience there. And it brings up another point that the Signal blog brought up, which is that Cellebrite had really done pretty horrible housekeeping, period.
I mean, they hadn't done a very good audit of their own practices, in addition to whatever other evils they were up to, which made them a lot less trustworthy.

Kyle Rankin (32m 18s): The cobbler's children have no shoes, and all of that, right? This is often the case. You'll have a lot of security experts who will advise developers on how to write secure code, but the code they write themselves is also insecure.

Shawn Powers (32m 32s): Passwords in their config files. Yeah.

Kyle Rankin (32m 35s): They make the same sort of mistakes. It's a very different skill to be able to find a problem than to build something without the problem. It's very different to find bugs in code than it is to write bug-free code. And it's the same thing here. They weren't worried about the threat of someone attacking their own devices, so they didn't really spend a lot of time on defense. You'll see that a lot with security tools now; a lot of pen testing firms publish their tools for other people to use. It's someone running a Python script on the side to help them do their job a little faster, get through this audit, and move on to the next customer.

Kyle Rankin (33m 17s): And it shows. It's not necessarily built to be an enterprise product where end users are going through the code and trying to exploit it. It was never designed to be resilient to an attack itself. It's always just focused on

Doc Searls (33m 31s): Offense. And this is the stuff that's supposed to stay on the truck. Yeah. Well, there's the SolarWinds thing too, right?
Which may not even be over at this point, where there was the biggest breach in US history and they haven't even begun to find all the places that are still vulnerable, or something like that. The news ages fast; I'm not remembering the particulars very well, but I think that was the SolarWinds story, wasn't it?

Kyle Rankin (33m 55s): Yeah. That one was enormous. Everyone who was running their particular software, it was a supply chain attack, and it had existed for over a year, potentially, in some of the affected companies.

Doc Searls (34m 8s): Yeah. Yes.

Shawn Powers (34m 10s): This is only tangentially related, but my wife ordered something for her drama class, like a script piece or something, and she ordered it online. A year and a half later, she got an email from the company saying, we encourage you to cancel your credit cards, because for the past 18 months there had been code on our website that we didn't know was there, and everybody's credit card number, security code, everything you need, even the client's ZIP code, had been going to an unknown database. Every single credit card they ever processed,

Shawn Powers (34m 53s): every bit of data on it, was gone. It was a breach that was just like, wow. And it just came in a letter that said, BT dubs, you should probably change your credit card.

Kyle Rankin (35m 8s): Well, along those lines, my wife got a similar email, but from a company that she had written for, saying, hey, by the way, the system that holds all of our 1099 forms for all of our contractors has been compromised, which means your ID number is compromised. So here's some free credit monitoring, and all the standard stuff they do in exchange for that.
But just so you know, everything you need for identity theft, which is everything on a 1099 that you would turn in, somebody probably has it somewhere. Enjoy.

Kyle Rankin (35m 54s): I think it's in the firmware of the Cellebrite device, actually, her 1099. I think I saw it scroll by.

Katherine Druckman (36m 2s): I always get depressed when we have these conversations. They're fun and interesting, but then I'm like, ugh, I give up. I just can't have a phone. Screw it.

Kyle Rankin (36m 11s): Apparently, if you have Signal installed, there's a random chance you might get a special file, for aesthetic reasons, added to it.

Katherine Druckman (36m 20s): I know, that was exciting. I feel like at least we know somebody is looking out for us. That gives me some faith in humanity, I guess. I don't know. So what is the moral of the story? I mean, somebody here works for a company that sells a phone that's not an iPhone, so the moral probably can't be "just buy iPhones." Because, I mean,

Kyle Rankin (36m 47s): I think the moral of the story is electricity is bad. That's pretty much it.

Katherine Druckman (36m 50s): No, no, no. Like, what do you think?

Kyle Rankin (36m 53s): It'd be fun, wouldn't it? I mean, it depends. If you could unlock the disk encryption, then you could read files like on anything else, I imagine. Phones are fine unless you're using one. I mean, every phone is fine unless you're actually using it. If it's unlocked and the disk is no longer encrypted, then someone can copy the files off. If it's off, it's very similar to a laptop in that way.
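A rough way to quantify that laptop analogy is in bits of entropy: a numeric PIN versus a word-based passphrase protecting full-disk encryption. The wordlist size (7,776 words, as in Diceware-style lists) is an assumption for illustration.

```python
import math

def pin_entropy_bits(digits: int) -> float:
    """Entropy of an all-digit PIN: log2(10) bits per digit."""
    return digits * math.log2(10)

def passphrase_entropy_bits(words: int, wordlist: int = 7776) -> float:
    """Entropy of a randomly chosen passphrase from a given wordlist."""
    return words * math.log2(wordlist)

print(f"6-digit PIN:       {pin_entropy_bits(6):.1f} bits")
print(f"11-digit PIN:      {pin_entropy_bits(11):.1f} bits")
print(f"5-word passphrase: {passphrase_entropy_bits(5):.1f} bits")
```

The point of the comparison: a randomly chosen 5-word passphrase carries far more entropy than even an 11-digit PIN, which is why the strength of the passphrase matters so much once the disk image is in an attacker's hands.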
You know, if you have a laptop and it's off and you have disk encryption, then they would have to guess your passphrase to get in, and it depends on the strength of that. But even if you have that in place and you have a bad PIN, a weak PIN that's guessable, then someone could get in. Or it automatically decrypts on boot.

Kyle Rankin (37m 48s): Right. Yeah, exactly. I guess, to me, the moral of this is, one, that the old-school playful hacker stuff isn't completely gone, so that's kind of fun. And two, it underscores how important your phone is as a snapshot of everything about your life, and underscores the need for more definitive search-and-seizure protections for your phone. I think the EFF is in the middle of a case they want to bring in front of the Supreme Court on this subject, specifically about customs enforcement when people are traveling back into the United States: just because you're traveling in, does that give someone in customs the right to confiscate your phone or computer and scan it and image it all?

Shawn Powers (38m 56s): I'm curious too. Apple usually gets this really rosy "all about security" picture in these conversations. At what point, even if it's illegal, I mean, the whole Face ID thing creeps me right out, because it's not really difficult to go like this, which on the podcast you can't see me, but basically hold a phone up to somebody's face to unlock it. That's not great.

Kyle Rankin (39m 17s): So it turns out your face is not a secret. It might even be less of a secret than your fingerprint, you know? Because it's everywhere you go.

Shawn Powers (39m 27s): Yeah. Even if it's fancy, right,
And it uses 3D modeling of your face or whatever. Same thing with Touch ID before Face ID, with your thumbprint unlocking the phone. I mean, even without your thumb: be fancy and dust it and put tape on it and all the things you see in the movies. At least with a code, sure, there are fingerprints on the screen, whatever, but the whole Face ID, Touch ID thing sure is pretty easy to crack when you have the human's body right there in front of you, whether they want it or not.

Kyle Rankin (40m 4s): There have been cases of that: one compelling someone to put their finger on the phone to unlock it, and another where their face was right there, so they just held the phone up to their face and unlocked it. But here's the other thing, the other dirty little secret around this isn't even about getting into the phone. It's the fact that so many people use iCloud, where much of the data isn't end-to-end encrypted, so much of the time law enforcement doesn't even really need your iPhone; it's just supplemental. For example, even in the San Bernardino case, the first thing they did was go to the corresponding iCloud account and capture everything with their warrant. And for all of the "we will not unlock the phone" stuff,

Kyle Rankin (40m 45s): they will hand over all of the stuff in iCloud, and because Apple hasn't implemented end-to-end encryption on that data, it's all readable and ready to share. For most people who use iCloud, there's plenty of incriminating material: your address book and all your contacts and whatever else gets synced up there.

Shawn Powers (41m 6s): A backup of your entire phone.

Kyle Rankin (41m 8s): Yeah. And so it's all there, right?
And so that's the other thing. Yeah.

Doc Searls (41m 15s): And as somebody who has the all-Apple family, not only Apple, but mostly Apple, trying to make sense of iCloud, I mean, it's just bizarre. They make it so hard to use and so obscure. First of all, they want you to use iCloud all the time, putting iCloud in the middle of everything. You plug in your phone and all of a sudden all your photos are in iCloud, and all of your photos are somewhere else. And if there's not enough storage on your iPhone, it won't erase them on your phone, even if you wanted to delete them and did the delete thing as part of the copying over. I swear, I've probably spent a day in the last month on the phone with Apple trying to unscrew that shit.

Doc Searls (42m 3s): And that has nothing to do with encryption or anything. It just has to do with making the damn thing work. It's way too complex. So to me, the takeaway from this is that nobody is fully good at what they do, including the bad guys, including people who are, or pretend to be, good guys but are actually bad guys in some other way, like Cellebrite. Yeah.

Shawn Powers (42m 27s): I hate on Apple a little bit just because they tend to get a pass. No, that was it. I just wanted to say it's not perfect.

Doc Searls (42m 34s): Oh my God, no, not even close. And that's a whole other subject, but they implemented, I think it's iOS 15 now? 15 is the new one. I haven't gotten it yet on my iPhone; they haven't even prompted me for it. But it has the ID for advertisers as an opt-in rather than opt-out, or something like that. But I've already seen, because somebody showed it to me, that there's a thing where you ask Facebook not to follow you, rather than tell them not to follow you, you know?
So it's like, my God, they're compromised too.

Doc Searls (43m 15s): And they want to be in the ad business still, apparently. So that also creeps me out. Anybody that wants to be in the ad business is creeping me out today.

Shawn Powers (43m 26s): It's iOS 14.5.

Doc Searls (43m 29s): Okay, there you go. Yeah.

Shawn Powers (43m 32s): Yeah. And when you boot it up the first time, or start it up after the three-hour update process on your phone, I have an old phone, hopefully it won't take that long if you have a new phone, it asks you if you want to let apps track you. And I mean, I say no, but I don't know.

Kyle Rankin (43m 50s): Which is a sad testament to my confidence. Yeah.

Doc Searls (43m 52s): Yeah. To me, the fact that the iPhone had something called an ID for advertisers in it at all is horrible. It's almost unforgivable. You don't have one in Purism's phone, right?

Shawn Powers (44m 6s): Your social security number is in there. Yeah.

Kyle Rankin (44m 10s): Yeah. I mean, not only wouldn't we implement something like that, we couldn't, at least under our corporate bylaws. That would fundamentally break everything that we've said we wouldn't do. I couldn't imagine implementing, on purpose, something that would uniquely identify you. But here's the thing, and maybe we talked about this in a previous podcast: all of the cellular providers in the US are doing the same thing.
They've realized the value of assigning a unique identifier that tracks your IMSI, your IMEI, basically all the unique identifiers that tell one phone apart from another on the network, linking it with your identity, and then selling all of your whereabouts and all of that for advertising on the side. And all three are partnering together on it.

Kyle Rankin (45m 5s): You know, all three.

Doc Searls (45m 7s): Do you have a link on that? Because I have too many tabs open, so I need to refresh that. I think that's true. And I think there's a disease there. I mean, advertising is a cancer at this point. It's direct marketing; it's not advertising. The advertising you see on the Super Bowl, provided you have a TV that's not tracking you, is meant for whole populations and not for you personally. So that's relatively innocent. But all this stuff that's trying to track you is just wrong on its face and needs to die. And it's getting worse. I keep thinking it's going to fail, but it's not; it's getting worse. Amazon's in the business, and Apple is in the business. And what makes it even worse is that, and this is kind of like there being four red herrings, the way Amazon does advertising, and Facebook, Google, and Apple, they're all totally different.

Doc Searls (46m 0s): To generalize about them, except to say that they're creepy in their own ways, is wrong, because they're all doing a different thing.

Katherine Druckman (46m 9s): Something else I've noticed is that while people as a whole are, I think, increasingly more concerned about privacy, and blocking ads, and those kinds of things, I think it's still not enough. We haven't reached whatever critical, herd-immunity-style number it takes to make enough of an impact.
I think the vast majority of people still have this sort of indifferent attitude toward it. They're either indifferent, or they actually say, oh, well, I like seeing relevant ads, or whatever it is. And I just wonder how we convince them.

Doc Searls (46m 47s): This is a tough one. There was a study, I have a link to it somewhere, where if people are asked outright, "do you want to be tracked, period," 99% say no. There was another study that said 97%. Yet, to get to specific cases: I have a client, I will leave them nameless, that is all about the individual and doing right by the individual. But they're still a B2B company, and they're not really financially obligated to respect the individual, though they say they're all about that. But they have tracking on their website, and it's the usual thing.

Doc Searls (47m 30s): They hire one of the aftermarket companies to put a gauntlet on the front of the website that urges you to accept all these cookies. And I've been talking with them about it; it's an ongoing conversation, trying to get the marketing people there off the idea that, oh my God, we have to have this data. You know, we went through it with Linux Journal: Google Analytics is going to tell us what we need to know, who comes back to the website and how often they come back and all this shit. And they also wanted retargeting. There's a really important non-profit I know that does retargeting; they make millions of dollars off of it.

Doc Searls (48m 12s): And their whole thing is respecting the individual too. It's so hard to resist this stuff, and it's so normative now that it's very, very hard to fight. And I've actually arrived at a point, which we talked about on another podcast yesterday,
so some of you have heard this already, which is that I think the entire web, basically, and beyond that, even all the norms around the way we've been using the internet so far, we have to get around them. We have to get around the whole goddamn thing. It's so corrupt, and it's so corrupting. It's just too hard to fight, and you're fighting the norms that are inside of it.

Doc Searls (48m 53s): And the norms start with: you are dependent. You're dependent on your phone companies, and as long as you depend on the phone companies and you can't stop them from tracking you and selling your data, screw you, you lose. We need a way around that. So yeah, hats off to Purism for being one of the ones trying to do that. We've drifted off Cellebrite, but anyway.

Kyle Rankin (49m 15s): Well, Katherine, to go to your point about whether people tend to care about privacy or not: I think, and I've said this a couple of times before, it's more that people just don't feel empowered to do anything about it. But the evidence I point to over and over about why I think people do care about privacy is how much of a fight these tech companies put up resisting any legislation that would require opt-in for tracking. They always want it to be opt-out. And the reason is the path of least resistance, right? They know that if you got a pop-up saying, "would you like to be tracked?" you would say no. But because it's the default, you're not going to go dig through all of the checkboxes and uncheck tracking, only to find out, in some cases on Android,

Kyle Rankin (50m 3s): that even if you did say "don't track me," that doesn't necessarily mean you're not being tracked. You might still be tracked, just not by this program; it's the seven other programs. Yeah.
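A toy model makes the point about defaults concrete: the same population, with the same preferences, ends up mostly tracked under opt-out and mostly untracked under opt-in. The 5% "ever touches the setting" rate is an assumed figure for illustration, not a measured one.

```python
import random

random.seed(0)
POPULATION = 100_000
CHANGES_DEFAULT = 0.05  # assumed fraction of users who ever change the setting

def tracked_fraction(default_tracked: bool) -> float:
    """Fraction of the population that ends up tracked under a given default."""
    tracked = 0
    for _ in range(POPULATION):
        if random.random() < CHANGES_DEFAULT:
            # The few who act express their actual preference:
            # almost nobody wants tracking when asked outright.
            is_tracked = False
        else:
            # Everyone else keeps whatever the default is.
            is_tracked = default_tracked
        tracked += is_tracked
    return tracked / POPULATION

print(f"opt-out regime (default: tracked):   {tracked_fraction(True):.1%}")
print(f"opt-in regime  (default: untracked): {tracked_fraction(False):.1%}")
```

Under these assumptions, the opt-out regime tracks roughly 95% of users and the opt-in regime tracks essentially none, which is why the choice of default, not the existence of a setting, is what the lobbying fight is really about.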
So to me, at least one solution is to require opt-in for tracking. The California privacy law tried to do that in draft form, but big tech, like I said, fought tooth and nail to get that removed, even though companies like Purism spoke to the assembly and said, no, you need opt-in, and this will not kill these companies; this is a necessary thing. But they were able to get it removed. And so now you have privacy laws, but it's all opt-out, which means it's not going to affect most people, because it's buried somewhere to turn it all off.

Katherine Druckman (50m 51s): Right. I think awareness and education is half the battle here too. And I think of issues with regard to women in particular. If you ask a group of women, are you okay with people having the ability to stalk you via your digital devices, they would say, oh God, no, that's creepy. I would like to feel safe in the world, and as a woman there are frequently occasions where I don't. But the problem is that, to quote a previous guest, when marketers and advertisers reframe it as, no, we're targeting you so that we show you things that are relevant to you,

Katherine Druckman (51m 35s): and we're only interested in what you're interested in, and then we show you the right advertisement for that couch you want. I mean, that's actually something that happens.

Kyle Rankin (51m 46s): Yeah. And to tie a tidy bow between this and the Cellebrite thing, so we can loop it all the way back around: what this has created is an alternative data collection economy. Now, governments that have a Cellebrite device, maybe they can't get all the information they wanted from it.
Maybe they don't have your individual phone, but maybe they don't need it, because even without being able to get a warrant, they can simply go buy all of the tracking information that advertisers have captured and stored with data brokers. And they can do an end run around jurisprudence and the Fourth Amendment.

Katherine Druckman (52m 25s): But there's another issue. That's not a gendered issue in that case, but a lot of people would just say, well, again, the "I have nothing to hide" argument: I'm not a criminal, why should I be concerned about what law enforcement is doing with marketing data? But at the same time, you're literally being stalked. And sometimes I feel frustrated by my desire to reach a certain group of people who I perceive as indifferent, because I don't think people really do understand the risk. I think people put too much faith in other humans. The capability is there; all it takes is one bad actor.

Kyle Rankin (53m 7s): Yeah. And it's not just law enforcement that has this access, right? Creepy stalkers use the same kind of tools. Or a surveillance company, or whatever. Or you have the creepy ex who installs one of the gigantic suite of creepy apps that can track people on their ex's phone, and then tracks their whereabouts and stalks them indefinitely. It's a huge problem, and we're not doing enough to address it.

Shawn Powers (53m 37s): And to all the women listening, and Katherine, like you said, though, I've heard multiple people say multiple times, "I would rather have targeted ads." I mean, when you install Microsoft Windows, I've had to do that, I've had jobs where I had to do that,
one of the checkboxes is, "would you like us to only show you targeted ads?" And I've actually seen or heard the argument, it's a bullshit thing, but the argument is, well, I don't want to see boner pill ads. I don't want my kids to see boner pill ads. I don't want to see ads for, like, hot imported wives or whatever. And so then they say, well, if we can track who you are and what you watch, we will only show you things you're interested in, because we want you to buy things.

Shawn Powers (54m 26s): Why would we show you things you don't want to buy? That's a waste of our time and a waste of your time. So again, the advertisers seem like the good guy: "this is a curated list of things that you want." It's easy to talk your way through that and sound like, well, they're going to track me anyway, so why wouldn't I want targeted ads? Logically, sure. But, oh my goodness.

Kyle Rankin (54m 54s): Which then sidesteps, it's sort of a false dichotomy: would you rather have untargeted ads or would you rather have relevant ads? Well, of course you'd rather have relevant ads. But the third option is: would you rather not have the ads at all, and not have the surveillance infrastructure that leads to all of that stuff? And I'm sure plenty of people would say, yeah, we'd rather have that too.
We're starting to get a kind of economy, a little bit, where people can pay to opt out of that stuff, where things can be funded outside of tracking and advertising. There are at least some models out there that allow that, and I'm somewhat hopeful that maybe some of that would be enough down the road, at least as an alternative. Because plenty of people made this argument even all the way back with Napster, where some 13-year-old using Napster would get sued by the RIAA for all these MP3s they had, for millions of dollars,

Kyle Rankin (55m 53s): saying, "you infringed upon millions of dollars." Well, of course that kid, in the absence of Napster, would not have bought millions of dollars' worth of CDs either, right? Most of the music they had, they probably wouldn't have paid a cent for. That's why I've never really been that concerned about people pirating my books, because someone who wants to read my book and wants to support it will buy it, and someone who's pirating it probably isn't someone who would have bought it to begin with. Maybe they couldn't afford it, or maybe they wouldn't have wanted to. So I don't really concern myself with that. Yeah.

Doc Searls (56m 31s): What it needs, and I am beginning to think we'll never get it, because I've been fighting this for, I counted, 22 years now,

Shawn Powers (56m 40s): That's the spirit. Yeah.

Doc Searls (56m 43s): It's like my white whale now.

Shawn Powers (56m 45s): Careful, Captain Ahab, the whale's not there, you know,

Doc Searls (56m 50s): or you go down like him, you know. It's that there's no good journalism on this.
I mean, I should say there's some, because I've been trying to commit some for a long time, but there's no critical mass of it. And one reason is that all of the publications make money off of it, and it's a third rail they just don't want to touch. And it's too easy for them to go after the red herring of FAANG: Facebook, Apple, Amazon, Netflix, Google, however you spell it out.

Shawn Powers (57m 28s): The N, I wonder, is actually typically interchangeable. It can be added.

Doc Searls (57m 38s): Yeah, it doesn't matter. It's "the bigs." It's imagining that it's all about Google being bad and Facebook being bad, but it's so endemic. You're focusing on them, and "let's break them up," that's a big thing now. Break them up into what, and why, and how? There's no ruler made for that. And on top of which, there's just no way. It's like, here are four different kinds of fruit that are all fruit, but they're all different, and generalizing about them is wrong. Meanwhile, there are 7,000 companies doing exactly the same wrong thing in a more or less consistent way. That's what's behind every one of those accept buttons you click: a whole bunch of parties you've never heard of.

Doc Searls (58m 30s): It's not one of those four companies that's busy exploiting you. And nobody wants to talk about that. Just nobody. I always love talking to you guys. It's always so true.

Shawn Powers (58m 45s): Yeah. We haven't solved anything. Every time, I go, you know what? I am just going to go to my farm, and I'm going to live off the land.
Kyle Rankin (58m 57s): Pick your tractor carefully, because a lot of those you can't repair yourself now, right?

Doc Searls (59m 1s): The John Deere problem. It would be a John Deere.

Shawn Powers (59m 4s): Not that you think I could have repaired any of them ever, but yeah.

Kyle Rankin (59m 10s): They're all computers now. You can't repair them.

Shawn Powers (59m 12s): Oh, okay.

Kyle Rankin (59m 14s): When a part breaks, it's the computer part that breaks now, and you have to go get hacked Ukrainian firmware so you can turn it back on.

Shawn Powers (59m 22s): Oh yeah.

Katherine Druckman (59m 24s): I think the important answer, Shawn, to what you just said, though: no, we're not saying don't use an iPhone and don't use iCloud and don't turn on advertising. I mean, maybe we are a little bit. Okay, we kind of are, but to a certain audience. Yeah, we're definitely saying that, like, I don't want to participate in certain things. But really what we're saying is that we want people to at least know what they're getting into. We want people to know what's going on, to at least consider the fact that they're being tracked, and to consider whether that's something they want to consent to,

Katherine Druckman (1h 0m 4s): and whether their consent really means anything or not. So, okay, maybe we are telling people that.

Doc Searls (1h 0m 12s): Telling them that. And the thing is, I think opting in or out are both wrong. I say you should be able to turn off the whole thing. Just turn it off. Otherwise it isn't going to work. And I think you can, with the iPhone: you can turn off the IDFA, and that should take care of the problem to some degree, or at least some of it.

Shawn Powers (1h 0m 33s): Just to be the devil's advocate, though,
and I mean, the boner pills were a joke, but if I have no data to target advertising with, and I'm an advertiser, I'm going to push my most profitable thing based on the bulk of what I know: adults can afford phones, and adults are generally the people who need boner pills. So all I'm going to do is push ads for boner pills. Would you like me to say "boner pills" one more time?

Doc Searls (1h 1m 5s): Yeah, but again, it's all about whether you're doing direct marketing or doing advertising. It's all called advertising, but most of what's happened, I'll put it this way: the most quoted thing I've written in the last 20 years is, "Madison Avenue fell asleep, direct marketing ate its brain, and it woke up as an alien replica of itself." That's what we have now. We're dealing with the alien replica. We're not dealing with advertising as it used to be, in spite of the fact that not one brand known to the world has been made by tracking-based advertising, ever. In fact, I addressed a class the other day in which the challenge to the students was to name one brand that was made by tracking-based advertising, and not one student could come up with one. In the meantime, however, all the brands, just everybody there, want to do direct marketing.

Doc Searls (1h 1m 54s): They all want to go straight at you for stuff you've done, stuff they think you might want to buy, because they're tracking you. And the truth is, Facebook could do a really good job of that, and so can Amazon, because we're already buying stuff from them. Amazon could do a better job than anybody at that, and they're doing it a lot. They're giving you direct ads for things you looked at, or things they think you might want to buy because you bought something like that on Amazon before, or just bought

Kyle Rankin (1h 2m 21s): the same thing, and they just put up the same

Doc Searls (1h 2m 23s): thing.
And it's a whole other creepy thing with that. The thing is, you know, I don't know what the hell Apple is going to do, but I know what Google does: they basically follow you everywhere, and they make some guesses based on that. But Google makes 80-some percent, I don't know exactly what it is, but the vast majority of its money from plain old search advertising, which is based on: you looked for the height of some mountain, and you're going to get hotels nearby. That's it. I mean, and the truth is, all the tracking in the world isn't going to make that more accurate, especially if you're a kid who is looking it up to do a book report on something that had to do with that mountain. And another fallacy behind it is that we're buying something all the time. Doc Searls (1h 3m 3s): We're not. Most of the time, we're not buying a damn thing. So, you speak first. Katherine Druckman (1h 3m 7s): Yeah, no, I feel like I always buy. Doc Searls (1h 3m 10s): Oh, that's nice. Yeah. But I mean, it's another reason it's a tough one. I know two alpha geeks, one of whom I'm pretty sure is known to you guys too, and I won't name who it is, who told me recently, when I tried to recruit this person to the cause that I'm working on, I won't even give the gender away, said: I have to tell you, I really like targeted advertising. I have no problem with the whole thing. I totally trust Google. I trust Facebook. I trust all of them. They're really good at what they do. I want to see relevant ads. What's that? No, I won't give the person's name. Katherine Druckman (1h 3m 47s): Let's assign them a UID. I think it's not uncommon. Doc Searls (1h 3m 53s): It really depressed me. It totally depressed me, because this is a person I really wanted on our team, you know? And it's like, oh God, you know, well, there we go.
Katherine Druckman (1h 4m 1s): So I think that's the thing, going back to your on-off switch. I think the four of us would all love for there to be a switch where we can just turn it all off, not an opt-in. Doc Searls (1h 4m 12s): Never let it come on, period. You know, Katherine Druckman (1h 4m 15s): In order for that to exist, I feel like the only way that could ever happen is enough consumer demand, and the only way that happens is if people truly, truly understand what we're giving up by participating in this ecosystem. Well, but let's say in theory that you, Kyle Rankin (1h 4m 38s): In theory, let's say that you created a phone that didn't have the notion of a tracking ID for advertising, Katherine Druckman (1h 4m 45s): And somebody might've done something like that. Kyle Rankin (1h 4m 49s): Hypothetically, you could just browse the web like it's a computer, and maybe have ad blocking and all of this stuff enabled, and there's no monetary incentive for an application on that device to track your data and sell your data, because it's not part of the economics of that device. It's simply a computer. And then you just fall back to the traditional: I'm using the web, and a website may try to track me. But again, there are tools that you can install in web browsers to limit quite a bit of that and get a different experience. So I'm saying that, to me, the key is putting the control back in an individual's hands. The user needs to have the switch that can turn it on and off; they need to have the control. Kyle Rankin (1h 5m 34s): And right now, with most of the devices that people buy, the control is in the vendor's hands. It's not in the user's hands. It's not us. Like, for example, earlier you made the comment that you can flip the switch that turns off the tracking ID.
And as far as we know, it does what it says it's going to do. Right. And that's because there's no way for you to know, and that's intentional, right? You are hoping that that's the case, just like you're hoping, when you turn off location services on an Android phone, that it's actually doing it. But in that case, it actually didn't do it, you know. But again, there's no way to audit those cases. So what you need is individuals to be put back in charge of the devices that they have, to own those devices instead of renting them. Kyle Rankin (1h 6m 21s): And then, when they flip the switch, it actually means something. Shawn Powers (1h 6m 24s): Then is the ideal where we go back to the soap opera type thing? I mean, you know, soap operas had soap ads because housewives would stay home, and advertisers would want to market to women at home who were washing their clothes, thus the name soap opera. I mean, every website we go to, then, it's going to be that, right? Like, well, what kind of people might visit our website? And then, you know, advertisers go after that. I mean, is that the ideal? Doc Searls (1h 6m 55s): And that's called advertising. That actually makes brands. Well, I don't think that ever goes away, but I think, I mean, the weird thing is that was the first game, and it could also be the end game. It would be the end game if, in fact, both customer demand and regulators came together and said: no more tracking, no more of that stuff. You have to ask for it. It's not that you opted into it; you have to say: I want this, I'm interested in advertising for this.
And I think then we're looking at a completely different system. Shawn Powers (1h 7m 33s): And you're going to have website owners, you know, who are, I mean, okay, so say we're Linux Journal — you know, we're no longer one, but we were. So then we're back to gathering as much information as we can about our readership to try to get advertisers to give us money. So, I mean, there's still a level of creepy there; it's just that we're not directly taking it from them. Doc Searls (1h 7m 57s): It's not creepy. I mean, for a narrowcast publication like Linux Journal, it's not hard. We're after Linux users, people who care about Linux. There are a lot of ways you can characterize that, right? Is that okay? Shawn Powers (1h 8m 11s): I mean, how much information does Linux Journal, this theoretical past Linux Journal that doesn't exist now, you know, how much information can we gather on our users, and how much is fair for us to share with advertisers? Right. We have, you know, like 82% male readership. Is that something that we know? Okay, Katherine Druckman (1h 8m 30s): No, we had 99. It was like 98 on a good day. It was 96, maybe. We should have been selling banner ads. For sure. I know, Shawn Powers (1h 8m 43s): You know, the creepy factor is still there a little bit. It's not as directly creepy, but I think it is, how much information do, but, Doc Searls (1h 8m 52s): If you're in a sellers' market, okay, I mean, how hard does it have to be to sell a Super Bowl ad? You know exactly who's watching. I mean, it's a similar thing. You've got sports fans watching, so you advertise to them. I mean, yeah. Shawn Powers (1h 9m 7s): And here's the other thing: in your example, you wouldn't actually know that demographic data, necessarily. That's my question. So yeah.
You're just getting someone using a web browser that's visiting; you know, this person had this web browser, maybe they liked the content. The way you get all that demographic data is through the creepy tracking. Otherwise you're just making assumptions and guesses, which is how it used to be, right? You're just surveying customers and stuff. But yeah, with paid customers, you know, I guess that was my point: how much information, is everything anonymous then? I mean, if I subscribe to a magazine, you know, do I have them send it to a random PO box with a UID instead of my name, so that, you know, there are no demographics? Or do they, I don't know. Shawn Powers (1h 9m 59s): I mean, there's still a level of it. Doc Searls (1h 10m 0s): You have to give up your name, but ideally what would happen when you subscribed to something is that you would have an agreement with them that says: you're not selling this to anybody. We don't have that either, for the most part; you know, a lot of the things we subscribe to do sell our addresses and so forth. So the selling of personal data goes back a long way. Heck, I mean, there are states that sell driver's license data, hunting licenses, all kinds of things. I worked for a brief time consulting for Acxiom, which is one of the more evil companies out there, because they were wanting to change their evil ways. And when they told me what all they collect, I mean, there are states that sell personal information. Doc Searls (1h 10m 42s): They've been doing it for many decades, and that's not right either, you know? So to me, it's like, if we're going to get serious about privacy, we have to go all the way back. We can't let so much go; we have to go all the way back, and we have to spread it out to the physical world as well. Shawn Powers (1h 10m 58s): So I guess, go ahead.
Katherine Druckman (1h 11m 0s): Oh, sorry. I think because we're so ingrained, so much a part of this creepy tracking method of getting demographic information, we've kind of forgotten the way that we used to do it a hundred years ago, which was to just ask people. You know, we didn't have to stalk them to find out this information. If you have a reasonable sample size, you would just put out a survey to your readers or, you know, your web visitors, and just ask them politely. They could participate or not. And, you know, that was very valuable information, and we didn't have to do anything shady to get it. Doc Searls (1h 11m 39s): Yeah. And an interesting thing too, and maybe I'm wrong about this, you guys may know more about it than I do, but I believe we were on our way to at least treading water, if not being profitable, selling plain old advertising at Linux Journal before we got killed. Katherine Druckman (1h 11m 53s): Were we selling advertising? Oh, no, Doc Searls (1h 11m 55s): We were. We actually had somebody who was the head of advertising. And yeah. Shawn Powers (1h 12m 1s): I mean, that was my understanding, that it was very close to that, you know, basically. Yeah, it would have been possible to kind of break even or stay afloat, right? Doc Searls (1h 12m 14s): Yeah, that's what it was. I mean, it was, and Katherine Druckman (1h 12m 18s): We had removed all of that. We didn't have an ad server, so it was just a fixed spot: if you wanted to sponsor the publication, you had a fixed logo or whatever that would appear on all pages or some pages or whatever. And that's, at that point, what we were doing. We didn't have any kind of trackable ad server, which was nice.
Actually, it was a, Shawn Powers (1h 12m 41s): And here, I mean, this is the depressing thing. I know we're probably getting close to the end here too, but we were trying to do things right, and we're no longer in business. Doc Searls (1h 12m 53s): But I think the reason we're no longer in business is not because we tried to do things right. It was a wacky thing that happened that's probably not worth going into, but I don't think it was our fault, or anything to do with trying to do things right. I think it had to do with, for whatever reason, the company that owned us wanting to kill it at that time. And, you know, they had their reasons, but they were nothing to do with us. Yeah. Katherine Druckman (1h 13m 23s): Yeah, the world may never know. I wouldn't assume it was because that was a faulty model, though. I think people want to support good content. Doc Searls (1h 13m 33s): Yeah, they do. People are inherently generous, actually. Yeah, they are. I give people the benefit of the doubt on that. I'm an optimist, in spite of my crankiness. Katherine Druckman (1h 13m 46s): Wait, wait, we have some new Patreon, Patrion, Patriot, I don't even know how to say that, we have some new generous people contributing to our podcast. So thank you to those people. Yeah, I know, it's pretty exciting. Doc Searls (1h 13m 58s): And thank you. That's great. Yeah. And Katherine Druckman (1h 14m 0s): At some point I will investigate other methods that are not that particular site, Patreon. I mean, I'll think of other ways, if you guys want to be generous to us; that would be nice. Anyway. So yeah, as Shawn said, we've been talking for possibly forever, because there's always so much when we get together. But yeah, so the Cellebrite thing is problematic. Advertising is problematic.
I think that's the summary here. What's the takeaway? And also boner pills. And also boner pills. Thanks, Shawn. Sounds like a product you could sell: probably the Problematic. Katherine Druckman (1h 14m 41s): The Problematic: it vacuums and dusts. If things aren't bad enough for you, there's the Problematic. Okay. Well, if you've made it this long, thanks for hanging in with us while we solve the problems of the world, or attempt to, or at least name them. Welcome to the depression podcast, where we talk about civilization and information. We'll never get that off switch if people don't know they want it. So on that note, thanks for joining us. And