[00:00:00] Katherine Druckman: Hey everyone. Welcome back to Reality 2.0. Doc Searls is back with us today, and so is Petros Koutoupis, our frequent guest slash guest host. And in addition, we have Don Marti. Don, who you may remember, has been on the show quite a few times. Don is, among many things, an expert on online privacy, and especially issues like the CCPA, the California Something Privacy Act. I remember what... [00:00:27] Doc Searls: Consumer. Hard to say that word. [00:00:29] Katherine Druckman: The California Consumer Privacy Act. I had to say the words Confidential Computing Consortium several times in a row yesterday, and my tongue is messed up. Thank you for joining us, Don, especially at short notice. We tend to ambush people here. Doc is great at that. I appreciate that skill. But before we get started, I wanted to remind everyone to check out our website at reality2cast.com (that's the number two in the URL), where you can sign up for our newsletter that we occasionally send out. When we do, it's very good, I promise. And check out the links attached to each podcast episode, and other things. Yes, thanks y'all. Thanks for coming on today, again at a little bit of short notice. [00:01:12] Doc Searls: Yeah, the short notice was occasioned by Katherine asking me if I was using Permission Slip, and I thought, do I need a permission slip to do anything? I didn't know what it was, though I knew Consumer Reports was working on something like that. In the spirit of everybody not knowing everything, I didn't. But as soon as I looked it up, I thought, oh, this is actually a cool app. And I downloaded the app, and I immediately thought, Don must know more about this than not only me, but than most people, because he's been involved with Consumer Reports and Mozilla and other allied efforts.
So, Don, give us, you know, the framus on, I guess, two things. One is the context, which is that this was occasioned by the California Consumer Privacy Act, which gives people the option to demand that their data not be sold, which is basically a way of saying "stop tracking me" and all that stuff. And people in California, and people elsewhere, will see that on websites: the option to opt out of having your data sold. Anyway, I'm saying too much. You say it instead. [00:02:28] Don Marti: Yeah. Well, to get back to the basic legal level, there are really two kinds of privacy laws in the world. Most of the privacy lawyers out there are gonna say I'm vastly oversimplifying this, but from the point of view of writing code to implement privacy tools, there are two kinds of privacy laws. One is consent-based, like over in Europe, and the other is opt-out, like the CCPA, and now the CPRA, which is kind of the follow-up law in California. So when they originally wanted to do the CCPA, some of the discussions said, well, hey, let's just copy the GDPR in Europe. Let's make all the companies that want to use your personal information go get permission for it. And the lawyers went over this a little bit and decided, well, it's more likely to hold up in court in the USA if we don't put in some new consent requirement, but instead create a requirement for an opt-out. And that's where we got the CCPA as working fundamentally differently from the GDPR. The problem, of course, with opt-out is that there are literally thousands of companies that have or could have your personal information. It would take you weeks of effort to opt out of every company that you might want to opt out of, even if you could get through each one with just five minutes of work. And at the time they put the CCPA together, they decided, well, in order to make opt-outs practical, we've gotta add two other features.
One of them is the Global Privacy Control, which is a signal your web browser or some other piece of software can send that says, I'm a user who wants to opt out of stuff, so you don't have to do it manually. The other feature that was built into the law was something called an authorized agent, where you can sign permission for an organization to go and opt out of the sale or sharing of your personal information for you. So instead of having to go to each company individually, you would sign up with an authorized agent, and the authorized agent goes out and opts out with a whole list of companies that they maintain. Permission Slip from Consumer Reports is a front end to an authorized agent service. [00:05:58] Doc Searls: So the authorized agent, and there's been a lot of conversation about this on some other lists I've been on, is sort of a very abstract concept. And it's also confusing, because some people call your browser an agent, your email client an agent. But this is an actual, active, external instrument that you use, provided by a company. It's provided by Consumer Reports, it's this free app called Permission Slip, and you can use it to selectively opt out of the collection of data about you. You can ask them to get rid of all your data if you want. It's been interesting to me, because I started using it yesterday, and okay, I have a CVS account, I have a Home Depot account, I don't know if I have an AMC Theatres account. You kind of flash through it like you flash through cards in solitaire, looking at the different companies that might have data on you, which I guess is Consumer Reports' best guess that a lot of people might have these. Or maybe, at the other end, and Don, you can correct me if I'm wrong about this, Consumer Reports knows that there's an instrument on the CVS end, or the Home Depot end, where the lawyers there have set up a way to do this.
They can yank out, selectively, the data that you've asked them to yank out, or not to sell, or whatever else to do with it. It's almost gamified that way. But I do have the feeling that there is a well-meaning agent that's working for me, and that's a cool thing. I imagine we could have these things for hire, but this one's free, and the labor involved is really pretty low, cause it's all kind of... [00:07:44] Katherine Druckman: I use a paid service, actually, for something similar. I use something called DeleteMe. We talked to Rob, the founder of DeleteMe, actually, on the podcast a while back. But this is interesting because it's free, the interface obviously is completely different, and so is the mechanism. But it's interesting. It was a good move on Consumer Reports' behalf, in my opinion. [00:08:03] Don Marti: Yeah. It's a substantial and growing list of companies that Consumer Reports has figured a lot of people in California probably have an account with, and probably have at some point shared their information with. And of course there's no way to get the list to be perfect, but I think it's a good balance between retailers that people actually buy stuff from, like Doc mentioned Home Depot, and some of the lesser-known companies in the shadows, like Oracle, which is of course famous as a database company, but is also a data broker who will sell information about you. The reason we put that particular one on there is because I did something called a CCPA right to know with Oracle, and it came back with a huge file. They've got a lot of information on me. They've got my income code, my neighborhood code, information about my family. I'm a soccer mom of four. [00:09:24] Katherine Druckman: Very... [00:09:25] Doc Searls: Like these erroneous things, yeah.
Like Acxiom: I haven't lived in Northern California for almost 24 years, but it still thinks I live there, you know? And I'm not gonna correct that, which is another thing. [00:09:41] Don Marti: Right. [00:09:42] Katherine Druckman: Speaking of where we live, something that Don just said that I wanted to mention, and that is people who live in California. Permission Slip specifically addresses that, because I'm sure it's a question on all of our minds, and it was the first thing I wanted to ask: can I use Permission Slip even if I don't live in California? And they say Permission Slip is based on the CCPA, which defined new data rights, et cetera, et cetera. The bottom line, it says, is: if you live outside California, you can use it to send the requests; however, your requests are likely to have a lower success rate and may take longer. So... [00:10:21] Doc Searls: Cause they're demotivated to deal with it. And here's an interesting thing that I think these companies would like to know about. I haven't hit Kroger on this yet, but when we came here to Bloomington, Indiana, where we're speaking from, there's a Kroger nearby, and, you know, I wanted a Kroger card. Basically, like most of these retailers, the grocery retailers especially, they have tiered pricing. They have the real price for the members, and they have a more expensive price for the non-members. They say that's the real price, but it's not; it's an inflated price. And I looked at their terms and conditions, which basically say: we care about your privacy, bunch of paragraphs, we care about your privacy, bunch of paragraphs, and then way, way down it says, oh yeah, we give everything to our third parties, and they can do whatever they want with it, you know. Our data is everybody's.
And so I did what I think a lot of people do, and what Don, in his case as a soccer mom, did for a similar reason, right? I just faked up something. It was somebody else, and I get the discount. I have a little key fob on my key chain, it's the Kroger one, and it's got the barcode on it. And I never get any promotional materials from them, cause it all goes to a fake address, you know. But I get the price I want. Now, if I have an agreement with them that's based on an opt-out of the selling of that data, or the giving away of that data, I might be willing to give them my real address. I might be willing to deal with them in a more friendly way, and not game them in that way. And I don't know whether there would be enough of that, or whether that kind of behavior is researchable enough for anybody to want to study. But I think it's a factor. I think fewer people will lie and hide if they have a sense that they have a genuine relationship, based on a kind of trust that could get scaffolded up from this kind of agency. [00:12:43] Petros Koutoupis: A few things. Number one, am I the only one who is under the impression that Doc is recording from an underground bunker? [00:12:55] Doc Searls: Are you kidding? It is true. So, for those of you who are not seeing this (we're all seeing each other, but this is an audio podcast), I'm in the basement of our new house in Bloomington, a worker's cottage built in 1899. And behind me on the wall is a painted wall of limestone, because you dig down two feet here and you hit limestone. Most of the state capitols in the country are made with Bloomington limestone. The Empire State Building is clad in Bloomington limestone, and all of southern Indiana is this kind of limestone.
And so, just to visualize this wall for people: some of it is just rough-cut stone, and some of it is obviously milled stone with smooth surfaces, roughly rectangular, that, you know, are castoffs from the mining and milling process. Somebody dragged this home on a horse and buggy or something, at that time, to the top of this hill. [00:14:00] Petros Koutoupis: The second thing I wanted to mention was, since the start of this conversation, you've been talking about memberships to anything and everything. I imagine your wallet is about four inches thick, full of cards for all the memberships that you have everywhere. [00:14:19] Doc Searls: I know people who have wallets like that, or just even a stack of things, you know? I mean, there are actually things you could buy, that you would put in your pocket or your purse, that hold all those. But increasingly today you don't have those. You just give 'em your phone number, or... [00:14:42] Petros Koutoupis: Phone number, right? A fake phone number or fake name. It reminds me a bit of me: whenever I fill out something like that, I use, like, first name Joe, last name Mama. You know, just something goofy. [00:14:57] Doc Searls: And of course, a peculiarly high percentage of people were born on one-one of some year, right? Because that's an easy one for them to remember. That's not their real birthday. Anyway. So, Don, I'm wondering: okay, this is a real big jump from the research that Consumer Reports did, and if you weren't involved with it, you certainly knew a lot about it, where you could manually do what they're doing now. You could manually send something off.
I tried it with a couple of companies, and it immediately got into a morass where I'm hearing from their lawyers, and, you know, it was pointless. But there's an evolutionary jump from that kind of experimentation to a real instrument like they have now in this app, which is only available on iOS so far, but works. So: what comes next? And I think there are a bunch of different ways this fans out as possibilities. [00:16:00] Don Marti: There were two rounds of research. The first round was giving people the instructions to go do CCPA opt-outs, or "do not sell" emails, manually. And Doc, I think you were actually signed up for both of the projects. The second research project was, we asked a list of people to sign a permission letter giving us, as Consumer Reports, the option to do an opt-out as an authorized agent for them. That was kind of a funny project, because we had a calendar of: well, we're gonna need this number of research participants, therefore we've gotta send out this round of email, then we're gonna do a follow-up, then there's this other newsletter that we can put it in. So we had all these rounds of recruiting participants for the authorized agent study. We sent out the first email and the whole thing filled up right away; we didn't have to do any of the others. It filled up incredibly quickly. So, in that round of research, we went through the process of doing opt-outs, and frankly, a lot of the opt-out processes were completely broken. We got email bouncing. We got broken web forms. In the case of one company, we actually ended up printing out all the opt-out paperwork and sending them a nice stack by postal mail. Shortly after, that company came back and said: please use our new process, we fixed it up, it's all tricked out, please use it. Because it turns out that, and here's a "citation needed" factoid.
It costs the average major corporation $40 per page to process paper correspondence that you send to them. And of course, we sent them a big stack. So there's a big incentive for companies to go through the steps to get it right. The recent announcement that just came out on Friday, January 27th, is that Attorney General Rob Bonta actually mentioned Permission Slip in an announcement about Data Privacy Day. So that brings more attention to the project, and is likely to get a lot more companies interested in, and motivated to, handle these opt-outs quickly. [00:19:11] Katherine Druckman: So it was just released, and... was it November? Do I have that right? October, November? It's quite... [00:19:17] Don Marti: Yeah. I've had the trial version for quite a while. I don't remember when the official release was. [00:19:23] Doc Searls: Yeah, it's not on their homepage. I'm kind of surprised. And if you look it up, you get to their lab, and it's not highly SEOed, to put it that way. I don't know if that's just because they kind of wanna let it dribble out there, do what it does gradually, and iterate improvements on it, or if, you know, it's just not a top priority for them. I think it's a fabulous idea, and I'm especially interested in what happens if some critical mass of people start using it (or a critical-enough mass, not quite a mass), so that it causes companies to kind of rethink, and then re-routinize, the way that they deal with customers, or the way they look at loyalty, or how it starts impacting some of their systems. I mean, does the cost of dealing with this stuff exceed the cost of giving people's data away and doing whatever else they do with it?
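Part of that cost question is automation: of the signals discussed here, the Global Privacy Control that Don described earlier is the cheapest for a company to honor, because it arrives as a single HTTP request header (`Sec-GPC: 1`) rather than a letter or a web form. A minimal sketch, in Python, of how a site backend might check for it; the function name is mine, and the plain-dict header lookup is a simplification (real HTTP header names are case-insensitive):

```python
def gpc_opt_out(headers: dict[str, str]) -> bool:
    """Return True when a request carries a Global Privacy Control signal.

    Per the GPC proposal, a user agent expresses the opt-out preference by
    sending the header `Sec-GPC: 1`. Any other value, or no header at all,
    means no preference was expressed (it does NOT mean the user opted in).
    """
    return headers.get("Sec-GPC", "").strip() == "1"


# A CCPA-covered site would treat this request as a "do not sell or
# share" opt-out for the requesting user, with no paperwork involved.
request_headers = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
print(gpc_opt_out(request_headers))          # True
print(gpc_opt_out({"User-Agent": "X/1.0"}))  # False
```

On the browser side, the same preference is exposed to page scripts as the `navigator.globalPrivacyControl` property, which is what lets one setting opt you out everywhere instead of site by site.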
[00:20:31] Don Marti: Yeah, there's definitely a precedent for this whole situation, and that was a law called RCRA, the Resource Conservation and Recovery Act, from the 1980s. And RCRA was really interesting, because it required companies that maintained hazardous materials on the premises to turn in a report to the EPA: we've got this quantity of the following chemicals at this location. And what ended up happening with RCRA is, so many companies had a random shelf full of random hazmat in the corner that they weren't really getting much value from. [00:21:30] Doc Searls: It's like presidents having classified documents, right? [00:21:36] Don Marti: Well, yeah, it's stuff that can only impose a recordkeeping responsibility on you, plus possibly a cleanup expense if somebody spills something, or there's a fire or an earthquake. So the result of RCRA ended up being not so much that companies started generating these huge reports on every can of hazmat. It's that a lot of companies got rid of the stuff, and they redesigned their processes to avoid having to report on all that random hazmat. The supermarket loyalty programs, Doc, that you mentioned are a really good example, because those programs collect way more data than the supermarkets actually use. And at some point there's going to be a reevaluation: is it worthwhile to do all this data collection and regulatory compliance? Or, since we have these regulatory and other limitations on how we can use this information anyway, should some of that investment go to other places? [00:23:14] Katherine Druckman: So you mentioned these grocery store loyalty programs, which kind of leads into something that I've been thinking about. First I have to say, and I feel like this sounds like a disclaimer, even though maybe it's not: I think the Permission Slip app is a fantastic initial step.
It makes this whole process so much more user-friendly. However, there's a limitation, and I don't even think the app could address this yet, because the law isn't in place, or I don't know if there are laws in place. There are certain things I'm happy to share with maybe a grocery store, maybe Starbucks. I want my points. I'm fine with them tracking what my orders are over time. I'm okay with the grocery store knowing that I like to buy this brand of milk, or whatever. But like you say, they collect all of this unnecessary stuff. So my struggle is that there lacks a mechanism for consenting to certain collection but not other data collection, right? When presented with something like Permission Slip, it's an all-or-nothing thing. It's like: well, delete everything from McDonald's, delete everything from Starbucks (I saw McDonald's; I don't think I saw Starbucks). Whereas really, I'd like to say: delete all the stuff that's creepy, but it's fine for you to track my lattes and give me points. How do we get there? [00:24:41] Don Marti: Yeah, I think there's going to be a set of escalation steps that people take with companies with different sets of practices. So if you have an extremely basic loyalty program that tracks purchases, but doesn't track your location, doesn't do face recognition, doesn't do all that stuff, then those kinds of companies will tend to be a lower priority for opt-outs. If there's a very intrusive set of marketing practices that does face recognition, or fine-grained location tracking, or something else that people perceive as creepy, then those will tend to get more attention from authorized agents. So there's been a lack of focus on this.
In the whole marketing business, there's been a lack of user research for understanding privacy norms for quite a while, and it really shows. A lot of marketing people are trying to come up with a single set of rules for processing everybody's data, and that's just not how it works. There are some people who like personalized content, or would rather see personalized ads than general-purpose ads. Personally, as a privacy nerd, I don't understand those kinds of people. But if a platform gives those people the ability to turn their personalization on, and gives the privacy people the ability to have the system reflect their norms, people are gonna be a lot more comfortable with trusting companies to use the appropriate information in the appropriate ways. [00:26:43] Katherine Druckman: I think it goes back to Doc's proposition, which he makes, oh, many, many times over: that we individually need to be in control of our norms. We need to be able to broadcast: this is what I'm comfortable with, and this is what I'm not, because that does vary tremendously. [00:26:58] Doc Searls: Well, think about the way we deal with secrets, right? I tell any one of you a secret; the assumption is that the other party is going to keep the secret. Now, we all know how gossip works, and that doesn't always happen, but there's an informal understanding in the vernacular of everyday life that the other party will keep a secret. That's kind of what we expect of companies with what they call first-party data. It's actually first-party data for us, and second-party data for them. Either way you look at it, you don't want it to go to third parties. Now it's pro forma for them to sell it or give it away to third parties, which is why we're in the mess we're in now, and why we have these laws and these instruments. And I was looking here at the app.
I advanced to Amazon: the list of stuff that Amazon has. I spend more with Amazon, probably, than... [00:28:05] Katherine Druckman: Than you'd like to. [00:28:06] Doc Searls: ...probably any other company. I hate to say it, but overall, that's probably the case. And there's: identifiers, account information, demographics, financial, online tracking, physical characteristics, location, cameras and sensors (that's their cameras and their sensors, by the way: the Alexa, et cetera), jobs and education, purchases, social life, communication, and then "other." So the interesting thing there is, I might be okay with them having some of that stuff. I don't have an Alexa; I wouldn't have one. But it's fine with me if they know a lot about what I buy; there's no problem with that. I have a big problem with them giving it away to somebody else. The cool thing about this tool is that it forbids the selling of the data, and the CCPA conceives of any of this stuff such that if you give it away, you've sold it. That's the characterization, and that's fine with me. There's kind of an Occam's razor there, I guess. There's a sharp distinction between the data they need, and would like to have, in order to better serve me as a customer, and data I don't want them to give to anybody else, which as far as I'm concerned is not their business, you know? And that should be the default. Maybe where we get at the end of all this is to turn any data you're gonna turn over to a third party into the RCRA example: the hazardous waste that you have. And by the way, I have a friend, who some of you may know, who actually left Oracle in part because of its practices with that stuff, for a better job somewhere else too. And I did not know, before they told me, that Oracle was in that business at all.
[00:29:57] Don Marti: Well, if you're going to do a CCPA right to know, I would probably put Oracle down as one of the top companies to start with. Oracle has lots of information about you, and it's probably fairly inaccurate. So far I haven't met anyone whose Oracle data was particularly accurate, but there's an awful lot of it. The other company that I would recommend, for people who want to get started in the fun habit of CCPAing, is Verizon. They've got huge amounts of data on, largely, mobile phone shopping habits, like intent to switch carriers, that kind of thing. But also some really weird stuff, like: how much has this person accepted the customs of the USA, or something like that. There's really a lot of stuff in your Verizon data that is gonna stand out, not so much because it's this kind of psychographic or attitude data, but because it's just a numeric code, like "likelihood of this individual to be an influencer on social networks." [00:31:33] Petros Koutoupis: How would they even come to such... like, what data would they collect to come to such conclusions? I don't know. [00:31:41] Katherine Druckman: All of it. [00:31:43] Doc Searls: It's some AI that they can apply to it, I'm sure. [00:31:46] Don Marti: Yeah, they probably have some vendor who claimed a mystical data-driven model that would attach a social-influencer value to everyone they have. [00:32:03] Petros Koutoupis: And then all of a sudden we're all soccer moms, right? [00:32:06] Doc Searls: Yeah. [00:32:07] Petros Koutoupis: But with Verizon. [00:32:09] Don Marti: Who knows? Yeah. Colorado's gonna be more interesting, though, because under Colorado's privacy law, they're not just allowed to send an alphanumeric code for a data point. They actually have to send it in a comprehensible form. So I'm really looking forward to comparing notes with someone from Colorado.
Can you please get your Verizon data and let me know what all these mysterious numbers mean? I want to know if a 14 is good for an online influencer. [00:32:55] Doc Searls: That's interesting. I don't have a Verizon account; I had one a thousand years ago. But I have T-Mobile, and I'm wondering what their deal is as well. [00:33:07] Don Marti: Oh, I've never had a Verizon account, but Verizon has a lot on me. [00:33:12] Doc Searls: Really? Wow. They come by some of it, I guess, all secondhand. They're the end customer of a third party, or they are themselves a third party, or whatever. [00:33:25] Don Marti: Well, Verizon Media... [00:33:30] Doc Searls: Oh, right, right. They ate... [00:33:33] Don Marti: ...and... [00:33:33] Doc Searls: AOL, right. They ate AOL, or something like that. [00:33:36] Don Marti: Right. So Verizon at one point was the owner of Yahoo and some companies that got lumped together as Oath, and then they changed the name of Oath, and then Oath got spun off. So there's some mergers-and-acquisitions-type stuff that may be behind how these companies have all this information. But really, a lot of people don't have time to keep up with who bought who, and who might have your information from what source. So you really need an authorized agent to go out and do it for you. [00:34:27] Katherine Druckman: It would be impossible to do this. I mean, it wouldn't be feasible without something like, going back to the original topic of this app, without something like Permission Slip, or one of the many paid services like the one I use (there are others available; I can't think of the names). It's just not feasible, and that's the frustrating part, or that has been the frustrating part. That's why I pinged Doc about Permission Slip. I was like, ah, did you see this?
This is great, it's free, you know. So anyway, that's the rub: okay, great, there are these consumer protection laws, but getting them actually exercised is a whole different matter. [00:35:08] Don Marti: And at the time they put the CCPA together, they did realize this. They did realize that you can't have an opt-out-based privacy law without giving people the ability to use an authorized agent, and without giving people the ability to use a Global Privacy Control in their browser. It's just taken the companies a while to get into compliance, because a lot of them just said: all right, fine, we'll just turn on our GDPR software for California too. And you can't get to where you need to be in California by doing a subset of what you had to do for Europe. [00:35:57] Doc Searls: So I've done a bunch of experimenting using a VPN when I was in California, which was most all of January anyway, and before that, I'm there about a third of the year, something like that. And it's interesting to me that there are these companies like OneTrust. I mean, when you see the consent popups and all that stuff on a website, and if you bother to customize your own opting out at that level: none of the sites are doing it themselves. They're hiring some other company to do that for them. And OneTrust is the one I think I've seen the most, or just the one I best recognize. And I'm wondering if they get into this game somehow. Like, wait a minute, they could just add something to what they're already doing, so that the CCPA opt-out signal goes to their gizmos and they handle it, cause they're keeping the records, I assume. And I don't even know if they are keeping the records. There's legally no way that you could go back and audit them. You don't have the mechanism at your end to audit them. They give you a cookie; you don't even know what the cookie says, and there it is.
But it might be a game for them. [00:37:09] Don Marti: Yeah, and it definitely is. There's a project called the Data Rights Protocol, which is another thing from Consumer Reports, that is intended to automate these flows. So right now, most of the time, you use Permission Slip and it's automated and super easy from the user side. But then on the Consumer Reports side there's some manual web stuff and emails and spreadsheets and whatever going on. The next generation of this is to make things less expensive on both the company side, including whatever service provider they use, like WireWheel, OneTrust, whoever, and on the authorized agent side. So as the user base of something like Permission Slip grows, it won't mean more manual work for either the authorized agent service or for the companies being opted out of. So check out the Data Rights Protocol, and I'll send you a link. [00:38:35] Doc Searls: I just put one in our chat here. It's at datarightsprotocol.org, all one word. And it's Consumer Reports, legal hackers, and the MIT Media Lab working together, but Consumer Reports is in the center of that, so I imagine they're the main outfit there. Interesting. I'm wondering where this ends up. I mean, at some point it becomes normal for companies to look at this and say: boy, the cognitive overhead and the operational overhead of dealing with all this stuff is pretty high. It's not just that this stuff is a legal liability; it's this kind of toxic waste we're keeping in the back room, a lot of which isn't being used. But boy, it sure is a pain in the ass to try to use all this stuff the way we were accustomed to, back when there was no pushback at all, once the pushback gets high. And maybe, I mean, I was getting to this point, but: you wanna see how to do business? Just look at Trader Joe's.
They have none of this stuff, and people love them. You know, it's not a coincidence. The cognitive overhead of using Trader Joe's is as close to zero as you can make it, and the people in there are incentivized to find out what they can about what customers are buying, but not getting personal about it. [00:40:06] Don Marti: Yeah, and that's, I guess, one of the big misconceptions about the pathway to the post-surveillance economy. The biggest error about looking at this stuff is saying we need to balance the privacy interests of users against the economic benefits of surveillance marketing. And there are no benefits to surveillance marketing. Surveillance marketing is negative sum. Surveillance marketing investments compete directly with investments in content and investments in product innovation. And the biggest example of this so far is mobile gaming. Some of the changes Apple made to surveillance ads in iOS are having an effect on mobile game developers, which is, instead of making a bunch of minimal-effort games that they can then promote by exact surveillance investments, they're instead investing more in in-game content and gameplay, in order to overcome the lower ROI of the surveillance ads. So mobile games and mobile game ads are kind of a fast-moving economic laboratory for the advertising-versus-product-innovation market. And it's really interesting to watch this change taking place there. And I think that as we see lower ROI from surveillance marketing projects, both within companies and at the venture investment level, we're gonna see more of that money flow into either product innovation, into process innovation that can result in price cuts, or into content. [00:42:45] Doc Searls: There were so many quotable one-liners in that. [00:42:48] Katherine Druckman: Yeah. [00:42:50] Doc Searls: Hello.
In the challenges that we just said — I mean, it's such an important point that there really is not an economic advantage to surveillance, and there are economic advantages to spending that money on many other things. It's a war on the cost side of the balance sheet, right? I mean, it's not directly bringing in money. [00:43:14] Katherine Druckman: Yeah. [00:43:15] Don Marti: Yeah, well, it's a negative-sum game where the results are very easy to measure short term, and so you can put up good ROI numbers for a surveillance marketing project. And it's relatively easy to use surveillance marketing projects to get investment, whether at the VC level or the intra-company level. The problem with positive-sum games like product innovation and creating new content is that they tend to be slower and less accurate to measure, even though they create more value on average. [00:44:09] Katherine Druckman: It's interesting what you said about mobile gaming, because I do kind of see that. I myself play a couple of mobile games, and I see the innovation — the rapid innovation and change and experimentation, you know, trying to eke out a little more revenue. And I also appreciate what you said about how basically nobody wins here with surveillance advertising. We here know very well that it's not good for publishers and content creators. I don't see it as particularly beneficial for advertisers. And yet somehow we've forgotten as humans that there is another way to interact with potential customers, and I hope to see that return at some point. [00:44:49] Don Marti: Yeah. Well, just because it's a negative-sum game doesn't mean that there aren't winners. There are people that can ship a product from China and get it on Amazon and put up Instagram ads for it and make a profit at it. So negative-sum games still have winners. Um, yeah.
[00:45:16] Katherine Druckman: It's not a win. It's still a loss. [00:45:19] Don Marti: Right, right. There's product innovation right now that is not happening because money that should have gone to fund it has gone into a surveillance marketing project. [00:45:31] Katherine Druckman: I wouldn't consider the individual wins worth the negative impact it has on everything else. Is there anything else y'all wanted to make sure we covered before we go? [00:45:42] Doc Searls: I think this is great. [00:45:43] Katherine Druckman: You know, I'm glad we talked about the app. [00:45:46] Doc Searls: Yeah, I think this is great, and I hope we successfully promote Permission Slip. I think it's a great way to learn about what's going on, among other things. It's kind of a learning tool. And like I said, it's kind of gamified, and you get a different list for each company. I read earlier the long list at Amazon; there was a nearly as long list at Home Depot; there was a shorter list actually at CVS. And I think that might be because — at least as I recall, and I might be recalling wrong — they're in the medical business. They're a pharmacy to begin with, so one corner of their operation is busy observing pretty strong privacy laws, so they may be less inclined to do that. But the truth is that, frankly, it costs them money to ask you for your phone number at the checkout all the time, and for you to have to enter that kind of stuff. And what you get with their receipts and their promotions is ludicrous. I mean, they're not based on much intelligence. There's, you know, a discount on the toothpaste you just bought and you're not gonna buy again for another two months. It's pointless. And yet it's pro forma.
And at some point that pro forma stuff starts going the way of AM radio or something. It's just gonna be like, wait a minute, the cost of this is too high; we're just not maintaining it. [00:47:18] Don Marti: Yeah. And it really depends on people showing that they're harder to reach by surveillance than by other methods. So if you put in a bunch of privacy tools that make you completely invisible to third-party analytics, well, the problem with that is, according to third-party analytics, 0% of customers block third-party analytics. An authorized agent shows that you have these privacy norms in a way that's much harder for the companies to ignore, and harder for you to be completely invisible to them. [00:48:13] Katherine Druckman: I wanted to mention also, I'm gonna include the link to the announcement that Consumer Reports made about this app, where they also mentioned that they're actively seeking feedback. They acknowledge this is early, and it's highly dependent on user feedback in order to improve — you know, see what's working and what's not, et cetera. So if there's a takeaway here, I hope that our listeners, who I think are definitely the target audience for this app, use it. It's on its way for Android, I think; it says there's a waiting list. But also, provide feedback. I think we collectively need to be more vocal about our preferences and about our needs in this area, so I hope that that will happen. [00:49:00] Doc Searls: I just noticed, speaking of feedback, that they list themselves among the companies you can send a request to. And they do online tracking, including cookies — they do track. It's one of the things that bothers me about them.
Demographics, job and education — a list almost as long as Amazon's. And you can opt out of sale, which I will do right now, but I won't delete the data, because if I've opted out of the sale, it's fine with me, them having all that data — well, not actually the tracking data, but if they're not sharing it with anybody else, that's fine. So there goes the request. [00:49:38] Katherine Druckman: Oh, fingers crossed. [00:49:40] Doc Searls: And here's how it's gamified: it now gives you a little progress view of how they're drafting the letter, sending the letter, and then there was a — [00:49:50] Katherine Druckman: Like the Domino's — [00:49:51] Doc Searls: — that went away before McDonald's suddenly showed up as the next one. Right. You know, so McDonald's is opaque. That's an interesting thing. So yeah, get the app and play with it, and give 'em feedback. They're an extraordinarily valuable company and an extraordinarily valuable participant in the marketplace, so worth your interest and support. [00:50:23] Katherine Druckman: Absolutely. Well, thanks, everyone. Thanks, Don, and thanks, Petros, and thanks, Doc. And thanks, everyone who listened. [00:50:32] Don Marti: Thank you for having me on the show. [00:50:33] Katherine Druckman: Go download the app, and thank Don personally on the side there for helping get stuff like this going.