[00:00:01] Katherine Druckman: Hey everyone, welcome back to Reality 2.0. I am Katherine Druckman. Doc Searls is back with us this week, which is awesome, and we also have a recurring guest back with us: Dave Huseby, CEO and founder of Cryptid Technologies. We're going to pick up where we left off in last week's conversation, because there was so much to cover and we had really only gotten started. And now that Doc is back, we wanted to give a little refresher, both for Doc and for y'all and for me. But before we get started, I wanted to remind everyone to go to our website, reality2cast.com — that is the number 2 in the URL — where you can sign up for our newsletter and find supplementary links for this and other episodes. So with that, Doc, you were just saying you did catch up a little bit?

[00:00:52] Doc Searls: I caught the first half of the show, and I'm abbreviated today as well, because I have pretty tight time constraints. I like the idea of pseudonymous authentication — I liked pretty much everything I heard, Dave. Maybe you could just give me the Cliff's Notes from last week, for me and for the people who are listening to this show: not only what you're doing, but also what you're offering to Twitter. I thought it was really interesting, but if somebody asked me, "What did Dave say?" I wouldn't be able to say it.

[00:01:35] Katherine Druckman: Yeah, I'd have to go check the transcript again.

[00:01:40] Dave Huseby: Okay. So I'm going to try to take another approach to this, and I'll try to keep it very brief, because we talked for about an hour and a half last time. So I wrote this article a couple of weeks ago where
I was, in some ways, trying to defend Elon's position that we should authenticate all humans, but also then protect anonymity. He actually used the word anonymity, but the word we like to use in the industry is pseudonymity — I have a pseudonym. The article opens with historical context: the founding fathers, both before and after the revolution, used to publish pseudonymously. They would not publish their political feelings publicly under their own names; they would put them in their newspapers, they would pamphleteer, and it was almost always under a pseudonym. Benjamin Franklin was notorious for this. He started when he was just a teenager as Mrs. Silence Dogood, and then he pamphleteered in the 1750s as a tradesman from Philadelphia. And when the constitution was being argued about after the revolution, Hamilton, Madison, and John Jay were writing as Publius. So it's a really common thing. The point — and actually the tongue-in-cheek humor of that opening of the article — is that they were doing the 18th-century equivalent of shitposting on Twitter. They were saying edgy things: maybe the king is not the coolest thing in the world, maybe we should have our own country, maybe we should do all these things that normally get you in a lot of trouble. And they would make fun of each other and call each other out and so on. It's basically what you get on Twitter, just the 18th-century version of it — and they did it pseudonymously. Benjamin Franklin didn't sign his name to this stuff. So that's the historical context: what Elon is asking for is not without precedent.
And then the other point I made was that there are actually letters between Madison and Jefferson in which they discuss exactly how to write the Bill of Rights — exactly what should be in it, the political leanings of all the people interested in it, all the delegates from the different states trying to get the constitution passed. Madison encrypted those letters using a cipher system, not unlike the wheel cipher Jefferson invented. The point of all of that is that the people who literally wrote the constitution believed in and used technology to protect their privacy when they were making edgy political decisions and politicking, campaigning, things like that. So to argue that the First Amendment, the Fourth Amendment, the Ninth and Tenth Amendments — and the constitution itself — weren't intended to support this idea that we should be able to shitpost on Twitter is actually false. There's historical precedent: the people who literally wrote this did exactly that, and they used technology to make it possible. I was hoping that would put the argument to rest, although it seems to have poured gasoline on the fire instead. It's fine, whatever. But what we're offering — to get to Cryptid and what we're talking about here — is this: Twitter right now is a mess. People say, well, if you give people anonymity or pseudonymity on Twitter, then you're going to get a bunch of illegal content — and by that I mean exploitative content, stuff that exploits children, that kind of thing. And then if you authenticate everybody, you're going to get dictatorships persecuting people who speak out against the regime. Well, I would argue that Twitter has both today.
You see people who say things on Twitter actually getting in trouble, and they're still not doing a very good job policing illegal content — content that exploits people, harmful content. So what we're trying to do is improve the world by offering an alternative. The alternative, with our technology — which is open source, by the way, and documented, and we're building out tools to demonstrate it right now; in fact, demos are coming in a week or two — is that Twitter could require all humans to go to a KYC vendor. KYC means know your customer, so that could be your bank or a dedicated identity-verification company, where they actually verify who you are and then give that data to you as authentic data. I explained what that was last time, and we'll go into it a little more. But the intent is that you keep it private and never share it with anybody, because all you need to do with regard to Twitter is prove to Twitter that you have been KYC'd. It's like a proof of a heartbeat: Twitter, you don't need to know who I am, because this KYC vendor knows who I am. And by the way, we have some cryptography that allows law enforcement to unmask who I am under judicial review. What this does is get Twitter out of the business of policing content and knowing anything about their users other than that they have a heartbeat. I believe this perfectly matches what Elon was saying: I want to make sure they're all human, but we also need to preserve privacy. And so this would allow people to be pseudonymous by default, which is one of the core principles of user sovereignty. You would start out pseudonymous on Twitter. Now, that doesn't mean you have to stay that way — you could.
If you chose to disclose your identity, you could. So let's say you're a brand or a famous influencer, and you start up a Twitter account. It starts off with, "What's your pseudonym?" And you say, "Actually, I want to be me — it's actually Dave Huseby." You can say this is actually me, and because you've been KYC'd there's a cryptographic paper trail that Twitter can verify. The idea is that we'd at least be able to differentiate between bots and humans this way, and people who are humans could remain pseudonymous if they wanted to, or disclose who they are at various levels. It's a capability that hasn't really existed in the world, and the reason we don't have it right now is revocation. That's the problem: revocation at scale, which up until our research over the last two years hadn't been solved. So, to Katherine's question about revocation: we use set proofs that are super compact — only 48 bytes — and a single set can cover an unlimited number of members. You can think of it as a set of all humans, or a set of all KYC credentials, and I can prove to you that my credential is in that set. Then if the KYC vendor says, "Dave lied to us," or "He's not really Dave," and they want to revoke it, there's an easy way to remove it from the set, without the involvement of the other person — essentially pulling back your consent over that credential. There's some glue behind the scenes that has to go on, and that's fully decentralized as well. In short, the math exists to do what I call an infinitely scalable layer-two solution. We're not tied to blockchains per se, but in blockchain terms, that's what it would be like. These are state proofs.
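The set-proof mechanics Dave describes — a compact value that members can prove themselves into, and that the issuer shrinks to revoke someone — can be sketched with a classic RSA accumulator. This is a toy illustration with insecure, demo-sized numbers, not Cryptid's actual construction; the parameters and encoding are assumptions for the example only.

```python
from math import prod

# Toy RSA accumulator: a constant-size value representing a whole set.
# Real deployments use ~2048-bit trapdoor-free moduli (or pairing-based
# accumulators); these demo numbers are tiny and insecure on purpose.
N = 10007 * 10009   # toy RSA modulus
g = 3               # base element

def accumulate(elements):
    """Constant-size accumulator for a set of prime-encoded credentials."""
    return pow(g, prod(elements), N)

def witness(elements, x):
    """Membership witness for x: the accumulator of all *other* elements."""
    return pow(g, prod(e for e in elements if e != x), N)

def verify(acc, wit, x):
    """x is in the set iff wit^x equals the published accumulator."""
    return pow(wit, x, N) == acc

creds = [101, 103, 107]          # credentials encoded as small primes
acc = accumulate(creds)          # the KYC vendor publishes this
w = witness(creds, 103)          # Dave holds this witness for his credential
assert verify(acc, w, 103)       # Dave proves membership

# Revocation: the vendor simply re-accumulates without Dave's credential.
# No cooperation from Dave is needed, and his old witness stops working.
acc2 = accumulate([101, 107])
assert not verify(acc2, w, 103)
```

Note the shape of the idea: the accumulator stays the same size no matter how many credentials are in the set, and revocation is just publishing a new accumulator value.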
So if I have a credential, that's a piece of data, and I can prove that the data is valid because it's included in this set. Actually, I don't do it — the KYC vendor does — so that if they ever find out I lied, or my status changed, and they want to revoke my KYC credential, they just remove it from the set. Anyway, that problem had not been solved until just recently. And let me give you a sense of the scale I'm talking about. The current best in the market today for revocable credentials — for an identity platform, whether decentralized or centralized — is what you see from the SSI community. They use a revocation method that scales with the number of credentials, which means that if you wanted to give, say, a digital driver's license to every person on the planet — 8 billion or whatever — the revocation data about which ones are valid and which aren't would be around 78 terabytes. That's 78,000 gigabytes. That's not something you can easily ship around, and it would be difficult to make sure everybody had access to it so they could verify whether a digital driver's license was valid or not. With our approach — this new mathematical approach using cryptographic accumulators — you can put an unlimited number of members in a single set, so in theory it could be 48 bytes. But in reality, let's assume there are something like a million issuers — a million institutions in the world handing out digital driver's licenses, and you got yours from one of them. Even with a million issuers and a million set proofs — let me try to do the math in my head — say each one of them is issuing a billion credentials:
you're at about 3.5 gigabytes of data. So even with something thousands of times larger in terms of the number of credentials, the revocation data is only on the order of single gigabytes, and that's something that could easily be shipped around to a mobile phone or a point-of-sale machine or anything like that. That has been the roadblock for why digital identity, I think, hasn't really taken off: you have to have revocation for authentic data, for digital identities, and there just wasn't a good decentralized solution until now. This is a computer science innovation, a mathematical innovation — and it's actually old crypto. It's been around for 15 years; it just has never been used this way. So, like I said, the SSI community's revocation data grows with the number of credentials. Ours grows with the number of issuers. If you only have a few issuers, it's only on the order of a few hundred kilobytes to megabytes for the revocation set data, because per issuer it's a constant-size — big-O-of-one — data structure. I hope that makes sense.

[00:14:13] Katherine Druckman: I think so. And I may have missed this — apologies if so — but if I want to revoke my own data as a user — not the intermediary, but if I say I no longer want to participate in the system, I don't care if I'm authenticated, I want to take back all of my data and I want you to forget about me — is that possible?

[00:14:41] Dave Huseby: That would require — are you talking specifically in the context of Twitter?

[00:14:45] Katherine Druckman: Well, Twitter or anything else, but sure, in the context of Twitter: I don't want anybody to have my identity anymore. Delete my account, delete my third-party identity verification. No one can find me, all of my content is gone, and I've —

[00:15:00] Dave Huseby: Disappeared. Yeah.
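Returning for a moment to the revocation-scaling comparison above, the two models can be put in a few lines of arithmetic. The per-credential overhead figure below is an assumption chosen to reproduce the ~78 TB number quoted in the conversation, not a measured value from any particular SSI implementation; and with exactly 48 bytes per issuer and no overhead, the per-issuer total comes out even smaller than the single-digit gigabytes quoted from memory — either way, it is orders of magnitude below the per-credential model.

```python
# Back-of-envelope comparison of the two revocation-scaling models.
SET_PROOF_BYTES = 48            # fixed-size set proof, per the conversation
CREDENTIALS = 8_000_000_000     # a digital license for everyone on Earth
ISSUERS = 1_000_000             # e.g. every licensing institution worldwide

# Model 1: revocation data grows with the number of credentials.
# ~9.75 KB per credential is an assumed figure that yields the quoted total.
per_credential_total = CREDENTIALS * 9_750
print(per_credential_total / 1e12, "TB")   # 78.0 TB

# Model 2: revocation data grows with the number of issuers only; each
# issuer publishes one fixed-size set proof no matter how many
# credentials are in its set.
per_issuer_total = ISSUERS * SET_PROOF_BYTES
print(per_issuer_total / 1e6, "MB")        # 48.0 MB
```

The point survives any reasonable choice of constants: one model scales with billions of credentials, the other with a much smaller number of issuers.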
So there's a problem where, once you've shared decrypted data, you have to assume there's a copy of it somewhere, and you can't always force people to delete it. There's no cryptographic solution that can forcibly revoke someone's access to data, because once you've disclosed data in the clear, you've given it away — you lose control over it. So that would have to be governed by policy or regulation. That said, this is why everything we build is based on the idea that sharing is obsolete. When you publish things online you do have to share, but when it comes to your data — your personal information, your identity — we now have techniques where we don't ever have to share anything, ever. The people who do self-sovereign identity have this thing called selective disclosure, which doesn't preserve your privacy; it just means I'm going to give my data to you in pieces instead of all at once. What we're talking about is using authentic data — data that has provable cryptographic provenance: it came from here, it's tied to these key pairs, those key pairs have been rotated, the data has been updated — all of that provable provenance as the foundation for moving to things like verifiable computation and, essentially, programmable zero-knowledge proofs. And what that allows us to do is reverse the arrow in digital transactions. Right now, what everybody does is send their data to code. If I go buy something from Amazon, I give them my name, my credit card information, all that stuff, and it all goes from me to Amazon, to the software on their server.
And what we're doing instead is this: I now hold my personal data as authentic data — provable, verified by an institution with some societal trust, like a DMV or a government, or a private company that works on my behalf, that has a fiduciary relationship with me. Once you have authentic data — and it's not coming from me, it's coming from those institutions — then I hold it, and instead of me giving it to somebody, they send me a verifiable computation, or we do some kind of proof. For instance, with Twitter, I would prove to Twitter that yes, I have a valid KYC credential, and yes, it came from one of the KYC vendors Twitter accepts. So let's say Twitter said Bank of America is an acceptable KYC organization. When I wanted to create my account, I would just say, yes, I've been KYC'd by Bank of America — you don't need to know anything else about me, but I can prove that cryptographically. If I wanted to buy something from Amazon, it would be: yes, I've been KYC'd by Bank of America; by the way, I am a Visa customer, and here's a proof — a capability to collect payment from Visa — that they can verify themselves, because it itself is authentic data. And then the fact that Amazon is asking Visa for money proves that I authorized the money transfer. Same thing for shipping: I can prove to you that I'm a customer of FedEx, and here's the capability to get a shipping label for the thing I bought. Now I've just bought something from Amazon, and Amazon knows nothing about me except that Bank of America knows who I am, Visa handles my payments, and FedEx handles my shipments — but they don't know any of my payment information.
They don't know where I live and they don't know who I am. But because of the cryptographic paper trails and the relationships I have with these companies, should there ever need to be a law enforcement action or a regulatory action, and a judge says, yes, here's a warrant, go figure out who that was in that transaction — that is also possible. We call that perfect Fourth Amendment privacy, because we live in a society, and that is supposedly the balance we have struck: I'm private by default as a citizen, but if there's ever reasonable suspicion, the judicial system can authorize law enforcement to go and violate that privacy. That's what the whole system is about; that's what we all live under today. We all want to have nice things, and that's the compromise. So what we've constructed is the ability to do that, and it all comes back to authentic data — that's really our solution for authenticity. If you think about what's really going on right now with NFTs and Web3 and everything, it's a scramble for authenticity. How do I know this data came from where it says it came from? Has it been modified? Or if it has been modified, how was it modified? It's a search for authenticity. And I take issue with Web3 and with all the NFT stuff, because they seem to think that you need a cryptocurrency and we all need to pay fees for authenticity. I would just point out that the certificate authority system has been running for 25 years, providing authentic data in the form of X.509 certificates that are independently verifiable, without a cryptocurrency. That proves you don't need one, and what we have built doesn't use a cryptocurrency.
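The "prove it without handing over the data" flow described above is the territory of zero-knowledge proofs. As a minimal, self-contained sketch of the idea — not Cryptid's actual protocol, and with insecure demo-sized parameters — here is a toy Schnorr proof: the user proves knowledge of the secret behind a public value (standing in for "I hold the credential this KYC vendor attested to") without revealing the secret itself.

```python
import hashlib
import secrets

# Demo-sized Schnorr group: p = 2q + 1, g generates the order-q subgroup.
# Real deployments use ~256-bit elliptic-curve groups; these numbers are
# tiny and insecure, chosen only to make the algebra visible.
p, q, g = 2039, 1019, 4

x = secrets.randbelow(q - 1) + 1   # user's secret; it never leaves the user
y = pow(g, x, p)                   # public value the KYC vendor attests to

# Prover (the user): commit, derive the challenge non-interactively
# (Fiat-Shamir heuristic), then respond.
r = secrets.randbelow(q - 1) + 1
t = pow(g, r, p)
c = int.from_bytes(hashlib.sha256(f"{g}{y}{t}".encode()).digest(), "big") % q
s = (r + c * x) % q

# Verifier (e.g. Twitter): checks g^s == t * y^c (mod p).
# The check passes only for someone who knows x, yet reveals nothing
# about x itself.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```

This is the reversed arrow in miniature: the verifier receives a short proof and a yes/no answer instead of the underlying personal data.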
And the scalability of our stuff — the cryptography is so compact, the proofs are so compact; we're talking 48 bytes for set proofs and a couple of hundred bytes for some of these zero-knowledge proofs — means we can reach scales that have never been seen before. We can make any and all data on the internet authentic, and that has huge ripple effects through literally every industry. I was just reading a thing from Vitalik today where he was saying we're going to have these soulbound tokens, and he brought up that universities could issue diplomas as NFTs. You don't need an NFT for that. All that needs to happen is: my university has been KYB'd — know your business — so its identity has been verified; I've been KYC'd, so my identity is verified; and when they issue the diploma to me, they sign it with their keys to my key. And I can protect my privacy, because I can prove that of all the events in my personal provenance, one of them is me receiving my diploma from this accredited university. You don't need cryptocurrencies for this — you just need independent verifiability. That's really all it is, and that's what the provenance logs provide. Oh, and there's one other point: nobody has realized that you need history. If everything now is going to be digitally signed or tied to a key pair, then there has to be a history of keys, because we rotate keys all the time. This was a point I brought up last time: I would have been a contributor to open source for a long time anonymously, but I can't prove it, because I don't have those keys anymore. I don't have a provenance log that ties my identity to the whole history of keys I've had. So if you have old data that was signed by old keys, you have to have something in the provenance log
so you can say, yes, that key that created that signature ten years ago was a valid key associated with that person or that organization. Anyway, that was a little longer than I wanted, but that's the gist of what we're talking about.

[00:23:25] Doc Searls: We've only got 20 minutes left — not that I want to speak for 20 minutes, but I want to work in some thoughts. One is, with respect to Twitter: Twitter is an example only of itself. I'm not a betting person, but were I to bet, I would bet that Elon Musk is not going to acquire Twitter, so it's almost a moot point. But I do think your idea for how authentic data is useful to them stands. Already on Twitter you can change your name to something else, and a lot of people do that — lots of people saying "I'm for Ukraine," and you click on it and see, oh, it's actually so-and-so. So there are a lot of examples of pseudonymity there. They're certainly not pseudonymous to Twitter itself, but they are pseudonymous to the reader. Personally, I'm not fond of that, because I'd rather know who the hell that is. I follow people; I don't want two layers of abstraction away from whoever that is. But again, that's Twitter. And when people talk about Twitter being this or that — everybody's Twitter looks different; they're all following different bunches of people. What the hell — it's completely not a very good example of where I think you want to go. I would think authentic data would be massively interesting. Just those two words together: authentic data. Oh, cool. Okay — without going into the particulars of how you have that,
I think that's strong shit, frankly. The thing I found myself thinking is: what is the new thing that takes off, that the world falls in behind, when you're talking about the kind of scale that's possible with this — versus when you've got the 78 terabytes of whatever that a server would have to churn through? You know, if you ever watch a time lapse of how cells divide — the cell of a zygote divides, you have two and then four and then eight, and then all of a sudden the thing is a sphere, a blastula, and it goes through more stages and morphs into another shape that moves toward viability. My own feeling is that the history of the internet is really only about 27 years old — since the age of ISPs and the first dial-up and people's first experience with browsers. That's it: starting on April 30th of '95, which is when the last of the backbones that had a restricted policy on traffic stood down. That was the NSFNET. After that, all heaven broke loose. It didn't happen quickly, but that's where I would date the internet we know from. And the one where all of us have phones is really only about 10 or 12 years old. So I think there are steps beyond where we are now that don't start with "I'm going to make a startup and I'm going to have a unicorn." I hope you do have one someday, but I think there's something kind of old-school about that.
I think what we need is something like ice-nine or the Andromeda strain — something that changes the world by changing the temperature at which something melts or congeals, something that accelerates a state change, because it's a better way to do it. We've seen this with code: something useful comes along and suddenly everybody's using it. I remember when Ruby took off, and Rust, and all the others — O'Reilly had a really interesting series of histories about the uses of different kinds of code that, even though they had been lying around for a bit, suddenly got used a lot, and the world changed. So I guess my question is: do you have a fantasy about this — a particular application, something that billions of people are going to be doing that they aren't doing now, on their devices, whether or not it's on the front page of their phone because they've chosen to put that little rectangle or square there — something that's front-burner for them even if they take it for granted? What would that be?

[00:28:43] Dave Huseby: Well, let me see. That's really great. First of all, I don't have a fantasy of having a startup. I'm an open source guy. I look at this as ice cream — I'm the guy walking around just handing ice cream out to everybody, like it's field day in elementary school: everybody gets an ice cream, come here and get your ice cream, because it's open source. We're trying to give away this code.

[00:29:09] Doc Searls: That's Neal Stephenson's metaphor in In the Beginning... Was the Command Line, which he wrote in the late nineties. It's like, wait a minute —
At that time, Microsoft made bad station wagons that fell apart, and Apple made something that was completely weird and closed — and here are these guys giving away tanks. Here, have a tank, it's free, walk off with it.

[00:29:32] Dave Huseby: Right, yeah — that's exactly what we're doing here. So let me see if I can divide this up into a few categories. If you're talking about normal everyday people who have a phone: first of all, the math we're talking about is simple enough that it easily runs on phones. It runs on the most embedded chips you can imagine, because everything — the proofs and everything — is so small. There's an argument to be made that the most profound innovations are the ones that reach the farthest back in computing history while the old computers could still use them. We were joking the other day that our set proofs and our revocation could be programmed for a PDP-11, and since we can anchor in Bitcoin in 80 bytes — that's an effective anchor, and it's also an 80-column card — it could be punched cards on a PDP-11. That's how we know this is a really, really big deal; that's why we're confident in it. Okay, so everyday end users: here's what's going to happen. You probably won't know you're actually using this. It's going to be like a matrix change behind the scenes, because we're really trying to be an "Intel Inside" kind of thing — or like the UL logo on the back of every piece of electronics. Everything you create — every photo you take, every video, every audio recording, every text message you send — can be captured in a provenance log
associated with a key pair that's either yours or belongs to the piece of data itself, and it can be anchored. Because these sets do not scale with the number of proofs in them — they're fixed size — it's possible to basically amortize the cost of a single Bitcoin transaction to zero. And we're not married to any blockchain; I'm just using Bitcoin as an example. That means every photo everybody ever takes, every video, every text message you send, can have a provenance log associated with it, and that provenance log can be in a set proof in a single Bitcoin transaction. That alone makes whatever data you created authentic data, meaning that if you handed that photo and the provenance log to somebody else, they could independently verify the data in it: any metadata from the camera, the photo itself, any identity data you've associated with it, any rights — is it public domain, is it all rights reserved, is it under a Creative Commons license — that's also in the provenance log. And we can do that at the point of capture. So for the everyday person who has just a cell phone, what we're going to allow them to do is assert their rights over every piece of intellectual property they create — and creating authentic data is effectively free. That's the "everybody gets an ice cream," everybody gets a tank. Imagine a system that everybody can use without really thinking about it, that captures every piece of an individual's intellectual property and allows them to assert their rights in a way that's independently verifiable by anybody. It also gives me an easy way to commercially license it, or turn it over to the public domain, in a provable way, where everybody who looks at it can go: yes.
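The "provenance log at the point of capture" idea can be sketched as a simple hash-linked log: each entry commits to the one before it, so key rotations and data events (like capturing a photo) stay tied to one identity and can be replayed by anyone. The entry format and field names here are illustrative assumptions, not Cryptid's actual log format.

```python
import hashlib
import json

def entry(prev_hash, event, payload):
    """One log entry; its hash commits to everything before it."""
    body = json.dumps({"prev": prev_hash, "event": event,
                       "payload": payload}, sort_keys=True)
    return {"body": body, "hash": hashlib.sha256(body.encode()).hexdigest()}

photo = b"\x89PNG...raw image bytes..."     # stand-in for camera output

log = []
e = entry("genesis", "inception", {"key": "key-2012"})   # first key
log.append(e)
e = entry(e["hash"], "rotation", {"key": "key-2022"})    # key rotated
log.append(e)
e = entry(e["hash"], "capture", {                        # photo captured
    "photo_sha256": hashlib.sha256(photo).hexdigest(),
    "gps": "47.6,-122.3", "license": "CC-BY-4.0"})
log.append(e)

# Anyone holding the log (and the photo) can replay the chain, re-hashing
# each entry to verify nothing was altered, removed, or reordered — which
# is what ties an old key's signatures to the same identity later on.
prev = "genesis"
for e in log:
    assert json.loads(e["body"])["prev"] == prev
    assert hashlib.sha256(e["body"].encode()).hexdigest() == e["hash"]
    prev = e["hash"]
```

In the scheme described in the conversation, a digest of the latest entry would additionally be anchored via a set proof (e.g. in a Bitcoin transaction), fixing the whole history in time.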
And the cryptography involved ensures the security — we're talking about the strongest open source cryptography you can imagine. That, I think, is the thing everybody will be doing soon, and it has far-reaching effects in the world. For instance: because this is free and so ubiquitous, we want to change black into white and white into black. What I mean is, when I look at a news website, I want to question why there's any data on it that doesn't have provable provenance. Every photo should have a provenance log, so that as a reader of that news site I can click through and see that the person who took the photo was this photojournalist, using this camera, in this location, and that it hasn't been modified — or if it has, here's a transparent record of how it was modified; we cropped it, or whatever — so the newsroom can provide radical transparency. And if you're a legitimate journalist, you're going to want to do this. The pseudonymity part is also important here, because then you could be a whistleblower and protect your identity in a way that still provides the authenticity piece but doesn't reveal that you were the person who took the picture and smuggled it out of a war zone, for instance. We can still tie it back to a GPS location and an actual camera — it was anchored at the point of capture — and there can be that level of transparency involved, so that the trust in that photo gets transmitted from the camera all the way to the webpage.
And so, because it's so free and it has the potential to be ubiquitous, and totally free as in beer, I foresee a future, hopefully in the near term, where crowdsourced media, you know, I have a cell phone, I took a picture of that incident that happened, where crowdsourced media has provable provenance, and everybody who takes a picture of something that's used in a newscast is properly compensated. And when I look at a news site, everything on that page, every editorial, every reporting article, every piece of text, video, audio, photo, every individual piece of data has provable provenance that I can examine if I want. That way we don't have to look at news sites and wonder, like when CNN applied a filter to Joe Rogan's video to make him look sicker, you know, that kind of nonsense would be immediately visible, right? That's a form of a deepfake, actually, in my mind, creating a false impression by selectively editing or carefully editing or taking out of context. This kind of provenance stuff could possibly put a huge damper on that. And I would expect people who consume news to start wondering, well, why doesn't that have provenance? Or what is the provenance of that photo? I want to know where that came from. I want to know who wrote that, that kind of stuff. Um, the other piece of this, the thing that I'm really excited about. Oh, were you going to say something, Doc? [00:37:19] Doc Searls: Well, I did. As a journalist, what I'm thinking about are, uh, stats, or facts. I mean, those things are in short supply lately. Well, actually they're in abundant supply, but the supply of things that are not facts is more abundant right now. And the motivation to say false things about a lot of stuff is there too. So, yeah.
I mean, I'm thinking, okay, say you're a research organization and you want to produce something where, you know, and here's another interesting thing to me: it's not only that I want to credit who I get it from, right? Like, let's say party A has facts about an event, party B sources those facts, and I learned it from party B. I want to say: here's the authentic data that I have, authentic data that came from party A through B, that kind of thing. [00:38:24] Dave Huseby: And it has provable provenance, because you have this, right. [00:38:28] Doc Searls: And we get a supply chain as well, which I think is actually what Katherine wanted to ask about. [00:38:34] Katherine Druckman: I think the supply chain conversation may have to wait for another episode, because that one's going to get long and complicated. But, you know, here's the thing about every time Dave joins us: I have to go back and listen again. And while I'm in it, I'm just listening, honestly, and I have to listen to it again, and then I think up questions. But, um, yeah. Anyway, sorry, go ahead. [00:39:00] Dave Huseby: Uh, well, I was going to say, Doc, another thing related to the supply chain stuff, provenance, this approach that we're trying to give away and show the world how to do, would allow for the very first time ever complete end-to-end traceability for every contribution to an open source project, all the way through a build. If you have a reproducible-builds CI/CD pipeline, it would all link to that. So you could account for every line of code going into a piece of software, and then every step of the build process, in such a way that, as Kyle brought up, other people could reproduce it and they could vouch for it, which would be really great for a company doing firmware and things like that.
But it's also going to be really great for companies like Freedom of the Press Foundation, doing SecureDrop for whistleblowers, or frankly any piece of software, Signal, anything that demands that we trust it in some way, I would think would want to adopt something like this. But there is one other aspect of this that I wanted to touch on last time that we didn't even get close to, which is: because of this sort of universal approach to authenticity, we now have the ability to move towards a new API security system that we've also open sourced. It's called Oberon, and it has similar privacy guarantees and things like that. Um, and there's something we're in the process of getting off the ground, I've just kicked this around in internal meetings, that I call the great open-source inclusion project. One of the problems in open-source projects is that the only things we typically have a record of, contribution-wise, are edits on a wiki and code commits in a repo. Using Oberon, and Oberon uses provenance logs, I'll get to that in a second, we have the ability to tie in all of the interactions with APIs in an open source community. So any posts on a message board, any edits on a wiki, anything that helps organize an event, sending out emails, all that kind of stuff. If it's done through a platform that uses this, we can capture that in the repo itself, because the provenance logs will be in the repo, associated with the people doing these things, and their contributions to the community can also be put into the repo. Um, and so, I do want to touch on Oberon, because it's totally open source. And this is another thing that I think is going to be fairly ubiquitous, which is, it enables for the first time real zero trust architecture. I don't think a lot of people understand that zero trust architecture means we're getting rid of the firewall.
There's no such thing as inside the firewall and outside the firewall anymore, because you literally don't trust anything. So yeah, this Oberon thing I'm really super excited about, because of the characteristics of Oberon. This is a replacement for API tokens, a replacement for OpenID, all that kind of stuff. I don't want to get too deep into the cryptography, but the ergonomics of the system are a lot better for security. Issuance of Oberon tokens requires some digital signing on the part of the issuer, and it can be done using a decentralized signature, a distributed signature system. So you can do, like, an m-of-n or threshold signing, is what that's called, where to issue a credential you would have multiple machines that would have to participate. So that increases the level of compromise that an attacker would have to achieve to be able to falsely issue Oberon credentials. So there's that. Then the protocol for issuing Oberon tokens is such that the issuer never sees the token itself: the client generates it, and then there's a protocol for the server to sign it without ever seeing it. And this is important, because in a world where everything is moving to capabilities and we're doing cryptographic signing for everything, for trust and integrity, you don't want servers to be able to impersonate clients. Because if that server ever gets compromised, then whoever compromised the server can then impersonate one of the clients. Okay. So there's that, right? This also means that the servers themselves are actually incapable of impersonating a client, so you don't create high value targets like we normally do. You know, like, say you want to steal everybody's credit card numbers, like what happened with Target. They had a whole bunch of them. You only had to compromise one database. Yeah. And then you got all the credit card numbers.
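The sign-without-seeing property Dave describes is the classic blind-signature idea. Oberon itself is built on Pointcheval-Sanders signatures; as a much simpler stand-in, here is the textbook RSA blind signature, with a deliberately tiny, insecure key and a fixed blinding factor purely for illustration.

```python
import hashlib
from math import gcd

# Toy RSA key (far too small for real use; illustrative only).
p, q = 1000003, 1000033
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))

def H(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# 1. Client generates its token and blinds it with a factor r the issuer never learns.
token = b"client-generated capability token"
m = H(token)
r = 12345  # in practice: cryptographically random, with gcd(r, n) == 1
assert gcd(r, n) == 1
blinded = (m * pow(r, e, n)) % n

# 2. Issuer signs the blinded value; it sees neither the token nor m.
blind_sig = pow(blinded, d, n)

# 3. Client unblinds, obtaining an ordinary signature on its token.
sig = (blind_sig * pow(r, -1, n)) % n

# 4. Any endpoint verifies with only the issuer's public key (e, n).
assert pow(sig, e, n) == H(token)
```

The issuer only ever handled `blinded`, yet the unblinded `sig` verifies against the token, which is why a compromised issuer log cannot be used to link or impersonate clients in this style of protocol.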
Well, think of it this way: Shopify and Target would know about us, know that we exist, but they would not have our credit card numbers. And there would be a cryptographic relationship with them such that I could prove that, yes, I'm a Target customer, and yes, I have valid payment credentials and you can let me pay, but you don't need to actually have my payment information. Right. So Oberon works this way. When I get an Oberon token from a server, from then on I just prove to the server, or to the endpoints that accept it, that I have a valid Oberon token. Um, and the cool thing here is that endpoints, meaning the services I'm talking to that consume my credential, only ever need the public key of the issuer present to verify this. And so that means that service endpoints themselves can be hardened against takeover. If they get compromised, they don't divulge anything about the clients, and they can't be tricked into issuing Oberon credentials, and they can't deny an Oberon credential, if that makes sense. You would have to take the server out entirely to deny access to the service. So the service endpoints themselves do not possess any data that, if they were compromised, would compromise the security of the clients. Right? So these are all improvements over existing API security that don't really exist in the market, because they're all cryptography based. Um, that's the stuff that we're about to launch next, that we're about to open source. Now, Oberon is totally open source, and it's been used in industry already. There are companies that are using it. It's been out in the world for about a year now, and we've had the first couple of companies adopt it in the last few months. Okay.
Um, the next, say, [00:45:58] Katherine Druckman: Can you say anything about the ways in which it's being used? [00:46:01] Dave Huseby: Exactly as I described: API endpoint security, exactly that. Yeah. And one of the cool things is, it's actually a drop-in replacement for OpenID, and it collapses the OpenID stuff from a three-party model to a two-party model. Because you know what OpenID is, right? Like, log in with Twitter, log in with Google. Yeah. So you don't have this third party vouching for your identity anymore. You just go and get your credential, and then you can talk to the API endpoint. But if you still have infrastructure that uses OpenID, well, then the issuer is that third party, and then you never have to talk to them again. So there's a one-time sort of log in with Google, but what it does is it gives you an Oberon token, and then that token and the proofs from the token can actually be the payload in the OpenID protocol. So there is an obvious migration from the existing federated identity protocols to just moving to a point-to-point Oberon architecture. Um, so the next thing that we're launching uses a technique called redactable signatures, which is a thing that comes out of Pointcheval-Sanders signatures, which is what Oberon uses. And this one might get beyond some people, but let me take a crack at it. So, when you go to talk to a server and you want to prove that you have a capability, like you have been granted the right to use that service, okay, the best that we have right now in the market is private set intersection, which means that of the capabilities the service provides, let's say reading from a database, writing to a database, writing a log message or something, those three services, I can send a proof to the server
of which of those, some, one or more of those, is in my set of capabilities, the things that I've been allowed to do. Okay. And the unfortunate thing is the server gets to learn all of the capabilities I've been allowed that it provides. Okay. So there's a potential there that, if that service is logging that I presented a proof that says, yeah, I can both read and write from the database, then an attacker on that server could see that I have the ability to write to the database, right? And so that might make me a target now, because I might be one of the few people that can actually write [00:48:34] Katherine Druckman: A desirable [00:48:35] Dave Huseby: privilege, exactly, right. I have a desirable privilege that you might want to target me for. Well, with redactable signatures, what I can do now is, I no longer disclose to the server all of the capabilities that I have that it provides. Right? What I'm doing is I'm saying, I want to read from this, and I can prove to the service that, of my capabilities, reading from that database on that server is one of them, and I've redacted everything else. And so this greatly increases the security of this interaction and reduces the attack surface, because the server, even if it's logging, is not logging that I have desirable capabilities, as you said. That's actually a really great way to put it. I'm going to use that now. Okay. Um, so anyway, [00:49:29] Katherine Druckman: Privileges on [00:49:30] Dave Huseby: our system, basically, yeah. And the cool thing here about all of this is that this is how we're really going to be decentralized in the future. And just to remind everybody, I've talked about this in the past: decentralization is about power. It's about me maintaining the power to decide my level of interaction with the server, or the amount of my data that's on the server, or whether my data is even shared with the server.
It's a balance of power, right? Decentralization is about pushing power out to the edges. Okay. And I've defined decentralization as the direction in which user sovereignty, the end user's sovereignty, their power, increases. Okay. People get this confused all the time on the internet. I don't think we agree on the definition; that's just my definition. Now, the other word is distributed, which is actually a spreading out in terms of servers, hardware. So I say distributed is hardware, decentralization is power. Okay. Decentralization is about empowering individuals. Because here's the example. Facebook is distributed. They have many tens of thousands of computers all over the world to make Facebook happen. But would you argue that Facebook is decentralized? Exactly right. It's a totally centralized service running on a distributed network of computers. Okay. But something like email, email actually is the most decentralized tool on the internet. Because today, say I'm signed up with Google, with Gmail. Okay, and it's like, oh, I don't want to use Gmail anymore. I want to use, say, Fastmail or Yahoo Mail or somebody else. Right? Well, I download an open source tool, like an IMAP client. I connect to Google using the IMAP protocol, an open protocol, with open source software. I download all my emails and store them in an open file format, an mbox file or, uh, what's the other one? The Maildir format. Yeah. Right. Those are all open standards. Okay. And then I can go set up an account with, say, Yahoo or Fastmail, and again, using the IMAP protocol, I can then upload all of my emails from my mbox file or my Maildir folder up into that new email service. Okay. So I am totally portable. My data is portable. My association with the service provider is at my mercy. It's completely at my mercy. Okay.
And you know what? I don't even have to use a service at all. I could run an open source email server, like Exim or something like that, on my own server, upload my mail to that, and take total control over my data and my relationship to the email system. Okay. Email is fully decentralized. The only reliance we have on some central authority to make email work is the domain name system, I would argue. DNS, you know, people doing public key distribution through DNS, and domain name resolution, because I am, you know, dave at a domain, and when you want to send an email, our servers have to resolve the domain. Yeah. So that's the only centralized part of email. And it really is just to do what I call discovery, which is one of the fundamental problems with decentralization, and also coherence, which is another of the fundamental problems with decentralization. There's nine problems. We've talked about this in the past. There's nine problems that every distributed system has to tackle one way or another. And to become a fully decentralized system, you have to have as many as possible, or all, decentralized solutions to each of those nine problems. And email, to this point, has the most. You know, there's only two that aren't fully decentralized, and even then they're pretty decentralized. I mean, DNS is pretty much under each of our control, if we want to set up an email server. Anyway, the point is, we're moving in this direction where the world is going to be fully decentralized. Everybody is moving that way. I just want to define what that is so that everybody can start speaking the same terms. Um, and Oberon security and these new redactable signatures for authorization, I think, are going to be a key thing that gets widely used, because first of all, it's open source. It's easy to deploy.
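The email-portability argument above rests on open on-disk formats like mbox and Maildir: any client can write them, any other client can read them back. A minimal round-trip sketch with Python's standard library, with made-up addresses:

```python
import mailbox
import os
import tempfile
from email.message import EmailMessage

# Compose a message as it might arrive from any provider.
msg = EmailMessage()
msg["From"] = "dave@example.com"   # illustrative addresses
msg["To"] = "doc@example.com"
msg["Subject"] = "Portability test"
msg.set_content("Open formats mean the mailbox outlives the provider.")

# Export: store it in an open on-disk format (mbox) ...
path = os.path.join(tempfile.mkdtemp(), "export.mbox")
box = mailbox.mbox(path)
box.add(msg)
box.flush()
box.close()

# Import: any other client or provider can read the same file back.
restored = list(mailbox.mbox(path))
subject = restored[0]["Subject"]
```

In a real migration the download and upload legs would go over IMAP rather than to a temp directory, but the point stands: the file format, not the provider, holds the data.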
There's an obvious migration path from your existing infrastructure, and it allows existing enterprises to move to a zero trust architecture, where you have defense in depth by default, and privacy by default, and auditability by default. And the piece that Cryptid, my company, offers is the enterprise solutions to automate all of this stuff, but we are giving away the open source. So it's very much like GitLab, right? GitLab has a community edition, and then there's enterprise tooling built on top of it that does so much more. We're taking the exact same kind of approach to the market. [00:55:13] Katherine Druckman: I wonder if we could kill some time talking about something silly, like the Seth Green NFT thing, while I have you here. [00:55:21] Dave Huseby: Authentic data is NFTs without a cryptocurrency, right? So, I mean, with our technology, even if you just anchored in Bitcoin, we can amortize Bitcoin transaction costs to zero, essentially, and you can have an infinite number of NFTs created and transferring hands and stuff like that per Bitcoin transaction. So essentially the NFTs themselves, that market, the way they work now, I think will fundamentally change once our tools are widely used. That's one side of this. The other side of it is, because everybody's asserting their ownership rights, their intellectual property rights, over everything they create, and everything has provable provenance, one of the things that makes this possible is we can make legitimate copies. We can fork a provenance log any number of times. So let's say I take a photo. I can make a hundred legitimate copies by forking it into a hundred provenance logs, attaching one to each copy, a hundred of them.
They're all the same photo, but I have a hundred provenance logs that I have blessed, myself, as the original intellectual property owner. And then I can put in the terms and stuff that I'm licensing or giving to the person who buys a legit copy. So one of the neat things about this, and the fact that the software is embeddable, like it runs even on embedded systems, is that we can now unlock and automate third-party licensing of intellectual property. It makes everybody into their own media company. So this affects not only big intellectual property rights holders like Disney and Marvel and DC, but it also affects even just normal people. Right. What I want to get back to, which I think is just really cool: there was a lady on Second Life, we have a minute left, there was a lady in [00:57:18] Katherine Druckman: and we talked about this. We probably did. I'm getting senile, apologies. Go ahead. [00:57:22] Dave Huseby: No, there was a lady on Second Life who made millions of real dollars selling digital hair in Second Life. And wow, with the authentic data economy, this infrastructure, this approach that we're showing everybody how to use, the talented among us can get right back there without a Second Life. You know, I could make avatars that can then be used in any video game or any virtual reality thing. [00:57:55] Katherine Druckman: That sort of economy was in Second Life, yeah, vaguely. [00:57:59] Dave Huseby: This opens it up so that it's no longer platform specific, because now we have this open format, open protocol, open cryptography way of verifying the legitimacy of any piece of data. And so now platforms like Roblox, or even video games like Fortnite, can possess this ability to verify the provenance and the proper licensing of any data being uploaded.
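The copy-forking mechanics Dave describes, one capture event, many legitimate copies that all trace back to the original owner, can be sketched as child log entries that cite the root entry's hash. The field names and license strings here are invented for illustration, not Cryptid's actual log format.

```python
import hashlib
import json

def h(s: str) -> str:
    return hashlib.sha256(s.encode()).hexdigest()

def entry(prev: str, payload: dict) -> dict:
    """A log entry whose hash commits to its payload and its parent."""
    body = json.dumps(payload, sort_keys=True)
    return {"prev": prev, "payload": payload, "hash": h(prev + body)}

# Original capture event: the creator's root log entry for one photo.
photo_digest = h("raw-photo-bytes")
root = entry("", {"event": "capture", "photo": photo_digest, "owner": "creator-key"})

# Fork N legitimate copies: each child cites the root, so every copy
# traces back to the same original owner, each with its own license terms.
copies = [
    entry(root["hash"], {"event": "copy", "n": i,
                         "license": "CC-BY" if i == 0 else "commercial"})
    for i in range(100)
]

def traces_to(copy: dict, root: dict) -> bool:
    """Verify a copy: its hash is self-consistent and it cites the root."""
    body = json.dumps(copy["payload"], sort_keys=True)
    return copy["prev"] == root["hash"] and copy["hash"] == h(copy["prev"] + body)
```

A platform ingesting a copy only needs `traces_to`-style verification plus the creator's signature on the root to check both provenance and license terms, which is the gate Dave imagines Roblox or Fortnite applying to uploads.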
And so they can open their platforms up to user-generated content, content licensed from third parties that they never even contemplated. Like, one of the things I like to fantasize about is, I want to be able to get a copy of, say, NBA 2K23, right, from Electronic Arts. Okay. And then I want to go to, say, Warner Brothers and license a Bugs Bunny avatar. And I want to go to Nike and license, you know, digital Air Jordan sneakers. And I want to go to whoever owns the rights, maybe it's Michael Jordan, maybe it's the Bulls or the NBA or whatever, and get Michael Jordan's number 23 jersey as an outfit for my Bugs Bunny avatar. And then I want to go and play Space Jam. I want to play NBA 2K23 using an avatar from Warner Brothers, shoes from Nike, and a jersey from Michael Jordan or the NBA or whatever, and just have it work. And what we're providing is the infrastructure, the knowledge and the tooling, the open-source tools, to make this possible. And I think it's going to be hugely impactful, because now platforms have the ability, for the first time ever, to not only identify the creators but also to manage the rights associated with the content. This takes something like Creative Commons and kicks it into high gear. This creates something like, uh, [01:00:05] Katherine Druckman: The attribution is automatic. [01:00:07] Dave Huseby: Yes. Right. And so if I made, say, an avatar pack or something, and I released it as Creative Commons attribution, you know, like, you can use it and it's free and you can't charge for it, but you have to give me credit, like, that's all built in. Right? I could set that in the provenance log of each copy that I hand out. And then when they go to the platform that's going to render it and allow them to play with it, it'll verify that.
And, you know, any other player could, I don't know, get the info on that avatar and see that it came from me. Like, oh yeah, created by Dave, and whatever my little marketing message is, you know, like, get your copy from here. Right. Um, and so that would enforce Creative Commons. Like, this is the first time ever that we can fully, for lack of a better term, democratize the intellectual property infrastructure and market. Anybody can be a creator. Any platform can be a consumer of this content. You know, CC on steroids, digitally. Yeah. And I could see this as, like, you know, I really want to build a crowdsourced media platform where every piece of media on it has verifiable authenticity. Like, it came from a specific iPhone at this GPS location, taken by Joe Q. Public, who was there, witnessed this happening, and happened to capture a photo or a video of it, and make that available for news, for content creators, whatever, and have proper compensation. We'll finally get to figure out what the cost of an eyewitness photo really is. Yeah, because the intellectual property will be there. And if they publish the photo without the provenance log that shows that it was properly licensed to them, people are going to start asking questions. I think having this as a capability will change the nature of news reporting online, because once people realize that every piece of data on there should have a provenance log, they'll start to wonder why one doesn't. Hmm. [01:02:21] Katherine Druckman: That's interesting. So what I was hoping we could do is give a little wrap-up based on all of the stuff that we talked about before. I was hoping, Doc, you would have some, like, uh, [01:02:36] Dave Huseby: some [01:02:36] Doc Searls: thoughts. Okay. I'll give some thoughts that may or may not be relevant.
I think, on the one hand, we talk about data way too much and don't understand what it is, or why it's important, and what it isn't. We call an awful lot of what we do data when it's really not; it's just useful information or useless information. But reducing everything to data is one of the big mistakes that we make. It's like reducing what we had with behaviorism back in the fifties and sixties: we are nothing but collections of behaviors, a stimulus-response chain, and that's all we are. I'm seeing the same thing with data right now. That's on the one hand. On the other hand, I think what we can do with data, and what Dave is trying to do with authentic data and provenance and all that stuff, is really interesting and very much, I think, the future, and what we can do in the future. But we're not doing it yet, and I just want to see a path to success for that. So that's one thing. Another thing is, I think that what we have between digital technology and the internet is so new and profound. I mean, every new medium obsolesces some other medium, or some set of media, and even then those media are still around. We still have magazines. A couple of days ago somebody from Condé Nast, which is in the magazine business, said, we're not a magazine company anymore, we're like a data company. I mean, what a stupid thing to say. But, you know, they recognize that people don't buy magazines anymore. They acquire information in other ways, and the publishers want to be in those businesses, whatever those are. But McLuhan said every new medium works us over completely. And what happened with the internet? It ate all the rest of them. I pay close attention to radio. Radio is getting killed right now. It [01:04:49] Dave Huseby: is dying.
[01:04:51] Doc Searls: And nobody, I mean, and I'm a radio freak, you know, but, like, our car has Sirius XM. Okay. I can also listen to Sirius XM through the car from my phone, on an app. That's a much better way to listen than down from the satellite. Now, the satellite works great in the middle of Wyoming. But if I'm listening to Howard Stern, for example, or I'm listening to a news broadcast or a sportscast, what happens is the car is interrupted by the traffic prompting of Google Maps, for example. You know, it constantly tells you, five times, that you're merging with traffic. But if I'm listening to an app, the app pauses. I can rewind the app. This is with Sirius XM's own app, right? I mean, it's so much better. And not only that, Apple and Amazon and Google, all these big companies, are busy iterating what replaces radio on a more or less constant basis. Every time our apps get updated, your podcast app, all these things get updated. Early podcast apps didn't have the speed-it-up feature. Now they have that: one and a half, now you can go one and a quarter, you know. Spotify lets you kind of do it with, like, even a dial. There's just so much that can be done with software that can't be done in the old hardware-model world. Um, but it's early, it's really, really early, and the internet and digital technology are soaking into everything, and we're just beginning to find out all the good and bad stuff that comes out of that. And just like we were talking about earlier, there are more ways than ever for the world to go to hell, and there are more ways than ever to make the world better. Both of those things are laid out on the workbench in front of us right now. And there are more ways to cooperate than ever, and there are more ways to screw with each other than ever.
And both those things are on the table. And, you know, there's a side of me that, and I'll be prejudiced in one way on this. Garrison Keillor actually wrote a thing about this. It was in a book called The Book of Guys, and he said, okay, let's look at a picnic, a big crowd of people, in any town. Big picnic, barbecuing going on, food preparation, sports outside, a cornfield behind. He said, here's what happens. All the boys go outside and they pretend to be playing with guns, and they argue about who's dead, and they play sports. They try to kill each other and have fun that way. And all the girls are inside helping prepare the food, which is where the books are, and the people having real conversations, and they're learning to form complex relationships with other people and understand how society works. And after twenty years of this, which group do you think is better equipped to run the world? Is it even close? You know? And that's sort of how I feel about it. I mean, I have a family on my wife's side with six sisters, and I watch them operate, and it's like, they're on top of shit. They really are. They're seeing things the guys are not seeing, you know, and a lot of it has to do with relationships and how the world actually works. Right. And it's not that everything gets solved, but it's a far more nuanced and complex, and I think healthy, approach. Now, it's not that the guys can't do this, but I think that the solutions that start with conflict and domination are not as good as the solutions that start with cooperation and partnership. And the open source world is all about cooperation and partnership, even if there is competition for getting code into a codebase. You know, I've talked to Linus and other developers about this. It's a great model. It's like, is it useful?
Great, can we do it? We'll fix a bug. Great, patch it and carry on. What works? This is what Andrew Morton told me: Linux is all about what works best for most people, what works best in most circumstances. That's it. We're building a codebase that's going to support everything. How do we do that? You know, the codebase gets bigger, it gets worked out. I mean, the original open source model, starting with individual hackers, on up through collectives that are meritocracies, with the most competent people with the most encompassing vision at the top, passing through patches and the rest of it, is an awesome model for doing an awful lot of other stuff. And I think that's to some degree lost, even within that community now, because so much stuff is being done in containers and other things, where the really deep base stuff that's closer to the metal gets lost or forgotten, and the supportiveness of that at the base level is kind of lost or forgotten. And I don't have a solution for that, but I sense that, um, [01:10:41] Dave Huseby: you know, in. [01:10:45] Doc Searls: I don't know. But most of the world, by the way, is not politics. It's so easy to talk about politics because it's a form of sports, but most of the world is actually commerce and people talking to each other. That's what most of the world is: people talking to each other and trying to keep up with each other. And that's harder than ever now, too. I mean, boy, I'm taking too long here, but you know what I'm dealing with right now more than anything else? This is going to sound really weird and kind of a downer, but in a way it's not. It's death. Because I'm going to be 75 years old this summer. Guess what happens when you're my age? A lot of people start croaking off. Right? And I'm, like, standing here watching people get picked off, a lot of them younger than me.
Right? What does this do? On the one hand, you get better and better at dealing with that. You grieve, you move on. What sucks up your time is not the grieving, but: hey, Doc, you've got a lot of photos of this person, right? Go find those, you know. It's going after the resources of memory about these people that are important to other people at a moment in time. So there's a group that I'm going to see in North Carolina that I haven't seen in five, seven years. Okay. And I've lost touch with a bunch of them. One of them has Alzheimer's, one of them is dying of ALS, another one is dead. Others have health problems. Some are doing great, but I've gone too long without them; I didn't know these things were going on. And so an awful lot of it isn't like getting together after five years when you're 45, right, where it's, oh, you got a new job, you got a new car, you're living over here now, you had this great vacation, this bad thing happened, your team lost, all that stuff, you know? Yeah, I broke my leg, I'm not going to be skiing anymore, that kind of stuff. Well, at my stage, it's like, wait a minute, I'm coping with dead people a lot. [01:12:52] Dave Huseby: Um, [01:12:54] Doc Searls: and that's weird when you're healthy. But it's actually kind of good in some ways also. I mean, you have to remember things about people, actually the best about them, right, and what they brought to the world. And part of dealing with that is: let's gather up as much of what this person did that was really useful. Kim Cameron, for example, died last fall. Kim Cameron, author of the laws behind what we're doing with identity today, you know. His laws are going to reverberate; he was like Moses with the laws. Right. And coming up with the war stories of what we're about, that are really interesting, you know. And I just did this interview,
the one I just did with these guys in Korea, and I mentioned Kim, because what SSI is about is his laws. It's about minimal disclosure for constrained use, user control, justifiable parties. What you're dealing with, Dave, is all in his laws. That's exactly where you live, you know. He just basically surfaced stuff that matters. They're design principles, really. Is the individual in charge of their life online? Good. Okay. Are they dealing with other people on a need-to-know basis? Good. Are the parties justifiable? Is it directed, meaning that it's going to the right party rather than out to the world? Um, and then you've added another one to it, which is provenance, you know? Well, [01:14:36] Dave Huseby: this approach allows us to go maximum Cameron. I'm very familiar with that, right. I mean, like his rule of minimal disclosure for a narrow application, a narrow focus, right? Yeah. We now can use cryptography to have zero disclosure and force single-purpose interaction. Yeah. So I don't have to disclose anything. All I can tell you is: yes, I'm a human being, I have the capability, or I've been granted the right, to use this one function on this one service on this one server, right? That's zero trust architecture at its purest, and it's maximum Cameron, a hundred percent. I'm a Cameron maximalist, put it that way. [01:15:27] Doc Searls: A Cameron maximalist. [01:15:29] Dave Huseby: Yeah, Kim Cameron is one of my heroes, one of my inspirations. [01:15:35] Doc Searls: Totally good guy. And he was a really clever operator too, in a way. People who worked with him at Microsoft said, you have no idea, you had to be political there. And he actually quit at one point. And then I ran into him at a conference and he's got a Microsoft badge, and I said, what, you're back with Microsoft? He said yeah. I said, what happened?
He said, I held them to my terms, and all my enemies are dead. It was just great, you know? And, oh man, I miss him personally, because, you know, he was a great guy to have wine with and talk music and talk about everything. But he also made stuff happen when he first got there, because he came there by acquisition, and Microsoft was busy failing at Passport at that point. And that was part of a larger initiative called Hailstorm. When you were the most hated company in the world, as they were at that time, why call what you're doing Hailstorm? [01:16:54] Dave Huseby: Lightning Strike, or [01:16:57] Doc Searls: Tornado. Poisoning. [01:16:59] Dave Huseby: Yeah. Category. [01:17:03] Doc Searls: This is Project Food Poisoning. Anyway, he told me: when the corpses of Hailstorm and Passport come floating down the river, among the knives in their backs will be mine. He was determined to make that thing die as soon as he got there, and I think that actually motivated him. He could have done what a lot of people do when they go by acquisition to a company, which is, I get a corner office, I bide my time, and I leave when the timing is right. You know, he retired when he was old and sick. And he wasn't that old, he was years younger than me, but he retired for, you know, the old-fashioned reasons. Anyway, I have held forth. Katherine, I know. [01:17:58] Katherine Druckman: Yeah, it's going to be an editing challenge, but it works, and it's great because, you know, I love these episodes where I just get to sit back and enjoy and listen and laugh. And then I'll reflect on it as I'm editing; that's when I think of the good questions. [01:18:16] Doc Searls: Okay, guys and girls, until next [01:18:21] Dave Huseby: time.
[01:18:24] Katherine Druckman: Thanks. Thanks for hanging out.