Katherine Druckman (10s): Hey everyone, welcome back to Reality 2.0. I am Katherine Druckman, and joining me as usual is Doc Searls. And today we have a returning guest, Dave Huseby, who is a longtime privacy advocate and friend of the Reality 2.0 podcast. Thank you, Dave, for joining us today. Thank you. And we have a lot to talk about — Dave is always up to something interesting. I would highly recommend checking out a couple of previous episodes we recorded with him. This is, I suppose, a continuation of that to an extent, but we have a lot to talk about. Katherine Druckman (50s): I'll let you take it from there, Doc, but maybe we should get you back into that question of where all the old-school privacy advocacy has gone. Doc Searls (57s): So, a little more background: Dave, we were just talking about how I'm of the generation — there was a collection of us who were hardcore privacy advocates way back in the last millennium, and probably the most notable among those is John Perry Barlow, who wrote "A Declaration of the Independence of Cyberspace." Dave was around then too — a lot younger, but very involved. So, Dave, I want to get into what you're writing about here, because you came up with a new initialism: the authentic data economy. The idea behind it is authentic data, and also coming out the back end of this with full decentralization — full distribution. Doc Searls (1m 55s): So give us the pitch. It's a long piece, and it's got a lot of wind-up before it gets to the pitch, which we can probably skip because we've got a fairly geeky audience. But tell us what the authentic data economy is. I mean, everybody hates the one where all our data is being sold and handed around without any control on our part. 
But what makes what you're talking about different from other "we're using crypto, so we're cool" approaches? Well, we're using crypto and we're cool — we'll start there. Yeah. So, to tie into the previous episodes: when I was on here, I was talking about full decentralization and user sovereignty as a means of getting to a world where we truly own our data. Doc Searls (2m 49s): And we truly are in command of the monetization of it, and are able to participate in the world around us without immediately being correlated Dave Huseby (3m 1s): across time and space — immediately being tracked and falling victim to what I call casual surveillance, or corporate surveillance, right? Where, as your data flows through systems, corporations and businesses are able to observe that movement of your data; your information is aggregated; they develop some kind of behavioral model, psychological model, whatever; and then they're able to sell that to people who wish to manipulate you, sell you things, that kind of stuff. And a year ago was the last time we really talked, and I had set out on a mission to just solve full decentralization — moving away from centralized servers, moving away from even the DNS system. Dave Huseby (3m 49s): We even got so deep into our research that we were moving away from IP addresses. It's quite interesting research that we've done. But in the meantime, I had spent the previous four years at Hyperledger as their security maven, and I was deeply involved in the self-sovereign identity community — which is the Internet Identity Workshop, coming up here in April. I'll give a plug to that; it's a great conference. Yeah. Doc Searls (4m 19s): I co-organize it, and it was born in a podcast, actually — it goes back to 2004. Wow. It'll be our 33rd this spring, in April. Yeah, I've been going there for, gosh, almost 10 years now. Yeah. 
Dave Huseby (4m 39s): And I'm looking forward to it, because we're going to start showing actual running demos of what we're talking about here today. So in that push for decentralization and owning our data, and combining it with self-sovereign identity theory and application, we started looking at what the real problems were. Self-sovereign identity is supposed to be about: I get a credential — like a KYC credential, or an "I have been immunized for COVID" credential. By credential, I mean some piece of data that has been digitally signed by an organization and handed to me, and it either can establish my identity or is given to me such that I'm identified as the recipient of it. Dave Huseby (5m 29s): And the idea is that you should be able to take that data and use it in systems — like go to my grocery store, make a digital proof that I have had a COVID vaccine, and then access the grocery store. The problem with that is that the self-sovereign identity community has struggled with privacy significantly. They like to say that it is private, and it can be — what I want to say is that the architecture itself can be. Doc Searls (5m 59s): So, to pause there for a second. My understanding of SSI — and I've read and heard and talked with people about it, but I am not on the inside of any of the development efforts — is that in its simplest form, it's just individuals presenting verifiable credentials that are issued to them by somebody else. The idea is that it isn't that my driver's license is my ID; it's that if all you need to know is that I'm licensed to drive, here's a verifiable credential of that. Or if you just want to know I'm over 18, or old enough to get a COVID shot — which, in my case in California, I'm above the threshold for — Doc Searls (6m 41s): and you need to verify a credential, you get a verifiable credential. 
The idea is that I don't present my driver's license with all the other privacy exposures on it. But what you're saying is that's not enough. We need — yeah. Okay. So tell us about that. Dave Huseby (6m 54s): Well, one of the classic examples in SSI is when you go to, say, a bar: the bouncer asks to see your driver's license, and you're handing over verified data, right? This is data that the DMV has verified to be true about you, and it's presented in a form that's independently verifiable. They put it under the black light to make sure it hasn't been tampered with. But all that data on there is now available to the bouncer. And what they normally do now is just scan the barcode on the back, which captures all the data, right? It captures all that data into a database. They don't need it. All they need to know is that you're old enough to get in. And so this is the crux of the problem. The crux of SSI is to gain our privacy back — to engineer things so that we're in control of our own data and are masters of our own privacy. Dave Huseby (7m 54s): We have to be able to interface with the outside world without revealing the underlying data. So giving your driver's license, or a digital version of it, to a bouncer is bad, because you're giving the data away. Doc Searls (8m 9s): So let me pause there for a second, because I think the SSI case is that you're not giving them the whole driver's license; you're just giving them the part of it they need to know. I was thinking about this yesterday. In order to verify myself to Carbon Health, which is the company putting on the COVID shots — because I got my second COVID shot yesterday — there were a couple of authentications they did. They gave me a card that I had from the last visit, and the second visit wasn't filled out yet. And they scanned the barcode on the back of my driver's license. Now, I couldn't help thinking: what else is in that barcode? 
I mean, it could be for their sake — and probably it is for their sake, because more companies actually are looking to do data minimization, as you're calling it, on their end. Doc Searls (8m 58s): They really don't want to know your whole life. They're not selling your data to anybody else; that's not their business. They just want a verifiable credential. But the idea with SSI, as I understand it, is that I have a digital wallet with basically pulled-apart digital identity — not identifiers, but verifiable credentials. So here's the one that says I am known to Carbon Health and I've been here once before, and they would scan that barcode — or maybe not even a barcode; in some form, probably a QR code that I've generated myself. They would scan a QR code and go, yes. And that would check with somebody else — Doc Searls (9m 38s): maybe the DMV — and they'd say, yup, that's authentic, that's him. Right? That was the idea. Isn't that the idea? Anyway, an idea with SSI is that you're not disclosing lots of secondary data that is none of their business. In other words, it ideally minimizes what we might call "none of your business" — NOYB. But just so I'm clear on what you're saying: even that isn't good enough. We need — Dave Huseby (10m 9s): We need more, yeah. I set you up and you walked right into it. That's my job. So, to get back to the title of my piece, which is "The Authentic Data Economy," and to tie it into what you just said: you were talking about data minimization, and you're talking about systems that are using the actual underlying data. So you said, you know, you had proven yourself to them the first time; they gave you a card, then maybe a QR code that would just say, this is my identifier inside of the Carbon Health system. Dave Huseby (10m 49s): Right. Right. 
Doc Searls (10m 49s): Yeah. And actually, the first time I did have a QR code — I just don't recall whether it was in an app or in an email on my phone — and I could show them that. But this time I just showed them my driver's license, because I didn't have the other thing. Dave Huseby (11m 3s): Okay. That's real life. Doc Searls (11m 5s): Real life is you don't have the damn thing they gave you. Right. Dave Huseby (11m 9s): They just scan your driver's license. Yeah. So let me turn that knob on privacy a little bit higher and show you how you can actually do this in a way where you own your own data, there isn't any transmission of the underlying authentic data, but you still maintain the trust. Okay. So SSI is about getting some data that is independently verifiable. It's a piece of data that was given to you by the DMV or by your university; it's digitally signed by the university or the DMV. And therefore I can hand it to you, Doc, and you can look at it, go get the signing keys of those institutions, and verify that it is valid. Dave Huseby (11m 56s): Okay. So the lead-up in this paper is that we should start calling this authentic data, because there have been a lot of problems around the term self-sovereign — around all these terms we've been using. And so I'm just proposing that we start calling it authentic data, because that's what it is: it's data that comes with some degree of authenticity. All authentic data indicates where it came from, who it was given to, and that it hasn't been modified. And that can be your driver's license, right? That's authentic data, because it's tamper resistant and it came from the DMV. Or your diploma, for instance, from your university, which is all fancy and everything. 
Dave Huseby (12m 37s): So it's difficult to counterfeit those. But in the digital world, authentic data is data that is digitally signed by the issuing institution — the issuer — and given to the holder, and that's provable. And then there is digital tamper-proofing: the digital signatures themselves show that the data hasn't been modified. Okay. So once you get that data, you can do all kinds of things with it. It becomes a digital equivalent of the paper version, right? However, that doesn't preserve your privacy. To preserve your privacy, what you want to be able to do is receive authentic data about yourself, Dave Huseby (13m 21s): so that I possess it. So I receive my credit history and my credit score from, say, Experian. They digitally sign it, it's encrypted to me, it's tamper-proof. Now, if I want to go and apply for a loan, I don't want to have to authorize a broker to go get it, because it's none of their business, as you said. What do they need to know to properly price the risk of the loan to me? They just need to know the range my credit score is in — that it's greater than 800 or something. They need to know that I haven't had any bankruptcies in the last 10 years. Whatever it is, those are things that I could prove to them based off of the authentic data — my credit history — that I now possess. And those are called zero-knowledge proofs. Dave Huseby (14m 10s): Okay. And this is, again, "we're using crypto, so we're cool." It's a cryptographic technique that is combined with non-repudiable digital signatures. So when I make the presentation of the proof, I digitally sign it; they digitally sign that they've received it and send me back a receipt. Now we both have a record that this proof has happened. They now have all the necessary paperwork — digital data — to make their operational decision and have assurance over that. 
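The issue-present-verify loop Dave just walked through can be made concrete with a toy selective-disclosure sketch in Python. To be clear, this is not TrustFrame's scheme and not a true zero-knowledge proof: the salted-hash trick (similar in spirit to SD-JWT-style selective disclosure) only hides the undisclosed claims, and HMAC stands in for a real asymmetric signature. All names and values are illustrative.

```python
import hashlib, hmac, json, secrets

def sha(data: str) -> str:
    return hashlib.sha256(data.encode()).hexdigest()

def sign(key: bytes, msg: str) -> str:
    # HMAC stands in for a real asymmetric signature (e.g. Ed25519);
    # a real issuer signs with a private key anyone can verify against.
    return hmac.new(key, msg.encode(), hashlib.sha256).hexdigest()

# --- Issuer: salt and hash each claim, then sign only the digest list ---
issuer_key = secrets.token_bytes(32)
claims = {"name": "Alice", "birth_year": 1980, "licensed": True}
salts = {k: secrets.token_hex(16) for k in claims}
digests = sorted(sha(salts[k] + json.dumps([k, v])) for k, v in claims.items())
signature = sign(issuer_key, json.dumps(digests))

# --- Holder: disclose ONE claim plus its salt, alongside the signed list ---
disclosed_key, disclosed_val, disclosed_salt = "licensed", True, salts["licensed"]

# --- Verifier: check the issuer's signature, then that the disclosed claim
# hashes into the signed list; the other claims stay hidden behind hashes ---
assert sign(issuer_key, json.dumps(digests)) == signature
assert sha(disclosed_salt + json.dumps([disclosed_key, disclosed_val])) in digests
print("verified:", disclosed_key, "=", disclosed_val)
```

A real predicate proof — "score is greater than 800" without ever revealing the score — requires heavier cryptography, which is exactly the machinery Dave describes next.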
And they also have a cryptographic record that the transaction happened — without me revealing the underlying data. Okay. And part of the theory here — and this is something the SSI community hasn't really discussed until now — is that there are kind of two classes of problems. Dave Huseby (14m 58s): I don't have good names for them; I just call them class one and class two problems. Class one problems are problems where regulation requires disclosure. Think financial transactions, right? There's a certain degree of disclosure of who's sending the money to whom, and that's required by law. And the participants in any financial transaction have to pass certain background checks, like anti-money-laundering checks, that kind of stuff. Those are all class one problems. And I've got to say, SSI has that locked up. We could do that so easily, because it's just authentic data, and it's based on the real stuff. I can have an AML check done on me, Dave Huseby (15m 39s): get that check as authentic data, and hand it to my bank — my bank is good to go. The other class of problems, the class two problems, is basically everything else, where regulation requires privacy. And one of the great things about our society is that we actually default to privacy per regulation, right? If you're a business and you ask somebody for their personal information, there are consumer protection laws that bind you as a business owner. They're not very strong, obviously, but there are some protections. Dave Huseby (16m 20s): If we move in a direction where we're using zero-knowledge proofs for everything, not only do we free business owners from having to worry about those things — because the data they get does not actually contain PII; it's not even encrypted PII, it's zero-knowledge proofs based off of PII, the personal information. 
So not only do we free them, but we're also never transmitting our authentic data, our underlying data; we're just presenting cryptographic proofs based on that data as necessary to interact with the real world in a trustful way. And just back-of-the-napkin math: I think roughly 80% of interactions in the real world are class two, privacy-preserving transactions, where regulation requires some degree of privacy to be maintained. Dave Huseby (17m 13s): And the other 20% is class one. So the problem here — and I'm getting to my point — is that SSI, up until now, has not had a really good solution for class two problems. They cannot do zero-knowledge proofs efficiently enough. When underlying data is issued to me, my privacy rests in my membership in a cohort. So, like, you're a Carbon Health patient. Your privacy is actually based on membership in a cohort: if you could prove to them that you're one of their patients, but not reveal which one, then your privacy comes from the fact that they have millions of patients. Dave Huseby (18m 1s): Okay? If they only had two patients, you don't really have privacy, because you can prove to them, yeah, I'm one of your patients, and they're like, okay, well, you're not Joyce and you're not Katherine, so you must be Doc, right? So you don't really have privacy in a zero-knowledge proof system unless you have large cohorts. And the SSI community, up until just recently, could not do the math — the crypto — efficiently enough to have large cohorts. All the existing solutions in the community right now, the best they can do is something like one in 10,000 for your privacy. And in a lot of cases, that's probably good enough. 
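Dave's cohort argument can be shown in a few illustrative lines: a membership proof tells the verifier only "the holder is one of these," so your anonymity is whatever is left after the verifier rules people out. The patient names and cohort sizes below are just the examples from the conversation.

```python
# A membership proof only says "the holder is one of these members".
# Privacy is therefore the cohort size minus whoever can be ruled out.
def remaining_candidates(cohort, ruled_out):
    return set(cohort) - set(ruled_out)

big = remaining_candidates({f"patient-{i}" for i in range(10_000)}, set())
tiny = remaining_candidates({"joyce", "katherine", "doc"}, {"joyce", "katherine"})

print(len(big))   # 10000 -> the 1-in-10,000 anonymity set Dave cites
print(tiny)       # {'doc'} -> a tiny cohort identifies the holder by elimination
```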
Dave Huseby (18m 42s): But even then, the data that is generated is on the order of megabytes — actually gigabytes — per 10,000 pieces of authentic data. I know I'm getting really deep into the numbers here. But the point is that my co-founder Mike Lodder invented some new cryptographic techniques late last year that radically change what we're capable of. He is able to reduce the size of these kinds of proofs by eight orders of magnitude and increase the speed of computation by six. So now, when you went down to get your COVID vaccine, the proof to Carbon Health that you are one of their customers would only be 380 bytes, instead of the many megabytes of the existing solutions in the SSI community. Dave Huseby (19m 41s): And the reason this is important is that these new kinds of proofs are so small they can be printed directly into a QR code. They can be done on paper. And inclusion now becomes the main driver for this new method. Because now, if we want to give people self-sovereign identity and the privacy-preserving capabilities of this new model, this authentic data economy, we don't require the end users to have super smartphones. We can do proofs that are so small we can put them on pieces of paper, and we can put them in smart cards. And I've read a lot lately that there's a big concern about inclusion and equal access to this technology Dave Huseby (20m 28s): all throughout the world, where smartphone access — technology access — is limited. And I'm just super excited about this new technology, this new technique. And, like I said — Doc Searls (20m 40s): And you're going to be showing everybody. Okay, so we know who you're talking about, or what you're talking about: you're now working for a company here. What's the name of the company? 
Dave Huseby (20m 53s): The company is TrustFrame. It was co-founded by myself, Mike Lodder, Cam Geer, and Rick Cranston. So some old names. Yeah. Doc Searls (21m 4s): Yeah, I know Cam. And is it trustframe-dot-something — .com, .org? Dave Huseby (21m 12s): Yeah, we don't really have a website there yet. It's just our logo at this point. Doc Searls (21m 15s): I see it. There it is. Okay. Dave Huseby (21m 19s): But we are launching — we are going to put all this stuff online here in the next couple of weeks. Doc Searls (21m 24s): Okay. So is there a name for this compressed form of crypto, or zero-knowledge proof, that you're doing? Dave Huseby (21m 35s): We call it accumulator-based privacy, because the underlying cryptographic construct is called a cryptographic accumulator. I could go into the math of it, but the trick — what's really cool about this — is that if you're an issuer, let's say you're Carbon Health, you could issue authentic data: here's your patient ID, here's your patient ID, here's your patient ID. You could issue those out to all of your patients. And if you had a billion patients, you would take each one of their IDs and put them in this cryptographic accumulator. Dave Huseby (22m 16s): And because it uses what is essentially modular math, it never gets any bigger. It's fixed size, whether there's one credential or a billion credentials. Doc Searls (22m 25s): Okay. So you have a way to scale zero-knowledge proofs that wasn't possible Dave Huseby (22m 35s): earlier. Yes, exactly. To global scale, by the way. We've been running tests where we're issuing credentials at something like 10 million per second, and this could easily scale to a world of 10 to the 15th power active credentials — billions of issuers issuing billions of credentials. 
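The fixed-size property Dave describes can be sketched with a toy RSA-style accumulator in Python. This is a classroom illustration of the general construct, not TrustFrame's scheme: the parameters here are tiny, insecure demo values, members are mapped to primes with a naive stand-in for hash-to-prime, and a real modulus would be 2048+ bits with an unknown factorization.

```python
# Toy RSA-style accumulator: one fixed-size value accumulates any number of
# members, and a per-member "witness" proves membership without listing the set.
N = 10007 * 10009   # demo modulus (product of two primes; factors would be secret)
g = 3               # demo generator

def is_prime(n: int) -> bool:
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def to_prime(member_id: int) -> int:
    # naive stand-in for a proper hash-to-prime mapping
    p = member_id
    while not is_prime(p):
        p += 1
    return p

def accumulate(primes) -> int:
    acc = g
    for p in primes:
        acc = pow(acc, p, N)   # acc = g^(product of primes) mod N
    return acc

members = [to_prime(i) for i in (101, 211, 307, 401)]
acc = accumulate(members)      # same size whether 4 members or 4 billion

# Witness for one member = the accumulator over everyone EXCEPT them
target = to_prime(307)
witness = accumulate([p for p in members if p != target])

assert pow(witness, target, N) == acc          # member verifies
assert pow(witness, to_prime(999), N) != acc   # non-member fails
print("accumulator size:", acc.bit_length(), "bits, independent of member count")
```

The verification check works because witness^target = g^(product of all member primes) = acc, and that value stays bounded by the modulus no matter how many members are folded in — the "never gets any bigger" property from the conversation.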
Doc Searls (22m 55s): So you're kind of eliminating friction on the credential-issuing side. That's right. Yeah. Okay. Dave Huseby (23m 2s): It's way more than that, right? So there's other stuff. Mike is the crypto guy; my half of this is the architecture. We're moving away from the traditional three-party model in SSI, where you have the issuer, the holder, and the verifier, to a new system where there are four parties. You still have the issuer, the holder, and the verifier, but we have this other party called the aggregator. And it's their job to solve the many-to-many problem: if there are going to be millions and millions of issuers and potentially millions and millions of verifiers, how are all of those verifiers going to know about all of the issuers? Because the issuers have to publish some data that the verifiers have to receive in order to be able to accept zero-knowledge proofs from holders. Dave Huseby (23m 54s): So you really need this aggregator. And the aggregator is also key in the sense that they will take the rules from our human-scale trust system. Like, what does a valid age check to buy alcohol look like in Las Vegas, Nevada? There's a rule about it; there's a law about that. Or what does a valid COVID immunity check to get into a concert in New Orleans, Louisiana look like? You can formulate that as a set of rules, and aggregators would be responsible for getting those rules to all of the verifiers. Plus, they also have a role in vetting the issuers. Dave Huseby (24m 37s): And this is a fundamental shift in thinking in the SSI community. Because, I think due to the limitations in the cryptography, in the past, organizations like the Trust over IP Foundation have always thought about this as: issuers will be permissioned. The word we use is permissioned. 
So you would have a closed network, and the organizations that can issue data have to sign contracts and submit to a governance model. You know, like universities issuing digital diplomas would join a coalition of universities doing this; they would sign a contract. Dave Huseby (25m 18s): They would be bound by all the legal and policy agreements for this kind of system. And I think that was, like I said, largely due to the limitations in cryptography: they couldn't do privacy-preserving credential presentation at scale. Okay. Now, with this huge improvement, we're able to move to a much more global system — one aligned with how society really works — where all the issuers are unpermissioned. Anybody can be an issuer, right? At this point, anybody can be an issuer, because you can do it on your cell phone, you can do it on your laptop. Dave Huseby (25m 59s): Like, my local farmer's market might want to do a digital loyalty program — they can set themselves up as an issuer, and then there's an aggregator, and all of the merchants in the farmer's market are verifiers. Right? The scale is so much bigger, and the cryptography is so much smaller, that we can now scale the entire system to global proportions. And that means we're going to be facing a problem of potentially millions or even billions of issuers trying to publish data that billions of verifiers will need access to, so that the billions of humans who are holders — or IoT devices, or cars, or whatever else holds data — can then interact with verifiers. Dave Huseby (26m 49s): So it's a whole new scale. So, Katherine, is this all making sense to you? Katherine Druckman (26m 54s): Yeah. Yeah. 
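The four-party architecture Dave lays out — issuer, holder, verifier, plus an aggregator relaying issuers' published data and human-scale rules — might be sketched like this. Everything here is illustrative: the names, the rule format, and the plain digest set standing in for a real cryptographic accumulator.

```python
import hashlib

def digest(s: str) -> str:
    return hashlib.sha256(s.encode()).hexdigest()

class Issuer:
    """Issues credentials to holders and publishes verification data once."""
    def __init__(self, name: str):
        self.name, self.issued = name, set()
    def issue(self, credential: str) -> str:
        self.issued.add(digest(credential))
        return credential                     # handed directly to the holder
    def publish(self) -> dict:
        # a real issuer would publish an accumulator value, not a digest set
        return {"issuer": self.name, "valid": frozenset(self.issued)}

class Aggregator:
    """Solves the many-to-many problem: relays issuer data and rules to verifiers."""
    def __init__(self):
        self.feed = {}
        self.rules = {"age-check/NV": "issuer must be a vetted DMV"}
    def ingest(self, publication: dict):
        self.feed[publication["issuer"]] = publication["valid"]

class Verifier:
    """Accepts a holder's presentation using only the aggregator's feed."""
    def __init__(self, aggregator: Aggregator):
        self.agg = aggregator
    def accept(self, issuer_name: str, presented: str) -> bool:
        return presented in self.agg.feed.get(issuer_name, frozenset())

dmv, agg = Issuer("nv-dmv"), Aggregator()
held = dmv.issue("over-21:true;nonce:8f2c")   # the holder is the data's conduit
agg.ingest(dmv.publish())                     # issuer -> aggregator -> verifiers

bouncer = Verifier(agg)
print(bouncer.accept("nv-dmv", digest(held)))        # True
print(bouncer.accept("nv-dmv", digest("forged")))    # False
```

The point of the extra party is visible in the data flow: the verifier never contacts the issuer, and the holder's credential never passes through the aggregator.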
It's interesting — in this explanation, you actually answered one of my questions, which was: how does this address the limitations in what we have now? You already covered that, so I'm kind of like, well, it's going well, I've got nothing. I like the idea that you are now able to use examples of real-world use that are, as you say, inclusive and accessible to most people — literally most people. And I think that's sort of the game changer here. Katherine Druckman (27m 36s): You know, not everyone can figure out how to manage a crypto wallet, or whatever other references we can think of. But a sticker with a QR code is a little bit different. Dave Huseby (27m 50s): Right. Well, and this is going to move us in a direction, I hope anyway, where we are actually the primary conduits of our own data. So the authentic data economy maintains at least the efficiency of — and is probably more efficient than — the existing e-commerce economy that we have, the surveillance economy that we have, except it turns the whole thing on its head. Instead of using large-scale databases to transmit all of our information through the system to allow us to interact with it, I become the conduit of my data. I possess my data. And then, because I'm not necessarily disclosing my data, I'm doing zero-knowledge proofs wherever possible. Dave Huseby (28m 37s): There are some cases where you have to disclose some data — there's usually biometrics involved, like a photograph or something that would have to be revealed — but those are limited use cases. I become the conduit, right? So again, back to my example where I apply for a loan. If you do that today, your broker says, okay, I need you to authorize me to go and get your credit history. Why do they do that? 
It's because the only way for them to trust the data is to get it from the source, because we do not have an authentic data system today. So you have to sign an authorization; they then go to Equifax or whoever and get your credit history. Dave Huseby (29m 20s): And then they are able to judge your creditworthiness for the loan. Well, now, because authentic data is possible, I can get it, and I can then make proofs to that broker. The broker then doesn't care how they received the data, because it's independently verifiable — it's authentic. It comes with its own guarantees of authenticity; it comes with links back to the issuer. They don't actually have to see the underlying data, but they can trust it, because the trust is baked into the authentic data. And I am the conduit through which it transmits to that loan broker. And this applies to basically everything. Dave Huseby (30m 3s): If I have a ticket to get into a concert, or whatever — I become the bearer of my data, right? And I've been really excited — I've been trying to get ahold of Tim Berners-Lee. Maybe, Doc, you can talk about this — or you, Katherine. I know you guys have followed what they're doing with Solid, and people having pods. It's a great idea. The pod then becomes where you store your authentic data, and the only way to get the data out is through this privacy-preserving presentation system. And I think the reason they haven't seen a lot of adoption is that there were limitations in the cryptography — but those limitations have been solved. Dave Huseby (30m 46s): And we've had cryptographers look at it, and universally the reaction has been, oh my word, right? This is huge. 
Katherine Druckman (30m 57s): So I may have kind of missed something here, but going back to your example about, let's say, a credit report: what do you do in the use case where you're dealing with an entity who actually does want all the underlying data — who wants to suck up as much data as they can? Now, I realize the whole point of this is to empower ourselves to resist that. But how do you take it back and enforce it — no, I don't want you to have all of this underlying data? How do you make that shift, when the people who are currently consuming the data seem to have a little bit more power — maybe a lot more power — than the rest of us? Dave Huseby (31m 35s): Well, you know, this isn't going to be the magic band-aid, right? It's not going to suddenly — Katherine Druckman (31m 44s): I hoped it would be. Dave Huseby (31m 47s): Like all things, time is actually on our side on this one. When I was at Mozilla, I did a lot of research on web surveillance and how companies track you across the web based on your browser, and your installed fonts, and your clock skew, and your operating system, and all these things. And one of the things we did to advance this research was to assign a reasonable expectation that a given piece of data will be stable over time. So some things never change: the WebVR standard reveals your interpupillary distance, which is stable through the course of your natural life, right? Dave Huseby (32m 30s): So that's an extremely risky piece of data to give away, because it doesn't ever change. But something like what time zone I'm in — I travel a lot, I change time zones all the time. The time zone I'm in today is not necessarily the one I'm going to be in tomorrow. So the lifespan of that data is on the order of hours, days, maybe weeks. And that's true about all of our personal data. We tend to change constantly throughout our lives. 
As humans, we change homes; we change affiliations with employers and social clubs; we change credit cards; we change banks; we change the car we drive. Pretty much everything about ourselves changes. Dave Huseby (33m 14s): And so what I hope to do with the authentic data economy is to draw a line in the sand — or, more appropriately, a line in the timeline of history — where this is where my data stops flowing through the systems and instead flows through me. And the farther along we get in time, the older the data prior to that line gets, and the less relevant it is to me. So over time, my privacy will actually get greater, as the old data they already have on me becomes more and more stale. Dave Huseby (33m 58s): And so, yeah — it has to start sometime, right? And what I'm excited about is that the research in decentralization and this huge breakthrough in cryptography are going to allow us to move the SSI community to become the authentic data community. And we'll be able to talk about privacy-preserving systems all the time. This gets back to my principles of user sovereignty, where privacy is on by default — absolute privacy for people. That's the first principle of user sovereignty. And in the authentic data economy, we can actually do that, right? It starts off being private by default — absolutely private by default. Dave Huseby (34m 41s): And then we get to choose our associations in transactions, or with businesses, based on how comfortable we are with how much data we will share. And, you know, you're right: there are going to be some companies that just say, well, you have to give us all your data to participate. There's always going to be a power imbalance. However, I think we'll quickly find — 
And, you know, it's kind of unfair to talk about Joyce when she's not here, but she once told me that if you can get it so that the customer owns their data and controls identifying themselves to a retailer or, you know, a business, it kind of inverts the loyalty. Dave Huseby (35m 26s): You know, they no longer want to offer 20% off to a first-time customer. They want to offer 20% off to a long-time customer, because you can prove that you are a long-time customer. They can reward your loyalty, because it's now authentic data and you can prove it. Doc Searls (35m 45s): Okay, so I have a whole bunch of questions, but I'll put some of them in the chat and you can just look at them there. For the people listening, Joyce is my wife, and maybe not the whole audience knows that. She's very involved in the SSI world and has been pretty much from the start. One is, okay, how does this look to people? I mean, a problem I've had with SSI from the start is not with SSI itself, because I think theoretically it's fabulous. It's kind of what, in the very old days of computing, they called a knife-edge rollover, or a forklift upgrade: you take the entire old system out and put a new system in, and that just generally doesn't happen. Doc Searls (36m 29s): Right. So, practically speaking, okay, it's four or five years from now. Presumably we all still have phones. We still have our own compute nodes that may be logical or physical, but it's a place that, you know... What do we have now that we did not have in 2021 that makes our lives easier? Is it a wallet? What is the thing that I have? Maybe it's just an app. Maybe it's something embedded in every app. I don't know. I want to know how it works for people, right?
Doc Searls (37m 11s): In a practical sense. Dave Huseby (37m 14s): Well, you know, phone security is crap. I mean, yes, like iOS: Apple has really great hardware-assisted security. But privacy, right? Doc Searls (37m 27s): How much privacy do you have when your location is being leaked out? I mean, so here's an interesting story. Dave Huseby (37m 33s): That's what I'm getting at, right. So I don't see it as my phone. Like, I want to minimize the data my phone has, honestly. Doc Searls (37m 39s): Yeah. So, for example, yesterday I was in a suburb of Los Angeles, and I'm driving in, and it's early in the morning, and my Google Maps says there's a traffic jam on this side street that I'm about to turn on. There was one parked truck on that side street, and it's somebody who's a grounds worker. Okay, what that told me is that person has their phone on in a car that isn't moving, and that tells Google there's a traffic jam there, because the car's not moving. Now, when I make the deal, I would like to be able to broadcast, for example: Doc Searls (38m 24s): You know what, there are four providers out there of cellular connections. As a customer, I'm saying, I want to, not so much opt out; I don't want to opt out of anything. I'm willing to give my location data to you on the basis of these conditions, which is: I know it's anonymized, it's not being shared with the cops, it's, you know, whatever else it might be. There's some collection of things by which we might feel okay. I mean, I have no problem sharing my traffic data in order to continue to contribute to the traffic system, whatever that is. Dave Huseby (39m 4s): You have to log in with an identifier to provide it. Doc Searls (39m 8s): Actually, I don't. I just have to happen to be a customer of T-Mobile. Right? Yeah.
I mean, I've got an iPhone, and the iPhone may be as private as could be, so if I lose it and somebody gets it, they can't crack into it. But I'm still telling them that. You know, that's my relationship with Apple. It's not my relationship with T-Mobile. Right? T-Mobile has private deals with Google and everybody else that's in the traffic business, where they're probably getting money for that data. I'm guessing; I don't even know. Nobody knows what that business is, or if it even is a business. It may just be that they give it away. I don't know. But I think part of operating with full agency as a human being in the digital world is having more control over what others know about you, and anonymizing that, right? Doc Searls (39m 54s): You know, that my exposure is inherently limited. Right? And the truth is, you could turn off all the location data on your phone that you want, but T-Mobile still knows where you're going. Right? And of course they have to, because you're moving from cell to cell. There's nothing wrong with that. They do it for the purposes of network optimization. They have to be able to do that. That is how the cellular system works. But, you know, part of why we have privacy problems in the digital world is that quote-unquote "choice" there has no distance and no gravity. We're not dealing with the real world here, but there's still location, right? Doc Searls (40m 40s): In the physical world, they are doing this other stuff. But it's almost like a magic hat trick: you don't know what they're really doing over there. It is a shell game. I don't know what you're doing with the pea under your shells. I'm just trusting that you're not screwing me, which they don't.
I mean, there are very few... You know, in the law business, they talk about harms. Is there a harm? No? Okay, there's not a problem. But there's still a sense of having your pants down. Dave Huseby (41m 11s): Right. Well, let's see, you said something really great in there about our autonomy in a digital world. That's what this whole authentic data economy philosophy and design and technical capability is all about, because it does make me the conduit of the data, and all of the presentations are zero-knowledge Dave Huseby (41m 34s): proofs. So it's not the underlying data itself; it's some proof about it. You said in chat, is this a better way to do SSI or the only way to do SSI? And then later you asked, is this like a forklift upgrade? And I would say that this is a better way to do SSI. And I have to say, on the four-party model, I have had lots of conversations with companies in the industry that are using SSI, and whenever I bring out the four-party model, it's typically met with, yeah, we've always thought there was something missing, you know, or we've already decided that there had to be this other entity. It's been called RegTech. Dave Huseby (42m 15s): It's been called many different things. So who are they working for? Well, in many ways you can think about this as developing payment rails, right? Credit card systems. So you can think of the aggregator as the Verifone in this system, right? They're the ones that will operate the verification hardware. They will provide an SLA to verifiers. So the way I see this working, and this gets to one of your other questions, which is how do you make money: well, say I'm a business, say I own a baseball park. I don't own a baseball park, by the way; it would be awesome.
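Dave's point that presentations are zero-knowledge proofs rather than the underlying data itself can be made concrete with the simplest member of that family, a Schnorr proof of knowledge: the verifier learns that you know a secret behind a public value, never the secret itself. This is a toy sketch with illustrative parameters (the group and generator are chosen for readability), not the pairing-based credential proofs production SSI systems actually use:

```python
import hashlib
import secrets

# Toy parameters: a Mersenne prime field. Real deployments use
# standardized elliptic-curve groups, not a bare prime like this.
P = 2**127 - 1   # a Mersenne prime
G = 3            # assumed generator (illustrative)

def prove(secret: int):
    """Non-interactive Schnorr proof (Fiat-Shamir) of knowledge of
    `secret` for public = G^secret mod P, without revealing `secret`."""
    public = pow(G, secret, P)
    r = secrets.randbelow(P - 1)
    t = pow(G, r, P)  # commitment
    # Challenge derived by hashing, so no interaction is needed.
    c = int.from_bytes(hashlib.sha256(f"{t}:{public}".encode()).digest(), "big") % (P - 1)
    s = (r + c * secret) % (P - 1)  # response
    return public, t, s

def verify(public: int, t: int, s: int) -> bool:
    # Recompute the challenge and check G^s == t * public^c (mod P).
    c = int.from_bytes(hashlib.sha256(f"{t}:{public}".encode()).digest(), "big") % (P - 1)
    return pow(G, s, P) == (t * pow(public, c, P)) % P
```

The verifier sees only `public`, `t`, and `s`; the secret never leaves the prover, which is the property Dave is describing for credential presentations.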
Dave Huseby (42m 58s): And I have to take tickets, and I have to be able to check COVID immunity and whatever, you know, whatever my municipality, my state requires me to check to operate a baseball park. I don't want to have to worry about every potential healthcare provider issuing COVID credentials. I don't want to have to worry about every possible issuer of any kind of credential I would have to check. What I want to do as the owner and operator is to subscribe to a verification service, and that is an aggregator. So I'm paying a monthly fee or, you know, a subscription fee to get the checking hardware, or the software that runs on our existing hardware. Dave Huseby (43m 43s): And then the SLA, the service level agreement, is that I will get any updates to check rules. I will get a steady stream of, you know, signing keys and non-revocation proofs and stuff like that from all of the issuers: all of the healthcare providers, all of the, you know, KYC vendors, whatever. All of those issuers' data that I need to have to be able to verify zero-knowledge proof presentations will flow through the aggregator to me, and I'm just going to pay for that. Right? And the same thing with issuers. So one of the things I think the SSI community did wrong from the beginning is they said you can't charge for issuance, right? Dave Huseby (44m 25s): And I think that's just fundamentally wrong, because when an organization has to verify some data about somebody and then issue a verifiable credential, or, you know, authentic data that they stand behind, there's some risk in that. And you have to monetize that risk or that business model doesn't work. Whether you monetize it directly by charging for issuance, or you monetize it some other way, you still need to monetize that risk. And I think the SSI community up until this point has never had an answer for that.
And I think in the authentic data economy, it makes total sense that you would charge for issuance. So in the case of, like, COVID credentials, I assume it will be a billing code on your health insurance when they issue it, you know, data record creation and maintenance or something, right? Dave Huseby (45m 14s): And that will cover their costs of publishing their signing keys in their key event logs, publishing the non-revocation proofs, which are set proofs that my credential is part of the valid set of COVID credentials they have issued. Right? They have to stand behind that. And if they ever find out that that's not true for me, then they have to revoke it. They have to remove my credential, my specific credential, from the set of valid credentials. And that's a really great segue into your other question of, is this a forklift upgrade? Dave Huseby (45m 54s): One of the fun things that we've been doing is wrapping credentials from existing SSI vendors in a revocation proof. The biggest problem here is that revocable credentials are very difficult to do without this cryptographic accumulator technique. So being able to have privacy: say I'm issuing a million credentials. In the Carbon Health example, I'm one of your patients, I'm one of 10 million patients, so you don't need to know which one, just that I am one, and I have real privacy then. But then if I move health providers, they need to revoke that credential so that I can no longer prove that I'm a Carbon Health patient, because I'm no longer a Carbon Health patient. Dave Huseby (46m 44s): So revocation is actually really hard, until Mike made his breakthrough. And so we've been taking a lot of existing systems in the market and wrapping them. We have a solution where we take existing credentials that were issued and wrap them in a revocable piece.
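The non-revocation proofs Dave describes, set proofs that a credential is still in the issuer's valid set, can be approximated with a Merkle-tree membership proof. This is only a stand-in: the cryptographic accumulators he's referring to also hide which member of the set you are, which a plain Merkle path does not, but the shape of the idea (a compact proof against a small published root, revocation by rebuilding the set without your leaf) is the same:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_levels(leaves):
    # Hash each credential ID into a leaf, then pair-hash up to the root.
    levels = [[h(leaf) for leaf in leaves]]
    while len(levels[-1]) > 1:
        cur = levels[-1]
        if len(cur) % 2:  # duplicate the last node on odd-sized levels
            cur = cur + [cur[-1]]
        levels.append([h(cur[i] + cur[i + 1]) for i in range(0, len(cur), 2)])
    return levels

def membership_proof(levels, index):
    # Collect the sibling hash at every level, noting if it sits on the right.
    proof = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        sib = index ^ 1
        proof.append((level[sib], sib % 2 == 1))
        index //= 2
    return proof

def verify(leaf, proof, root):
    # Rebuild the path from the leaf; it must land exactly on the root.
    node = h(leaf)
    for sibling, sib_is_right in proof:
        node = h(node + sibling) if sib_is_right else h(sibling + node)
    return node == root
```

The issuer publishes only the root; revoking a credential means rebuilding the tree without that leaf, after which the old membership proof no longer verifies.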
And so we're providing an off-ramp for companies who have already invested heavily in SSI and are either running into scalability problems, or their use case now needs revocation or something. We're able to offer this as a transition. And eventually the idea is that the old issuance system will move towards an authentic data economy issuance system. Dave Huseby (47m 29s): And it'll just have revocation built in, it'll have privacy built in. So it's not a forklift upgrade. It's not a knife-edge rollover. It's not like we have to throw away everything and do something new. There are intermediate steps that companies that have already invested in this can take. And for anybody who's new to this, the requirements for rolling out this technology have now shrunk significantly. I could easily see, like, a little farmers market. You know, the operator of the farmers market has an app that's an issuer, and then everybody can download the farmers market app, and all the merchants have one. Dave Huseby (48m 10s): And it's really small and easy to do, because the computation is so much more efficient. We can operate in deeply embedded environments with, you know, kilobytes of RAM and 8-bit CPUs and things like that. Or they could even just print paper ones, right? Again, these proofs are so small that you could print paper membership cards for all your members. It's just a huge, huge improvement in the cryptography and what it takes to build these systems. I had one other point that I kind of ran over. Dave Huseby (48m 52s): I don't remember... Oh yeah, I remember it. Dave Huseby (48m 55s): You asked, what does this look like for end users? Is it a wallet? Is it an app? I would argue that you won't know that you have a wallet; it's built into every app.
So if I have an app that is for my gym, for instance, and it tracks my workouts, my personal data will be stored in the app on my phone. My gym will issue it to me. You know, like if I go up to a treadmill, or whatever, and I run a mile, then I get "you ran a mile" as authentic data that goes into my phone, and this becomes authentic health data that I could leverage anywhere else. Dave Huseby (49m 38s): Right? To get better insurance rates, or at my annual physical, or things like that. You know, I can show my exercise regime and what I've been doing. I don't think you're going to see an explicit wallet anymore. I think it will just be built into everything. There might still be a wallet for things like digital ID cards, like your digital passport or something, but everything else I think will just be built in. So, sorry. Doc Searls (50m 7s): It's changing the character of the data itself. Dave Huseby (50m 10s): Right. It actually falls in line with Tim Berners-Lee's idea that we'll all have a pod that will store our data, and then we'll use that to interact with the world. We now have the cryptographic solution in there. Doc Searls (50m 22s): I mean, I have a pod; it was an early pod. I'm a pod user, but I've never used it. Yeah, I'm a pod person, but I haven't used it. Maybe there is a use for it, and I haven't looked at it in a while. I mean, I want to support Solid. I want to support what you're trying to do. Right? And by the way, I also have an SSI wallet from one of the people that issues a wallet, and I don't use that either. Right? But if what you're saying is, okay, so I have my gym app... I mean, Apple's full of health apps.
I've got an Apple Watch here because I want to watch what my heart's doing, and there are several apps that provide this, and Apple does a pretty good job of saying, that's your data. Doc Searls (51m 11s): It's not our data. We don't know what your heart's doing. That's up to you. Dave Huseby (51m 15s): They use the Secure Enclave on their phones to keep that data, so that if someone stole your phone, they can't get it. Exactly. I do commend Apple; they have really good hardware-based security for storing your data securely. But I wish they would open it up to everybody. Right? Doc Searls (51m 31s): That's the Apple way, right? They want to be exclusive. But it's sort of an interesting thought: if Google or Samsung or somebody else that's on the Android side of things wanted to do the SSI thing in a way that's more universal, they could adopt your approach, I imagine, where the data by its nature is authentic in the first place, in the way that you describe. Yes. Which is more lightweight and easier to put to use, because the cryptographic burdens on proving it are lower. Dave Huseby (52m 10s): Yes. And we'll have the infrastructure to support independent verification of any data as authentic, right. Doc Searls (52m 17s): So is that your business, having this independent infrastructure? Or what is your business? Dave Huseby (52m 23s): So we are an open-core business. We have an open source side, and then we have enterprise upgrades that you can license from us, essentially. But we're fast moving into a business model that I like to call the decentralized-core business model.
We have a number of ideas for apps and startups that are informal: collectively there's a service provided, but the members that provide that service are in informal relationships, and they all get paid based on what they contribute to the group. So think of how Bitcoin mining pools are a decentralized-core business. Dave Huseby (53m 6s): So anyway. Doc Searls (53m 8s): Interesting. Yes. Dave Huseby (53m 9s): What we're doing is really just standard open-core stuff, you know, releasing open source, showing the open code and everything. And then we have enterprise licenses and, you know, consulting services, support, that kind of stuff. Doc Searls (53m 28s): So, this has been a very fast hour. Dave Huseby (53m 32s): I didn't even realize. I was thinking that maybe this is like the last time we talked, where there was just too much and we should do a different one as a follow-up. I thought we were just getting started, and suddenly it's been an hour. Doc Searls (53m 45s): Yeah, we're hardly getting started, and we could go longer. So, okay, people can read your paper; we can put that in the show notes. This is the authentic data economy. I hate to say go-to-market, because what you're really talking about is a bit of a sea change in the market more than a way of going to the market. I mean, is it a problem of heaviness? Which is that, you know, identity itself, which we've been talking about since 2004, has never been resolved. Doc Searls (54m 32s): I mean, it's not so much that it's a problem. It's just that it's a characteristic that cuts across a great many concerns, right?
And then there are all these defaults in the physical world where, you know, we assume our identity is a printed credential, a rectangle that's presented, that's given to us by an authority: the school, you know, whatever. And it's not something that's truly ours, but we're not aware it's a problem. I think that's sort of the thing. But it seems to me there are a lot of areas where there's an efficiency in here that may be scalable that we're not thinking about. Like, for example, consents, right? Doc Searls (55m 21s): We're consenting all the time. Online consent is a horrible non-system, you know, where every single site we go to is asking us to consent to being followed everywhere. Whereas if you have a global preference... There are ways of doing this right now, but they're a hack on the cookie system. There might be a better way of doing that, where maybe the most important data we carry are our simple privacy preferences that we need others to respect, that are tied up with the circumstantial basis of a lot of our interactions. Like: I am a citizen of California, I am licensed to drive, I do go to the school, I did buy this ticket. The "I" has more meaning because it's authentic, in the sense that I'm in sole possession of it and I'm operating with full agency, but within a systematic framework that everybody is beginning to understand. Doc Searls (56m 16s): Right. Dave Huseby (56m 19s): Your consent should be authentic data itself, self-issued. Doc Searls (56m 22s): Self-issued data, right? Dave Huseby (56m 24s): It's: this is mine, and I can prove that it's mine, and I'm the only one who's capable of making this data. Doc Searls (56m 30s): I had an idea. Here's an idea.
It's an idea I hate, but I have to share it even though I hate it, which is: the awful system we have now is that you go to example.com, and example.com presents you with a cookie notice with a big green button that says "accept our use of cookies," and another one that says you can control what cookies are set here. And then you get to another pop-over page that has a whole bunch of switches on it that are defaulted to on for advertising and other tracking-based cookies. Not in all cases, but in a lot of cases they're defaulted on. And then when you make your choices... okay, they call them your choices, but they're not your choices. Doc Searls (57m 11s): They record it. You don't have any evidence. You leave, and you're like, what happened? I have no idea. Right? But if what you get with that is a verifiable credential that says, I have made these choices, I have recorded these choices, then there's the implicit contract in that that says: if I go back and find out you guys are busy selling my data even though you said you wouldn't, I have some proof here, and we can go into ODR, which is online dispute resolution. That's a whole other field, but it's a real, legitimate one, because we're going to need ODR for disputes. Doc Searls (57m 52s): So, in a way, what you can do is scale up the doability side of ODR in the most contentious part of the world right now, which is that everybody hates signing these things. I've talked with three or four people in the last day who said, oh, I always click accept on those things, because I know I'm screwed. That's what the GDPR really accomplished, isn't it? Yeah. Dave Huseby (58m 17s): So, one of the things that we're changing... So yeah, classic SSI is about what you are, not who you are, right? That's where it starts.
And there's no reason why you couldn't publish self-issued authentic data about your basic privacy preferences. I would love it if there were some kind of standard, lightweight user-license kind of thing, so that businesses online can start accepting: here are my privacy preferences, and they're fairly standard, right? I would love that. One of the other innovations we have is this thing we created called a micro-ledger, which in and of itself becomes authentic data about every transaction. Dave Huseby (59m 4s): Because we're doing transactions with zero-knowledge proofs, or very limited authentic data, we can turn the record of the transaction, every message between the counterparties, into authentic data by signing every message between the parties. Now, if every message between the parties is signed using a digital signature that follows the wet-signature-equivalence laws in whatever legal jurisdiction you're in, you have essentially a contract, right? It's a digitally enforceable contract of exactly what happened: I went to this website, this website sent me this message. Dave Huseby (59m 48s): It was digitally signed by them. I verified it. I then responded with a message that I had digitally signed, and then they responded, right? So we go back and forth. And if that conversation is "what are your privacy preferences," "here are my privacy preferences," "okay, we accept your privacy preferences," and I say, "okay, I want to participate," then both the business and I have digitally signed records of that interaction between us, and they would be wet-signature equivalent. I mean, there are laws in all 50 states in the United States that dictate what has to be in place for a digital signature to be equivalent to a wet signature, you know, one handwritten on a piece of paper. Dave Huseby (1h 0m 37s): And we could push this technology in that direction, and people will accept it.
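The micro-ledger Dave describes, where every message between counterparties chains to the previous one and carries the sender's signature, can be sketched like this. One loud caveat: HMAC stands in for a real digital signature here because it's in the standard library, but it needs a shared secret and so would not be wet-signature equivalent; an actual system would use asymmetric keys (e.g. Ed25519) so third parties can verify the record:

```python
import hashlib
import hmac
import json

def sign_message(key: bytes, prev_hash: str, payload: dict) -> dict:
    # Each entry chains to the previous entry's hash and carries the
    # sender's MAC over the chained body.
    body = {"prev": prev_hash, "payload": payload}
    serialized = json.dumps(body, sort_keys=True).encode()
    body["mac"] = hmac.new(key, serialized, hashlib.sha256).hexdigest()
    return body

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def verify_ledger(ledger, keys) -> bool:
    # Re-check every hash link and every MAC, alternating counterparties
    # (even entries from party 0, odd entries from party 1).
    prev = "genesis"
    for i, entry in enumerate(ledger):
        body = {"prev": entry["prev"], "payload": entry["payload"]}
        serialized = json.dumps(body, sort_keys=True).encode()
        expected = hmac.new(keys[i % 2], serialized, hashlib.sha256).hexdigest()
        if entry["prev"] != prev or not hmac.compare_digest(entry["mac"], expected):
            return False
        prev = entry_hash(entry)
    return True
```

Because every entry commits to the hash of the one before it, neither party can later insert, drop, or alter a message in the consent conversation without verification failing.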
I'm certain, because the underlying data is not fake; it's authentic. So you can do business contracts and stuff based on the information, because the information is independently verifiable, and you are trusting the data as much as you trust the issuing organization. So if it's a presentation of zero-knowledge proofs based off of data that came to me from Equifax, that presentation links it to authentic data given to me by Equifax. Dave Huseby (1h 1m 17s): So the question is, do you trust Equifax, not do you trust me. You can verify that I didn't modify it, you know that it came from Equifax this way, and that they digitally signed it. They're standing behind the data. And so it changes the way we think about data in all kinds of transactions, because what we've successfully done is taken human-scale trust, societal trust, and digitized it. Now, the problem here is that this could turn into the mother of all digital rights management systems, right? And it probably will to a certain degree. But if we do this unpermissioned, with the issuer and the aggregator, the four-party model, I think we have enough decentralization that this system becoming sort of the tool of the oligarchy, right, Dave Huseby (1h 2m 16s): of 1984, is limited, because I could see there being publicly run aggregators, for instance. You know, maybe this is the future of Mozilla. I could see an open source verifier app; there needs to be a Firefox of verifiers. There needs to be a Firefox of libraries that go into apps that hold authentic data. So anyway, I'm kind of rambling at this point. Doc Searls (1h 2m 45s): No, this is good. I think we should put it out to the SSI world, and not just our listenership, but actually reach out to the SSI world, because I think you've got one of the best... It's not a critique. It's more like a yes.
And to the SSI world, it's like: yeah, great idea, great structure, a lot of good movement, a lot of good things happening there. You just need this one more thing. Dave Huseby (1h 3m 15s): Right. Well, and you knew you needed to solve these certain problems, right? The crypto had to be solved. We had to figure out how to get the blockchain out of the middle, which is also part of this architecture. So, you know, I don't want to pick on anyone, but, like, Hyperledger Indy-based SSI systems are very slow, because with Indy you have to talk to it every time you issue a credential, and our new architecture does not do that. It pushes the blockchain to the edges, because we recognize that all the blockchain needs to do is be a proof of existence for your digital identity. So really all you have to do is anchor your DID string. I'm getting technical here, but yeah, that points to your key event Dave Huseby (1h 3m 58s): log. This doesn't exist... Doc Searls (1h 4m 1s): Decentralized identifier. Dave Huseby (1h 4m 3s): This doesn't exist in the world yet, but the closest thing that does exist is Microsoft's ION platform, which is a Sidetree system based off of the Bitcoin blockchain. So it supports high-speed anchoring of DID documents. Doc Searls (1h 4m 17s): Sidetree is like a Merkle tree, right? Is that what that is? Dave Huseby (1h 4m 20s): Yeah. Doc Searls (1h 4m 22s): Okay, so very technical people may know what that is. Okay. Dave Huseby (1h 4m 25s): Yeah. The Microsoft ION system is not a credential issuance platform. It is for storing your digital identity documents, which have, like, your public key in them, and any endpoints and things. It successfully pushes the blockchain out of the middle, because it only uses it for anchoring, for proof of existence, providing an immutable record that this data does exist, that it is in this current state, and that it's independently verifiable. So we had to do that.
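The "anchor the DID, keep everything else at the edges" idea can be sketched as a toy key event log: each event names the current public key and pre-commits (by hash) to the next one, chaining back to the previous event, so only the inception event's digest needs anchoring on a ledger for proof of existence. The field names here are illustrative, not the actual KERI or Sidetree formats:

```python
import hashlib
import json

def event_digest(event: dict) -> str:
    return hashlib.sha256(json.dumps(event, sort_keys=True).encode()).hexdigest()

def make_event(prev_digest: str, current_key: str, next_key_commitment: str) -> dict:
    # Each event names the key in use now and pre-commits, by hash,
    # to the key that will be used after the next rotation.
    return {"prev": prev_digest, "key": current_key, "next": next_key_commitment}

def verify_log(log, genesis_prev: str) -> bool:
    # Walk the chain: every event must link to the one before it, and
    # every rotated-in key must match the prior event's pre-commitment.
    prev = genesis_prev
    for i, event in enumerate(log):
        if event["prev"] != prev:
            return False
        if i > 0 and hashlib.sha256(event["key"].encode()).hexdigest() != log[i - 1]["next"]:
            return False
        prev = event_digest(event)
    return True
```

Only `event_digest(log[0])` would go on-chain; the rest of the log travels with the identity owner and is verifiable by anyone, which is why issuance no longer needs a round-trip to the ledger.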
And then, you know, so: blockchain out of the middle, fix the crypto, make it efficient Dave Huseby (1h 5m 5s): and inclusive, which is the most important part. Like, we were trying to figure out how to do paper credentials, and that led us down this road. And revocation, that led us down this road. And then also the four-party model, because we have to move away from permissioned issuers to unpermissioned issuers if we want to scale to global scale, because there are going to be billions of issuers issuing billions of credentials. So how do we do that? We designed from scratch how to do that, heavily inspired by the SSI community, which I think is amazing. They had to go before us for us to get here. Like, I am truly standing on the shoulders of giants, giants like Timothy Ruff, giants like Daniel Hardman, you know, Jason Law. There are some serious giants in this space, and I am just this tiny little guy standing on their shoulders saying, yes, Dave Huseby (1h 5m 57s): and if you do this, then we can do the whole globe. Doc Searls: Then you're the guy on the show, so that makes you huge. Dave Huseby: Okay, I just need to give credit where credit's due. Doc Searls: No, you're doing a good job. I know the names; good people. We'll have them on the show at some point too. Dave Huseby: Yeah, that'd be great, actually, to have everybody in to pick it apart after we show everybody how to do it. Doc Searls: There's a lot of anticipation; it's coming up in April, for people that don't know. That's the Internet Identity Workshop. Look it up and you'll see what it is. Okay, so Katherine, take us out. Katherine Druckman (1h 6m 38s): Okay, well, please reach out with questions, comments, any input that strikes you. You can reach us at info@reality2cast.com. You can reach us on our website, reality2cast.com, which has the number 2 in it.
We're on various social media platforms; we're pretty easy to reach. So please do that, and we look forward to talking to Dave next time, when we will be armed with many, many questions from the audience. Doc Searls (1h 7m 12s): Okay, great. Thank you so much for being here.