[00:00:00] Katherine Druckman: Hey everyone. Welcome back to Reality 2.0. I am Katherine Druckman. Doc Searls is joining me as always. I know I say that every time, and it's true, except every once in a while you actually travel and have a life and stuff, and we have to pinch hit with Shawn or something, so it is relevant. Um, yeah. So here we are, and we are back with Dave Huseby, who has been on the podcast many times, and we love having Dave because he always has some really... [00:00:26] Doc Searls: He can't help being interesting. [00:00:28] Katherine Druckman: It's true. Yeah. You just stumble into it. [00:00:31] Doc Searls: He's a hazard elsewhere, but not on this show. [00:00:32] Katherine Druckman: Yeah, yeah, yeah. So we're gonna talk about some things involving identity and commerce and things, anyway, but we'll get to that in a minute. Before we do that, I wanted to make sure to remind everyone to check out reality2cast.com. That is our website, where you can find show notes and... [00:00:53] Doc Searls: And... [00:00:54] Katherine Druckman: ...you can sign up for the newsletter. Yes, you... [00:00:56] Doc Searls: wanting to step up, but um... [00:00:59] Katherine Druckman: Well, the good news is that we don't spam anybody, and when it comes out, we promise to make it interesting. So that works. And I actually wanted to give yet another plug for the podcast that I have launched at Intel called Open at Intel. Please follow that as well, especially if you're interested in security. We're kind of ramping up on an interesting security conversation over there. So yeah, I hope you'll join me. But in the meantime, Dave, tell us what you're working on. I think it... [00:01:25] Dave Huseby: Well, first of all, I wanna say you're just too kind about saying I'm interesting. That's not what I get at cocktail parties, but... [00:01:32] Katherine Druckman: You're going to the wrong parties. [00:01:33] Doc Searls: You're the... [00:01:34] Katherine Druckman: I'll invite you to some, except I don't have any. [00:01:38] Dave Huseby: I, well, I'm the... [00:01:39] Doc Searls: Cocktails without the party, it'll be fine. [00:01:42] Dave Huseby: Isn't drinking alone what you're not supposed to do? [00:01:45] Doc Searls: Uh, yeah. [00:01:48] Dave Huseby: Well, um, so thanks for having me on again. It's been my habit to get to something that I think is really interesting, some turning point in what we're doing at Cryptid Technologies, or in our research and development, and then I call you guys and say, hey, can I come on and talk about it? And you're always so kind and welcoming to let me in. And I think today is not gonna be any different. If you go back and watch the previous episodes, you can kind of see where we're going. I've kind of kept you abreast of where our research and development was going, our product development, everything. And we're right at the point of launching, and we're launching two major products, for us anyway. The first one is a very low level API security system that uses succinct zero knowledge proofs.
It's entirely open source, but we automate sort of the management of it from the server standpoint, because, you know, managing who can access your API and stuff like that is not easy, and there are lots of tools for that, but we built one that uses this new protocol called Oberon that was developed by, well, a friend of the show. I think you guys probably know Mike Lodder from IIW. He used to work at Evernym, and his, um... [00:03:14] Katherine Druckman: Yes, Evernym. I... [00:03:15] Dave Huseby: Yeah. And then he was the security maven over at Sovrin Foundation when I was the security maven at Hyperledger. So he and I were involved with Hyperledger Ursa, and we've been friends for years now, and he's just wrapping up his PhD in cryptography, and this was part of his thesis. And the cool thing with this product is that something like 97% of all APIs on the internet are still secured with usernames and passwords, or some big random number that you put in the URL, and all this stuff is transmitted with every API access. The result is we're seeing all these huge security breaches on APIs. And to be honest, the way the internet works now is APIs talking to APIs, almost entirely now. So our first product is the zero knowledge proof API security system, which has a lot of really interesting security properties that don't exist in the market right now. Like, the endpoints for an API that have to verify the zero knowledge proof that you are allowed to access this, they only possess things like public keys. So the server that is most exposed to the internet doesn't have anything on it that, if it's ever taken over, could be used to mis-issue access tokens. You can't change the access. So there's defense in depth. It uses a blinded issuance protocol, so that, like, I generate a big random number on my side, and then I go through a protocol with the server that allows the server to calculate a digital signature over it without ever seeing the big random number that I generated. So now I possess a valid token that's signed by the server. I can prove that it's properly signed by the server, and I never have to transmit my credentials, other than this zero knowledge proof that it's been properly signed by the server. And the cool thing here is, like, the server can't ever impersonate me as a client. I never transmit my credentials, all kinds of stuff. So we're eliminating whole classes of attacks against APIs with this one. And it's already being used in production. I think Trinsic's already got it available for their self-sovereign identity wallet product. They're using Oberon for access. And we're seeing traction there because it's easy to deploy. It's down to just a couple lines of code, and it's really easy to do secure onboarding and enrollment, and it supports revocation, which was a lot of our research for the last few years. So anyway, there's that, but that's not what I wanna talk about. [00:06:01] Katherine Druckman: Oh, okay. [00:06:02] Dave Huseby: The other thing that I want to talk about, and... [00:06:04] Katherine Druckman: A twist. [00:06:05] Dave Huseby: Yes, which is built on top of Oberon, actually. And I teased Doc with this, I think, saying that this is leading ever closer to intentcasting and user-centric transactions online.
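To make the properties Dave describes here concrete, below is a minimal, self-contained Python sketch of a Schnorr-style zero knowledge proof of possession. It is not the Oberon protocol (Oberon uses pairing-based cryptography and a blinded issuance step), and the group parameters are toy-sized, but it illustrates the two properties he calls out: the verifying endpoint stores only public material, so compromising it can't mint credentials, and the client's secret is never transmitted, only a per-request proof that the client knows the secret behind its public key.

```python
# Minimal sketch (NOT the Oberon protocol) of a Schnorr-style
# zero-knowledge proof of possession, pure Python stdlib, toy parameters.
import hashlib
import secrets

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % small == 0:
            return n == small
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = secrets.randbelow(n - 3) + 2
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def toy_group(bits: int = 128):
    """Find a safe prime p = 2q + 1 and a generator of the order-q subgroup."""
    while True:
        q = secrets.randbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(q) and is_probable_prime(2 * q + 1):
            # 4 = 2^2 is a quadratic residue mod p, so it generates the q-order subgroup.
            return 2 * q + 1, q, 4

def keygen(p, q, g):
    """Client-side: the secret x stays on the client; only X is ever shared."""
    x = secrets.randbelow(q - 1) + 1
    return x, pow(g, x, p)

def challenge(p, g, X, R, message: bytes, q):
    return int.from_bytes(hashlib.sha256(
        b"|".join(str(v).encode() for v in (p, g, X, R)) + message
    ).digest(), "big") % q

def prove(p, q, g, x, X, message: bytes):
    """Non-interactive Schnorr proof (Fiat-Shamir), bound to a message."""
    k = secrets.randbelow(q - 1) + 1
    R = pow(g, k, p)
    c = challenge(p, g, X, R, message, q)
    return R, (k + c * x) % q

def verify(p, q, g, X, message: bytes, proof) -> bool:
    """Server-side: needs only the public key X, never the client's secret."""
    R, s = proof
    c = challenge(p, g, X, R, message, q)
    return pow(g, s, p) == (R * pow(X, c, p)) % p

if __name__ == "__main__":
    p, q, g = toy_group()
    x, X = keygen(p, q, g)                          # enrollment: server learns only X
    proof = prove(p, q, g, x, X, b"GET /v1/data")   # per-request proof, bound to the call
    print("valid:", verify(p, q, g, X, b"GET /v1/data", proof))
```

In a real deployment of this style of system the proof would additionally be bound to the blinded token the server issued, which is what lets the client show "this was properly signed" without ever revealing the token itself; that part is omitted in the sketch above.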
So here, let me give you the high level of the architecture. We have an app that's doing identity proofing, very deep identity proofing, so video, you know. We're trying to make it the gold standard for identity proofing on the internet. And identity proofing is when you interact with a system and they say, take a picture of your ID, front and back, do a selfie making a hand gesture, maybe a live call. They're just trying to make sure you're a human being and that your official name and address and all that stuff matches up. Okay? Now, the IRS tried to do this last year, right? ID.me. They said, hey, you have to do all this just to pay your taxes, and there was quite a furor about it, right? Everybody got pretty upset, and it was shut down, I think a couple months, well, maybe even just a couple weeks after it launched. And rightly so, I think, because it was a government database of your biometric data. I think everybody in security and privacy knows that we don't want big databases of people's biometric data, because you can't rotate it like you can rotate cryptography keys. However, and this is the dilemma for systems designers, any kind of system whose authentication requires humanness, some kind of identity as part of the authentication, for a human, you have to have some biometric binding to that authentication, because if there's no fingerprint requirement or facial recognition scan requirement, then there's nothing that keeps me from handing my phone to you, Katherine, and you being me. Right? So how do you resolve this conflict, where we have to be able to have biometric authentication, but you don't wanna build central databases of people's biometric data? Well, one of the things we invented was client-side-only biometric data that's built into the protocols, using the cryptography. So what we've built is an app that does deep identity proofing, and that data is stored on your device. And actually we are looking into setting up, and this isn't one of our ideas, but setting up a data fiduciary. This is an idea that came out of IIW a few years ago, and it's the idea that a business could have your personal information and they are legally and morally bound to represent you, the individual. It's like your digital twin, but it's in a lockbox, right? And that company, or wherever that data is stored, only acts on your consent, like, I want this data to be used here, and only on my consent. So what we've built is this app that does the identity proofing, then the ability to store that in a data fiduciary relationship. And we have relationships with companies that are in the business of knowing what is and is not true about you. These are familiar to most of us, right? Companies like credit rating agencies, companies like know-your-customer vendors, things like that. These are companies that are in the business of knowing what is and isn't true about you. The thing that a lot of them don't do very well is identity proofing, when they match you up to the data they have.
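As a rough illustration of the consent-gated data fiduciary Dave is describing, here is a small Python sketch. All of the names (DataFiduciary, ConsentRecord, request_proof) are hypothetical, invented for illustration, and the plain derived claims it returns are a stand-in for the zero knowledge proofs the real system would emit. The point is just that the verified data never leaves the fiduciary, and nothing is released without an explicit, revocable consent record.

```python
# A minimal sketch of a consent-gated data fiduciary. Hypothetical names;
# the derived claims below stand in for real zero knowledge proofs.
from dataclasses import dataclass, field
from datetime import date

@dataclass(frozen=True)
class ConsentRecord:
    user_id: str
    relying_party: str   # who may receive a proof
    claim: str           # which derived claim may be proven

@dataclass
class DataFiduciary:
    # Verified, identity-proofed attributes; they never leave this store.
    records: dict = field(default_factory=dict)   # user_id -> attributes
    consents: set = field(default_factory=set)    # set of ConsentRecord

    def grant_consent(self, user_id, relying_party, claim):
        """Only the user adds consent (e.g., after a biometric check on device)."""
        self.consents.add(ConsentRecord(user_id, relying_party, claim))

    def revoke_consent(self, user_id, relying_party, claim):
        """'Swipe to delete': the relying party can no longer obtain this proof."""
        self.consents.discard(ConsentRecord(user_id, relying_party, claim))

    def request_proof(self, user_id, relying_party, claim):
        """Release a derived claim only if an explicit consent record exists."""
        if ConsentRecord(user_id, relying_party, claim) not in self.consents:
            raise PermissionError("no consent on file for this relying party/claim")
        attrs = self.records[user_id]
        if claim == "is_over_18":
            age = (date.today() - attrs["birthdate"]).days // 365  # approximate age
            return {"claim": claim, "value": age >= 18}            # not the birthdate itself
        if claim == "is_unique_human":
            return {"claim": claim, "value": attrs["identity_proofed"]}
        raise ValueError(f"unknown claim: {claim}")

if __name__ == "__main__":
    fid = DataFiduciary(records={"alice": {
        "birthdate": date(1990, 5, 1), "identity_proofed": True}})
    fid.grant_consent("alice", "twitter.example", "is_unique_human")
    print(fid.request_proof("alice", "twitter.example", "is_unique_human"))
    fid.revoke_consent("alice", "twitter.example", "is_unique_human")
    # After revocation, the same request raises PermissionError.
```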
I think credit rating agencies get a lot of their data from spending records and stuff from the credit card companies, and from cell phone companies and payments processors. And so if you went to one of them, they would have to identity proof you before they gave you access to any data they had on you, because they'd have to match you up with the record that they have. So there's an opportunity here. You could set up a company, and this is what we're doing, that identity proofs individuals one last time, essentially. The data is stored in a data fiduciary that represents the individual. Then the individual, through an app-like interface, says, I consent to using my verified information to establish a relationship with one of these backend data sources that knows truthy data about me. Why would they want to do that? Well, because the other part of our platform, the other stuff that we've invented, is doing transactions entirely in zero knowledge. The problem is that true zero knowledge proofs are just self-attestations, meaning nobody trusts them unless you can prove that the data they're generated from is authentic data, and exists, and hasn't been revoked, and it comes from an organization that people trust, or that the relying parties trust. Think about it like this. Banks and businesses trust some organizations to know what's true about your creditworthiness, right? Or they trust companies like CLEAR to know who you are, or that you can travel today or whatever. So there are lots of companies that know truthy things about us, and what we've built is the ability to leave that data in place. We have a data fiduciary that represents you and operates entirely under your control and your consent, and allows you to connect your verified, identity-proofed data to these companies, to match it up to the records about you, and then transact using zero knowledge proofs based off of the data inside those organizations. And what it looks like from the outside, and this is where I get to that point... [00:12:12] Katherine Druckman: This was gonna be my question. What does this look like? [00:12:14] Dave Huseby: Yeah, so where I'm getting to is, you know, I started this off by saying privacy and identity are only byproducts of a system. They are not the product. And so we designed a system which functions like you have a digital identity, almost exactly like you have a wallet on your phone and you've got a digital credential on your phone and all that stuff. None of that works, because of a bunch of problems we were talking about just earlier, and I could go into that later. But it works like that, except your data never leaves the organizations that have the data about you. The only things that leave are zero knowledge proofs, and that only happens under your consent, under your control. And so say you have a relying party, like a social media platform. Let's talk about Twitter. You know, Elon Musk, last May, tweeted out that authentication is important, but so is anonymity, a balance must be struck. I think that's almost exactly the language he used. And that was in a broader conversation that they need to know that you're human, but they don't need to know who you are to participate on Twitter. What that means to me is that they have a policy.
Twitter has a policy: to access the Twitter platform, you have to prove that you are a human, you have to prove that you are a unique human, and that you have control over an account on Twitter. But you don't need to share any of the information about who you are. Now, there's one little caveat here. We do live in a society and we do have laws, and so what I just described would give you anonymous access to Twitter, because I could be identity proofed with the data fiduciary. I could then give consent to that data fiduciary to connect to a KYC vendor, and then I could give consent to use zero knowledge proofs from the KYC vendor to prove to Twitter that I am a human being and that therefore I should be allowed to have an account. And then once I establish an account, I can prove that I control this account. So at that point, Twitter doesn't actually know anything about me other than the fact that I'm human. So I could be, you know, dadbot1624, right? I could make up whatever name I wanted. I don't actually have to tell them who I really am in real life. And this is one of those principles of user sovereignty that I wrote about three years ago, which is absolute privacy by default. I get to choose when I want to reduce my privacy, but I start completely private, anonymous. The problem is we live in society. So if I'm on Twitter anonymously and I go and I make, you know, threats, or I do something that violates the law, right, law enforcement needs to happen. So what we also provide is verifiably encrypted de-anonymization. Now, Twitter wouldn't be able to decrypt it, but they could verify who can decrypt it, and in this case it would be the KYC vendor, or maybe even the data fiduciary, but it's probably the KYC vendor. So then if there is a judge that issues a warrant, hey, that person made some threat, we need to figure out who it is, Twitter hands over that little piece of data, it gets walked back to the KYC vendor, gets decrypted, and now we know who it is. This is perfect Fourth Amendment privacy, and I think I've mentioned this in the past in one of our podcasts, because we do have this compromise in our society. The compromise is we're private in our persons and papers, in our dealings, unless there's enough evidence of reasonable suspicion of a crime, and then a judge can issue a warrant to violate our privacy to investigate the crime, right? We have built, in code, using cryptography, the ability to do these transactions entirely in zero knowledge and to support that compromise. We're anonymous by default, but, you know, if I break the law, I can be de-anonymized, at least in that scenario. Now, this extends to every transaction on the internet. So let's say I want to go and buy cryptocurrency. The policy to transact in cryptocurrency, well, it will be very soon, because I think everybody's anticipating regulation actions from the financial regulators, the policy will be: you need to prove that you are human, that somebody knows who you are, not necessarily the cryptocurrency exchange, and that you pass anti-money laundering checks, and you're not on the sanctioned names list, and all that stuff.
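Circling back to the verifiably encrypted de-anonymization Dave just described, here is a toy sketch of the escrow part of that idea: the platform stores a ciphertext of the user's real identity that only the KYC vendor's key can open, say under a warrant. This is plain ElGamal with toy parameters and hypothetical names; real verifiable encryption additionally proves, without decrypting, that the ciphertext really contains the identity the vendor verified, and that proof is omitted here.

```python
# Toy identity-escrow sketch: the platform holds a ciphertext it cannot open;
# only the KYC vendor's secret key (e.g., under a warrant) can decrypt it.
import secrets

P = 2**521 - 1   # a known Mersenne prime; toy choice, not a vetted group
G = 3

def vendor_keygen():
    """KYC vendor's escrow keypair; only the public half is ever shared."""
    x = secrets.randbelow(P - 2) + 1
    return x, pow(G, x, P)

def escrow_encrypt(vendor_pub: int, identity: str):
    """Run on the user's device at enrollment; the platform stores the result."""
    m = int.from_bytes(identity.encode(), "big")
    assert m < P, "identity string too long for this toy group"
    r = secrets.randbelow(P - 2) + 1
    return pow(G, r, P), (m * pow(vendor_pub, r, P)) % P   # (c1, c2)

def warrant_decrypt(vendor_secret: int, ciphertext):
    """Only the vendor's secret key opens the escrow."""
    c1, c2 = ciphertext
    m = (c2 * pow(c1, P - 1 - vendor_secret, P)) % P
    return m.to_bytes((m.bit_length() + 7) // 8, "big").decode()

if __name__ == "__main__":
    x, h = vendor_keygen()
    blob = escrow_encrypt(h, "Jane Doe <jane@example.com>")  # platform keeps this blob
    # ...later, if a judge issues a warrant, the vendor decrypts it:
    print(warrant_decrypt(x, blob))
```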
Now, with what we've built, you could consent to connect to a company that does financial regulation compliance checks, you know, AML checks, CFT checks, sanctioned-name checks, and then you could provide zero knowledge proofs to the cryptocurrency exchange that you comply with their policy. So this is actually a new form of authentication, where basically relying parties say, here are our policies: you have to be human, you have to be known, you have to pass all these things. Okay? Those policies can be met with zero knowledge proofs and authenticity proofs and a cryptographic paper trail of where the data is, of where those proofs came from. And that can all be done at authentication time, either as API security authentication or through a web login or anything like that. And the last thing I wanna point out is that if I use this system to, say, join a credit union or something, and then I use this system to get pre-approved for a loan, that credit union now becomes a data source itself, because they can provide zero knowledge proofs that I have been pre-approved for a loan. And so this facilitates this whole sort of virtuous cycle. Most companies know something about their customers. They may not know everything, but they have a small piece of it, and that can be monetized, by generating zero knowledge proofs, not by selling the data, but by providing proofs that do this sort of enhanced due diligence at authentication time. And so the last thing I wanna point out is this significantly reduces security, and we think it meaningfully reduces fraud, because even the most state-of-the-art authentication right now relies entirely on the client, the customer, the individual, whoever is connecting to the server, and what they can observe about them. Like, AI and everything just looks at them. But what we're doing now is getting that, plus zero knowledge proofs on data from other organizations that are trusted by the relying party. So this is, what did someone tell me the other day, this is a better analog-to-digital conversion of the way real-life authentication works. [00:19:49] Katherine Druckman: Did you say this significantly reduces security? [00:19:53] Doc Searls: No. [00:19:53] Dave Huseby: It significantly... [00:19:54] Katherine Druckman: ...you mean. [00:19:56] Dave Huseby: Increases security. It increases security, significantly reduces fraud. Yeah. No, no, no. [00:20:02] Katherine Druckman: Thank you. [00:20:03] Doc Searls: So... [00:20:04] Dave Huseby: Significantly increases security, because it's a multi-point kind of authentication thing, right? [00:20:09] Katherine Druckman: Just wanted to make sure. [00:20:10] Dave Huseby: It's like when you join a bank. If I went to a bank to open an account, they would say, oh, bring your two forms of ID, a utility bill, all this other stuff, right? That's all corroborating. We call this the corroboration model of security, and we've unlocked this, and it operates in zero knowledge. And so the data stays where it is, which significantly reduces the risk of your data being stolen and being further marketed and all that stuff. The other thing is, we flipped the model. Like, SSI wants to charge for issuance; we charge for authentication, because that's who benefits from it. They get enhanced due diligence, to get this deeper authentication.
And then that money flows back through the system, to the data sources, to us, you know, to enrolling institutions. So there's this virtuous cycle to it all. Anyway, I have talked way too long. Let's... [00:21:04] Doc Searls: No, well, you're good at it. So I'm curious. What you're describing is less a new ecosystem than a new way to make existing parts of existing ecosystems kind of hang together. [00:21:21] Dave Huseby: Yes. [00:21:22] Doc Searls: And, I mean, to contrast it with SSI, which we were discussing earlier before the call, there are lots and lots and lots of solutions and standards development work and so forth, and so far zero actual adoption that people would recognize in the field. What you're talking about is something where companies doing what they're already doing have a slightly different way of doing it and working together. So what I'm wondering about is, first, your company's Cryptid, right? Do you have a website for this? [00:21:54] Dave Huseby: It's Cryptid, C-R-Y-P-T-I-D, dot tech, T-E-C-H. [00:21:58] Doc Searls: Dot tech. Cryptid dot tech. Okay. So I have two questions. One is, who are your customers and how do they work with you, first? And the second thing, just because I'm curious about it, I know there's a company that's interested in this: is there a business in authentication, or is that your business, or does that become a separate business? Say I wanna go into the authentication business. What is that? [00:22:27] Dave Huseby: Well, there are already many authentication companies out there, companies like Okta and others. They just provide a simple API for doing API authentication, authentication for your service or whatever. Okay, so that's an existing business. The most state of the art there, though, relies entirely on the client themselves and what they can observe about them. And so there's not been any real meaningful reduction in fraud, I don't think. In fact, I think fraud's getting worse despite the deployment of AI, because now we're seeing clients who are using AI to beat the AI on the server side. It's whack-a-mole. It's an arms race. We fundamentally changed the battlefield here, because now it's not, can I use AI, or can I just defraud you, can I convince you that I'm somebody else? Now I would also have to corrupt organizations that are in the business of knowing what is and isn't true about individuals. So I'd have to corrupt credit rating agencies. I'd have to corrupt KYC vendors. I'd have to corrupt the financial transaction compliance vendors, vendors that the government themselves use. So I would have to go and corrupt a whole bunch of data in lots of different places to bypass this authentication as somebody else. It significantly changes the economics of committing fraud at the authentication level. And it basically makes it, well, I don't think I can legally say it's impossible. I'm sure there's a way to figure it out eventually, you know, who knows, people will figure stuff out. But I think this is our first real rethink of authentication in 30 years, and I think this is the first time we'll actually see a significant reduction in fraud in authentication. And so what we are offering is automated policy and regulation compliance at authentication time.
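Here is a rough sketch of what "policy and regulation compliance at authentication time" could look like mechanically: the relying party's policy names the claims it requires and the issuers it trusts for each, and authentication succeeds only if every claim is covered by a valid, unrevoked attestation. The names and the policy format are hypothetical, and the HMAC tags below are a stand-in for the zero knowledge and authenticity proofs (and the scalable revocation scheme) Dave describes.

```python
# Sketch of policy-driven, corroboration-style authentication checks.
# Hypothetical format; HMAC tags stand in for real proofs and signatures.
import hashlib
import hmac
import json
import time

ISSUER_KEYS = {                      # in reality: issuers' public keys
    "kyc.example": b"kyc-demo-key",
    "credit.example": b"credit-demo-key",
}
REVOKED = set()                      # stand-in for a revocation registry

def issue(issuer: str, subject: str, claim: str, value) -> dict:
    body = {"issuer": issuer, "subject": subject, "claim": claim,
            "value": value, "issued_at": int(time.time())}
    tag = hmac.new(ISSUER_KEYS[issuer],
                   json.dumps(body, sort_keys=True).encode(),
                   hashlib.sha256).hexdigest()
    return {**body, "tag": tag}

def proof_ok(proof: dict) -> bool:
    body = {k: v for k, v in proof.items() if k != "tag"}
    expected = hmac.new(ISSUER_KEYS[proof["issuer"]],
                        json.dumps(body, sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, proof["tag"]) and proof["tag"] not in REVOKED

# A relying party's policy: each required claim names the issuers it trusts.
POLICY = {
    "is_unique_human": {"kyc.example"},
    "passes_aml_check": {"kyc.example", "credit.example"},
}

def meets_policy(policy: dict, proofs: list) -> bool:
    """Authentication-time check: every required claim must be covered by a
    valid, unrevoked proof from an issuer this relying party trusts."""
    for claim, trusted in policy.items():
        if not any(p["claim"] == claim and p["value"] is True
                   and p["issuer"] in trusted and proof_ok(p)
                   for p in proofs):
            return False
    return True

if __name__ == "__main__":
    proofs = [issue("kyc.example", "user-77", "is_unique_human", True),
              issue("credit.example", "user-77", "passes_aml_check", True)]
    print(meets_policy(POLICY, proofs))    # True
    REVOKED.add(proofs[0]["tag"])          # revoke the KYC attestation...
    print(meets_policy(POLICY, proofs))    # ...and authentication now fails
```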
So we have tools for that. Let's say a credit union: everything they do at a credit union is policy driven. Everything at any business is policy driven. They may not think of it that way, but banks and credit unions do, they actually have it written down, these are the rules, right? And we can encode those as essentially a cryptography puzzle that can be solved using zero knowledge proofs from data sources that they already trust. That's the thing. This is transitive trust. This is transmitting trust on the internet. A bank will trust a credit rating... [00:25:02] Doc Searls: The authentic data you talked about in the past, on... [00:25:04] Dave Huseby: Yes, this is a realization of the application of the authentic data economy, because zero knowledge proofs are not useful unless you can also provide proof that the underlying data is authentic, it exists in this place and in this state, and that it hasn't been revoked. And it's the revocation piece that's critical, that nobody else has except us at this point. Scalable revocation. So, yeah. [00:25:31] Katherine Druckman: I have a question. It kind of goes back to what Doc started to ask you about, the ecosystem. Okay, so this is kind of the way I was thinking about it too. Do you see yourself creating a new way of doing things that replaces outdated ways, or do you see yourself fitting within an existing system and satisfying the needs of a population that cares more about privacy and... [00:25:58] Dave Huseby: This is strapping a jet engine to a Pinto. This is really what it is. [00:26:04] Doc Searls: I owned a Pinto, and, you know, you can Pinto without a jet... [00:26:09] Katherine Druckman: But is that dangerous? Like something that's going to, you know, be too early and pass us by and go over most people's heads... [00:26:16] Dave Huseby: So we... [00:26:17] Katherine Druckman: ...adoption and, you know... [00:26:19] Dave Huseby: So we've gone after some really cool low-hanging fruit. Our product offering is not a rip-and-replace. It just sits right alongside whatever's already been deployed. So it can actually function like OAuth and, you know, OIDC, like OpenID Connect. Because what happens is someone will try to log in and it'll say, oh, we do this new policy thing, go over to this, you know, and it goes to Cryptid. Then Cryptid says, okay, well, we need to enroll you if you haven't been identity proofed yet. But don't worry, this is reusable, and you're setting up a relationship with a data fiduciary who doesn't make money selling your data at all. So part of our innovations here are business innovations, not necessarily technical. Like, we figured out how to make a data fiduciary profitable. That's important. The data fiduciary idea has been around a long time. Why isn't there a company that puts my data in a safe deposit box, and then I get to tell them whenever, you know, I need to establish a new relationship kind of thing? [00:27:24] Katherine Druckman: ...of economic incentive. [00:27:26] Dave Huseby: Right. The only way previously to do this would be to charge subscriptions, and that just doesn't work. But if that data fiduciary is also managing your consent with these other organizations that have, say, your credit history, or have run you through financial transaction checks, you know, like do you comply with the financial transaction laws?
And then all of this is generating zero knowledge proofs to meet the policies and to provide enhanced due diligence and reduction in fraud for cryptocurrency exchanges, for financial transaction platforms, for banks, for credit unions, for auto sales, for Amazon, for Twitter, for any of those people that need to know you're human, that somebody knows who you are, that you qualify, that you pass all their policies, that you meet their policies for the transaction. And this whole enhanced due diligence reduces fraud, right? So they pay for this, and the innovation here on the business side is that that money flows back not only to, say, the credit rating agencies and the financial regulation check vendors, right, the data sources, but it also flows back to the data fiduciary who manages your consent and your relationships with all these. And at any time, as an individual, you could say, look, I wanna break my relationship with this credit rating agency, because I no longer need those proofs for anything I'm using, or for these relying parties. And so I can just swipe to delete and remove my consent. Everything we're doing is building cryptographic paper trails: collecting my consent, connecting my identity-proofed data with these data sources, allowing zero knowledge proofs from these data sources to be used to meet policies for these certain relying parties. It's a new way of thinking about an old solution, right? This is really what it is. So if you meet a policy, we can issue OpenID Connect, you know, <INAUDIBLE> tokens out the back, and we can do SAML, and we can do all these other things. We can even do Kerberos tokens. It doesn't matter. We sit right beside them and do policy and regulation compliance automatically, and enhanced due diligence, in a user-centric, consent-driven, zero knowledge way. So now data stays put, but you didn't have to change anything. If you're, say, a bank, or you're Twitter, whatever, you don't have to change anything. You sign up with us, and then we will help you, with our tools, encode your policy. And then you just present your policies and you basically bounce them to us, just like OAuth does and what FIDO does. You just bounce them to us for the extended authentication stuff. Now, we are everything at this point, but this is a four-party model. There are data sources, there are relying parties, there's a data fiduciary, and then there's an aggregator. And we are both an aggregator and a data fiduciary at this point, but I see those two functions separating, and we built this to white label. So look, you know, we're trying to find data partners, partners to be data fiduciaries, partners to be aggregators. And we're looking for clients, and we have some data partners already. So let me tell you one really cool use case, and I know we were talking about politics earlier, but this is a politically neutral thing that I think everybody would find interesting. We now have the possibility of building essentially a private social media platform, like a private Twitter, or we could take Mastodon or something, it doesn't matter. And we could go, let's say, here in the state of Utah.
If we get buy-in from the... well, what we do is we say, look, you get identity proofed, and we will verify that you are a citizen of the state of Utah, and we will verify your address, and we will know which voting precinct you're in. Now, that data will be used to create accounts, in zero knowledge, on a social media platform that doesn't know who you are, but they know for certain, or at least as certain as we can be, that you are a citizen of Utah, that you're a voter in District Five. Okay? Now, your account can be prepopulated with connections to all of your elected officials at all levels, from your, you know, HOA all the way up to the governor, all the way up to the president, right? And the thing here is, it's an extension of the ballot box, because your participation on there is not tied to your real-world identity, but you've been verified as an actual registered voter and a constituent. So now when you message your politicians, your elected officials, and they get a thousand messages from their actual constituents, that's a real signal to them. And we've already talked to a number of politicians and parties, and they're absolutely excited about this, because their reality is they get spammed a million emails a day, and all kinds of bot traffic on their Twitter and their Facebook and all this stuff, and they get a lot of noise and very little signal. And it's coordinated and it's automated; they get all these pressure campaigns on them. And one of the things we could do is build a system where we could facilitate legitimate communication between verified voters and their elected officials. And I think it could radically change the nature of political communication in the United States. [00:33:07] Katherine Druckman: That is very... [00:33:08] Doc Searls: Whoa. [00:33:09] Katherine Druckman: ...possibly the subject of a whole other episode, maybe, but... [00:33:13] Dave Huseby: Yeah, imagine an anonymous Twitter where you join by proving you're a registered voter in a certain state, in a certain precinct, and you're immediately, automatically following your elected officials, and you can DM them. [00:33:28] Katherine Druckman: I'm not sure if there's a market for that, but... [00:33:32] Doc Searls: Well... [00:33:32] Katherine Druckman: ...given elected officials, I don't know. [00:33:35] Dave Huseby: Let me tell you, there absolutely is, and I know there is, because in the last election, I've seen numbers. I've looked at what the data market looks like for political communications. Did you know that in the last election, a verified phone number, a phone number that they can send a text message to that they know belongs to one of their constituents, cost them $100? [00:33:58] Katherine Druckman: Good grief. [00:33:59] Doc Searls: There's a guy, Britt Blaser, who's not been on the show but would be fun. Years ago he created a system where, I mean, it's very kind of Web 1.0, but one of the ideas in it is, if you want to participate in politics, all you'd have to do is pay a dollar. And the only reason is the credit card: as you've spent with a credit card, you're already verified. That's your ticket in, as it were. And that might be the way to go. Something... [00:34:35] Dave Huseby: That actually is almost exactly what we've done.
The reason you would use your credit card is it verifies you're a human being and it can map you to an address, and you're like, okay, you are a constituent. We can do that without that data being transmitted, and we can do it in a way that the platform would never know that information. So it is an extension of the ballot box. You would be free to speak your mind. But at the same time, we're also doing that verifiably encrypted piece. So if you get on there and you start threatening the president, we have the means to know who you are, if a judge so orders it, right? But no one party has the ability to unmask you. The platform doesn't. It's a separation of concerns. The platform itself doesn't know who you are. The data fiduciary, which would be us, we know who you are, but we don't know what you said. We know that you've consented to participating there, but we don't know it was you that made the threat. And then, you know, it would have to be put together by law enforcement and have a judge sign off on it, and then we can all cooperate under the warrant to figure it out. But otherwise, you are anonymous, right? [00:35:46] Doc Searls: So the fraud or attack surfaces are scattered on purpose. They're separated out. [00:35:53] Dave Huseby: It's a separation of powers, essentially. Separation of concerns. Not all the data is in one place. I don't know what you say on that platform, as the data fiduciary. I just know you've consented to using your personal data to generate zero knowledge proofs to authenticate with that platform. That's all I know. [00:36:08] Doc Searls: So maybe you've already covered this and I just didn't catch it exactly, but without naming names, what kind of entities are your customers, or your customers' customers, if you're white labeling this? [00:36:21] Dave Huseby: So we offer both white labels and just direct integration. The thing that we're working on right now, that I would actually like to float out on LinkedIn too, is we are engaged directly with FinCEN and the SEC at this point, because this represents, I think, the first cryptography-native, and I mean that as cryptography, not cryptocurrencies, not the actual bitcoins or whatever, but this is a cryptographic protocol that can be tied to cryptocurrency transactions. This is the first cryptography-native, zero knowledge way to prove that you comply with financial transaction regulations and link it directly to a cryptocurrency exchange transaction, without the exchange having to do their own KYC or collect all that information or anything. They're just in the business of doing cryptocurrency, not this whole other thing. You prove that you meet their policies every time you go to transact. And so we have this industry group that we've been slowly building with our attorneys, and we are in direct communication with the financial regulators. And so I see some of our first low-hanging fruit will be decentralized finance, you know, decentralized cryptocurrency exchanges. They know regulation is coming, and the conversation is already happening, and nobody knows exactly what it's going to look like. And what we are trying to do is stand up in the middle of the crowd and just be like, hey, time out. Okay. We can do this in a way that does not slow down innovation, but does allow for financial regulation compliance.
And so I'll be talking... I'm actually leading a panel at ETHDenver, beginning of March. It's in the privacy track; there's a whole series of panels under privacy. It's called Take a Stand, and my stand is that we do not have to give up the ideals of, say, cryptocurrency and Web3 and decentralization to comply with regulation. There is a third way, and that's what we have built. And this is all going to be, you know, we're in the process of opening the protocols and the file formats, but we want to get buy-in and show that it works in the wild. And then we'll write RFCs, and we're already open sourcing a bunch of code, and so this will be decentralized like email is decentralized, right? It's open protocols, open formats. And I think it's important to point out that even though email is still the most decentralized and open sort of communication platform, proprietary vendors for email services like Gmail and Yahoo Mail and Fastmail make mountains of money. So openness and decentralization are not mutually exclusive from having a proprietary implementation and making a ton of money. So that's the sweet spot we're going for: open formats, open protocols, strong open source cryptography, zero knowledge based transactions, and a separation of concerns. I would love to white label into an aggregator in a vertical, like, hey, I wanna do licensing of avatars from Hollywood movies for use in video games. That would be one aggregator, right? The data fiduciaries are going to be working with multiple aggregators, who work with all of their data sources, and everybody on the internet who is a relying party who does authentication could rely on this and get the reduction of fraud, and help keep data in place, and protect people's privacy, and make a huge dent in identity theft. That's our vision here. [00:40:18] Doc Searls: What does it look like for the individual? Do I have a wallet? [00:40:23] Dave Huseby: No, there are no wallets here. There's nothing. The trick is you have to have a relationship with a data fiduciary. And I've always thought that credit unions would be perfect for this because, you know, nobody goes into a credit union in person anymore, or hardly ever, but they used to sell safe deposit boxes and stuff. There could be a digital safe deposit box, and they could be a data fiduciary and charge for that, or at least monetize it, as I said earlier. What it looks like for an end user, at least right now, because we're proving out that the whole system works, is you download our app called One Last Time. And it's called that because it's a joke, right? You're supposed to get identity proofed just one last time. It's sort of an aspirational name. It's not a... [00:41:10] Doc Searls: Just the first of the last times. Okay. [00:41:12] Katherine Druckman: Wait, is this app actually a thing right now? [00:41:15] Dave Huseby: Yeah, it's currently on TestFlight right now. So this is part of the launch, right? So you would download this app and it would run you through identity proofing, as best as we can: taking pictures of your ID, doing selfies, that kind of stuff, and we bind it to your biometrics. So we leverage the facial recognition or fingerprints on your mobile device to bind all of this to you physically.
But again, there's no database of that; that's client-side only, on your device. And then when you go to use it to, say, join a bank, or any company that uses us for authentication, it would say, okay, this app, website, whatever, has this policy. You do not have data sources that meet this policy, but we have data sources available. Would you like to enroll? And so let's say I wanna join Twitter, and Twitter has a policy of just prove you're human, right? They would present the policy and it would say, look, you can't comply with this policy, but we do have a KYC vendor who will verify that you're human and provide a zero knowledge proof that you're human. So do you consent? You say yes, right? And behind the scenes, they're doing the check using the proofed data that the data fiduciary has, and then a zero knowledge proof gets sent over, and boom, you're in. Okay? So from the end user's side, you download the app, you get identity proofed once. Okay, now I want to go join Twitter. Twitter presents their policy. It says, you need to enroll. I'm like, yes, I give consent. It enrolls and I join. And to give consent, that's a biometric, and maybe some other factors, but at least a biometric. So in a lot of cases, enrollment is just do Face ID, and then behind the scenes, zero knowledge proofs travel across, and you're logged in, or you've joined, you've created an account, or you've logged in. And so not only do we streamline the onboarding process, we're also reducing authentication down to just collecting a biometric. And I think it's difficult for me to differentiate sometimes, because everybody's like, well, isn't that what ID.me did for the IRS? And it's like, well, yeah, except they built a database, and we have cryptographic protocols that don't ever require us collecting that data into a central database. It's all done on the hardware, on your phone. So anyway, from the end user, it looks like: get identity proofed, and then you're giving your biometric to give consent. And if you no longer wanna do that anymore, you just swipe to delete your consent, and that breaks the connection. It simplifies your interaction with all these platforms, but the platforms are getting zero knowledge proofs based off of verified information from organizations they trust. [00:44:11] Katherine Druckman: Do we have time for another question? [00:44:13] Dave Huseby: Yeah, sure. Yeah. [00:44:15] Katherine Druckman: How does the experience of that... or actually, no. Under the hood, how does it differ from existing methods of verification? You talked about processing, or storing, the biometric locally. Anyway, I'm wondering how this differs from things that people are already used to using, like the way that Apple does things, or the major vendors do things. The major difference to an end user. [00:44:44] Dave Huseby: So a lot of the stuff that you get, like what Apple does, right, is very similar. The cryptography is different, I think. But I think the biggest part here is that right now, the trend in authentication is that everybody is going to identity proof you. And identity proofing is the thing that makes people scared. That's what ID.me was doing just to pay your taxes. It was, okay, you have to get on a video call, we have to see all your ID, we have to know everything about you.
And there is concern about that being the norm, because if you can't be on any platform without being fully identified as who you are as a human, then it reduces your ability to be honest and to speak your mind. And the reason that is true is because, say I go on to Twitter and they know I'm Dave Huseby, and I say something that's unpopular. Well, it's not just that I'll lose my Twitter, but I'll also lose access to, you know, I won't be able to process credit cards with PayPal, I'll lose my bank account if I'm at certain banks. We've already seen an example of, once you're unmasked and you say something that's unpopular, you can be excised from the internet and prevented from doing business. So the trend in authentication, though, unfortunately, is that buy-this-dot-com is gonna identity proof you, the IRS is gonna identity proof you, this other website's gonna identity proof you, this business... Everybody is going to identity proof you, because fraud is costing everybody way too much money, and the only known defense for this at this point is, well, we have to take pictures of your ID, we have to do a video call, we have to do all this other stuff. Now you're gonna be identity proofed in a million different ways, in different places, right? You're gonna do it to get gas, you're gonna do it to buy dog food, you're gonna do it for this, and every one of these companies is building these giant databases of your verified personal information, and that's making identity fraud even easier, actually. So they think that's their way of preventing it, but they're building huge databases that are massive honeypots, you know, they're massive attack targets, and it's gonna make identity fraud even easier, identity theft even easier, because it's verified information and it's in a million places. And every company has their own level of security around that kind of data, and a lot of them, unfortunately, don't do a very good job of it, and that's why we see these giant data breaches. So what we are trying to establish is, yes, identity proofing is necessary, but if it's done correctly, you really only have to do it once, or at least one more time. Ideally, just ideally, it would be one more time, and it's with an organization that is legally bound to be on your side. I could see this being a B Corp, you know, those public benefit corporations. [00:47:56] Katherine Druckman: Hmm, mm-hmm. [00:47:58] Dave Huseby: And our innovation here, at least with the authentication, the policy and regulation compliance stuff, is that it generates... it's a business network. It's almost like how credit cards work, right? The fees generated from the use of credit cards pay for the entire thing. They pay for all the companies involved and all the services involved. The way I look at this is the revenue generated from the reduction in fraud and the enhanced due diligence at authentication time can pay for an entire network that is user-centric. It can pay for a data fiduciary, so that the data fiduciary doesn't have to charge a subscription. So they can ingest people, they can identity proof people and have them join for free, minimize that friction, and then they can represent them in the digital realm and still make money and still be a for-profit corporation. And so what's different here is that we acknowledge the way the world's going.
We know where things are going. We're skating to where the puck's gonna be, and we're trying to keep the puck from going in our net. I guess I'm just torturing that metaphor, but we know where it's going if we do nothing, and we're trying to provide an alternative that is open and decentralized and user sovereign, in an actual sense, user-centric. And it has the byproducts of being privacy first and digital identity first, right? So this would allow the creation of things like digital vaccine passports that aren't automatically a social credit system. So it's a bunch of new thinking. It's been under wraps for three, three and a half years now, in R&D, and we finally found a market. We found an application that we think is low-hanging fruit we can get into fairly easily, and that political communications app is just one of many low-hanging fruits that we see. And I'm literally calling everybody: are you on the internet? And they're like, yeah. I'm like, hey, let me do my sales pitch. You know, if you're on the internet, you need our technology. And, yeah, I kind of see this as almost like a moral crusade in many ways. We solved a bunch of technical problems, a bunch of business problems, and I think this is the closest we're gonna get to meeting the sort of promises of self-sovereign identity. I think we meet all of the original promises. Like, if you look up the very first speeches by Timothy Ruff and Sam Smith and all of them, and Drummond Reed, talking about what self-sovereign identity will do, right? What they have all built doesn't do hardly any of it. But what we have built does, and it is actually decentralized, and it's highly profitable, actually. There's a lot of opportunity here for people to make money, and a lot of opportunity for regular businesses to reduce their fraud and to streamline their onboarding and their customer relations. So that's my sales pitch. [00:51:08] Katherine Druckman: This is why I love having you on, because I always feel like at the end of these episodes when you're here, we're like, we're gonna start a revolution. Like... [00:51:17] Dave Huseby: It's not, though. [00:51:18] Katherine Druckman: ...very... [00:51:18] Dave Huseby: It's not a revolution. I have to correct you. [00:51:21] Katherine Druckman: That's what it feels like. [00:51:22] Dave Huseby: And it has, it's been a long... [00:51:24] Doc Searls: ...up a lot of stuff that's already working. I think it's kind of... [00:51:27] Dave Huseby: Yeah. [00:51:28] Katherine Druckman: You know, it's a quiet revolution. [00:51:30] Dave Huseby: There's actually... [00:51:31] Katherine Druckman: The best kind. [00:51:31] Dave Huseby: I'm gonna risk clout here. I would argue that in technology there has never been a revolution; it's always been an evolution. Almost everything we use today is like, this version plus one, right? And this version plus one. There have been new inventions, for sure, but I don't think anything has been totally revolutionary. I think everything has just sort of evolved. [00:51:55] Doc Searls: I think there's a big one. The biggest one to me is the PC. [00:52:01] Dave Huseby: Oh, sure. Yeah. [00:52:03] Doc Searls: The personal computer. That was revolutionary. The mobile phone, very incremental. That kind of turned into what we have now only in the last five years, maybe seven years.
And a lot of that had to do with the build-out of infrastructure, where you could presume ambient connectivity pretty much anywhere you are. [00:52:21] Dave Huseby: Yeah. The web was revolutionary because it made publishing cheap and easy, and I think Bitcoin's revolutionary because it made receiving money cheap and easy. Try receiving money in dollars. It's not easy. Setting up a business, I know, receiving money is very, very difficult. [00:52:45] Katherine Druckman: Well, revolutionary or evolutionary, I appreciate your enthusiasm. [00:52:49] Doc Searls: Yeah. [00:52:50] Dave Huseby: Well, you know, I'm gonna be talking more about this at ETHDenver, and I'll be demoing there, so if anybody wants to go, that'd be great. Find me. I'll try to get to IIW, Doc. [00:53:00] Doc Searls: And I have, uh, two parties I wanna introduce you to. [00:53:05] Dave Huseby: That'd be great. [00:53:06] Katherine Druckman: When is IIW again? I might as well plug it again. [00:53:10] Dave Huseby: It's in April. [00:53:11] Katherine Druckman: It's in April, but do you know when exactly? [00:53:14] Dave Huseby: Mm. It's usually like April 20th, something like that. It's usually towards the end of April, if I remember correctly. I haven't been in a couple of years. [00:53:23] Doc Searls: Yeah. [00:53:24] Dave Huseby: I think what we're gonna do is an enrollment bootcamp for anybody who wants to try out the authentication, and, you know, any companies who want to get into the business of setting up a data fiduciary or setting up an aggregator, we'll just help them with a few lines of code to get it going. [00:53:40] Doc Searls: IIW is April 18th to the 20th. [00:53:44] Dave Huseby: Boom. [00:53:44] Katherine Druckman: There you go. [00:53:45] Doc Searls: That's what it is. [00:53:47] Dave Huseby: All right, I'll make sure we go. That'll be... [00:53:50] Doc Searls: Do it. [00:53:52] Dave Huseby: All right, I gotta run. [00:53:53] Katherine Druckman: Cool. [00:53:54] Doc Searls: I do too. [00:53:55] Katherine Druckman: We do. Okay, well, thank you both for this, and thank you, Dave, for filling us in, and we'll talk to you. [00:54:01] Dave Huseby: Thanks for giving me a platform to just ramble for an hour. [00:54:06] Doc Searls: No, good stuff. [00:54:08] Dave Huseby: And, oh, I'm actually on LinkedIn now, for the first time in my entire life. I finally joined a social media network, so find me on LinkedIn and connect with me, and I'll give demos. I'd happily give demos. [00:54:21] Doc Searls: If you go there, maybe my wife will go too. She's on nothing, on purpose. [00:54:25] Katherine Druckman: Don't hang up. Don't go anywhere. One second. Okay. I'm stopping. That was the end. [00:54:31] Doc Searls: Okay, that was the end.