Hey everyone, welcome back to Reality 2.0. I'm Katherine Druckman. I am talking to Doc Searls today and Dave Huseby, who you probably remember from a couple of previous episodes that we will link to. And Dave, well, your bio kind of changes. I wonder what kind of cool stuff you can tell us about yourself. But before we get into that, I wanted to remind everyone to visit our website at reality2cast.com, that's the number two, where you can find links and a lot of other good stuff.

But yeah, so Dave, first of all, welcome back. But second, you're always up to something really interesting, and so it's always really fun when we get an email from you with a bunch of links that are, well, occasionally over my head, but that's fine, that's how I learn things. So what are you doing these days, for real? If you have to introduce yourself, how do you do it today?

Well, I mean, I'm the universal internet malcontent, apparently.

Not to put you on the spot or anything. So what are you doing?

So my day job, officially, is I am co-founder and CEO of a crypto engineering firm called Cryptid Technologies. And it was born out of everything that I have learned, and my co-founders have learned, from our long histories being on the internet. It's a bunch of graybeards — well, not all beards — but we are industry veterans who have been in and around security and technology for decades. We all met as part of the Hyperledger project over the last four years, and we've also known each other through the Internet Identity Workshop over the last 10 years or so. I started my journey down this road back in the early nineties, you know, when we had the first crypto wars, and I've kind of come in and out, but for the last 15 years I've been really focused mostly on cryptography, privacy, that kind of stuff, because I got involved at the Tor Project very early on, in the early two thousands, as a volunteer — an anonymous volunteer, by the way. I taught myself everything I needed to know about how I could use the internet anonymously. And, you know, I was part of the cypherpunks mailing list, and anonymous remailers — remember when that was the big topic, how do you send email anonymously?

Something I wanted to bring up: I've been going down memory lane lately, wondering where the philosophy went. What changed? That's what I want to know, what the hell changed. Because where are all the cypherpunks? Doc, you and I aren't the only ones left. I know that's not true.

I was never competent enough to be a cypherpunk, but I knew a lot of them. You know, there was Eric Hughes, who was absolutely brilliant, but I don't know where he is now.

Right, but that's my point. Where are they? I don't know where he is. So, yeah. Anyway, you know, I worked at Mozilla, I worked at the Linux Foundation on Hyperledger, and everybody's groping around trying to figure out how to deploy blockchain technology, and they've invented these NFTs and these web three things. And what I wanted to come on and talk about is what I think everybody's groping for, which is authenticity — trust in data. Is this data real? Can I trust what it says? Does it come from where I think it comes from? Does it say what it says it says? You know, has it been modified? Where did it come from? Is it still valid, or has it been revoked? That kind of stuff.
We covered that in my two previous times on here, and I wanted to give an update, because this has been a multi-year project for me. And one of the cool things about this podcast and your listeners is, if they go back and listen to the episodes I was on, you can actually see how my thinking has evolved and how we have learned as we have been in the market building solutions. Anyway, that's where I wanted to go. Katherine, you had a couple of questions — you wanted to do some introductory questions before I dug in and held a sermon for an hour.

Okay, yeah. So actually, Doc, I think, wanted to go into what zero trust is.

We were having a conversation, and it might be worth just going into this thing first, which is that you pointed us to something that, news-wise, could help locate this in time. It came out of the White House just yesterday: a memorandum on improving the cybersecurity of national security, Department of Defense, and intelligence community systems, which has some cool stuff in it. Could you just tell us what that is and why it's cool, and then we'll get to the rest of your stuff?

Yeah. So, excuse me — like everybody else, I also have a bit of a cough. This memorandum out of the White House is really interesting. I carefully watch our politicians, because they tend to be reactive rather than proactive, and it's really interesting to see what their perception of the world is. You can kind of gauge that based on how they react. And this came out of nowhere. I understand that the federal government has been talking about improving cybersecurity, and there have been a lot of warnings like, oh, we're losing the security race and whatnot. And there was an executive order last year about improving security using what's called zero trust architecture, and this memorandum that came out yesterday furthered that. I don't know what weight it has — it's not an executive order, it's just a memorandum — but I guess it's sort of an intent document from the White House: this is what we'd like to see all of the executive agencies heading towards.

Right, and that then, I guess, requires some explanation. What is zero trust architecture?

This is a radical departure from traditional computer security. Forever, computers have always had digital gates around them, and for most people it's always been in the form of a username and password. You want to log into a computer, you want to send an email, you want to check your bank records — by and large, you still need a username and password. Zero trust architecture throws that out. It changes the access model from one built on what are called access control lists. Access control lists are like, on the computer it says: oh, that's Doc, he gave me his username, so Doc can do X, Y, and Z. Oh, that's Katherine, Katherine can do A, B, and C. We're moving away from that to a more decentralized approach to security where — I mean, it really can be boiled down to "trust no one, ever." And that's why they call it zero trust architecture. No process on a computer, no service on a computer, no computers ever trust each other. It's mutual distrust across the whole computer system. So say one process — I've got a web browser on my computer and I want to copy and paste from the web browser into a text editor. Those are two separate processes on the computer.
In a fully zero trust architecture, you can't do that, because those processes are in jails — preferably in VMs, as virtual machines on a hypervisor — and you just can't copy and paste between them. And at the network level, when you're talking about multiple computers talking to each other, they don't trust each other. There's really no such thing as "inside the firewall" anymore. There's no blanket access where, if I can get inside the firewall, I can access everything inside the firewall. It kind of makes the firewall somewhat irrelevant, in the sense that even if you get inside the firewall, every computer you talk to is going to ask you to prove that you have access to it — not prove who you are, but prove that you possess some piece of data that grants you access to the services on that computer. Does that make sense?

So every computer is its own firewall, in a way. But it actually invalidates the firewall metaphor — that you have this kind of wall there that's maintained by access lists on the inside and login and password on the outside.

That's correct. Yeah, there is no such thing as a walled garden where, once you get in, you can run wild and do whatever you want. Zero trust architecture is built on cryptography. And so what I want to say is — we just had a strategy meeting inside our company the other day, and we've been trying to come up with some marketing messages for all the different people we talk to: business leaders, technical people, all of that. And the common theme is that applied cryptography is now the new foundation for enterprise computing. Because if the federal government is reacting by adopting zero trust architecture, which is applied cryptography for security purposes, then it's already common in industry, because our government rarely leads — they usually follow. So we're seeing widespread adoption of applied cryptography, and that means it's going to drive data and authenticity around data — that's what our last two podcasts were about, the authentic data economy. It's going to drive things like IAM, identity and access management, DRM, digital rights management, all those kinds of things. And so being sufficiently well-versed in applied cryptography is going to be an essential component of any technical leadership in any business, if it isn't already. That's what's interesting about this White House memo. It's basically saying: we recognize that there is a shift in the world, that this is the current state of the art in computer security, and we're going to head in that direction. What they don't say is what zero trust architecture is. And that's what I'm telling you: it's applied cryptography. If I want to send a message from one computer to another, I have to present proof that I have access, that access has been granted, and the data that I'm sending is encrypted in a way that the recipient can decrypt. Every step, from authenticating me to authorizing me to communicating with me, has encryption involved. Every piece — whether it's a zero-knowledge proof to prove that I possess a piece of data, such as an access token, or a digital signature that says I control a key pair that's been granted access to write data into the system, so I sign the data with that key pair and the system validates the signature and accepts the data — or just, you know, standard encryption on the fly.
Just encrypting the data on the move — every step requires encryption. And that's why — and I don't want to turn this into a sales pitch for my company, but we are launching, in the next month or two, a secrecy-as-a-service platform. Our sales pitch is: why are you still doing cryptography? It's really hard to get right. Just let us do it right. But, you know, I'm not here to sell the company. I'm here to talk about something a lot more important than that.

So zero trust architecture is really cool, but we take it a step further, and this is where we get back to user sovereignty, Doc, and the authentic data economy, and how we're going to build on top of zero trust architecture to move us in a direction — move the whole world in a direction — where we can reduce fraud, up the trust in systems, and comply with all these privacy regulations in a sane way. I think everybody understands that with privacy regulations, companies comply by throwing an interstitial in front of you, and nobody ever reads them. They're just like, yep, fine, whatever, and they click through the EULAs on websites and the notices about cookies and whatever. And so GDPR and CCPA and those kinds of regulations really haven't shifted the privacy landscape that much. They certainly haven't slowed down surveillance capitalism.

I wrote about this recently on Medium, and I can give you guys the links to my stories if anybody's curious, but I wrote one called Achieving Absolute Privacy. It's about how the most important thing we can do in the authentic data economy is to combine it with verifiable computation. Verifiable computation, for those of you who don't know, is a way to send software — send code — to another computer or another party, and they execute the code in such a way that they can prove, cryptographically, that they did the calculation correctly. This is how Ethereum works, in the end, right? You write a smart contract, it gets sent to all the nodes in the network, and the network runs it. Smart contracts are verifiable computation: the nodes run the smart contract and they can prove that they did it. That's how they can claim the gas, the fees — they can prove they did the execution correctly.

What I'm trying to bring to the world — and this is part of our platform and a lot of open source code that we're doing — is this thing we call cryptographic qualifications. I call it that because it's like: I am qualified to access the system, I am qualified to do X, Y, and Z. I am qualified to go to this concert because I am a paid ticket holder. I am qualified to enter this space because I am vaccinated. And the reason this approach is important is that you encode the policies of these digital gates, as I call them — these gates that check you out: do you have a ticket, can you get into the game, are you vaccinated, whatever; these are digital checks, they're gates — you encode the policy of the gate as a verifiable computation and you send it to the person trying to get through the gate. That's why it's called a cryptographic qualification: I'm using cryptography to qualify to pass the gate. Okay.
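To make that flow concrete, here is a minimal sketch in Python of what a cryptographic qualification might look like. It is only an illustration: the gate's policy is an ordinary function, the holder's record fields are made up, and the "proof" is a plain hash commitment standing in for the zero-knowledge or verifiable-computation proof a real system would produce.

```python
import hashlib
import json

# --- Gate side: encode the policy of a digital gate as code ---
# Hypothetical policy for a concert gate: paid ticket holder and vaccinated.
# In a real system this would be compiled into a verifiable computation.
def concert_policy(record: dict) -> bool:
    return record["has_paid_ticket"] and record["vaccinated"]

# --- Holder side: run the gate's policy over private, locally held data ---
def answer_qualification(policy, private_record: dict):
    qualified = policy(private_record)      # the one-bit answer
    # Toy stand-in for a cryptographic proof: a hash commitment binding the
    # answer to the (still private) inputs. A real deployment would return a
    # proof that the computation was done correctly, without revealing inputs.
    commitment = hashlib.sha256(
        json.dumps({"inputs": private_record, "answer": qualified},
                   sort_keys=True).encode()
    ).hexdigest()
    return qualified, commitment

# The private data never leaves the holder's wallet; only the bit (and, in a
# real system, the accompanying proof) goes back to the gate.
my_record = {"has_paid_ticket": True, "vaccinated": True}
qualified, proof_commitment = answer_qualification(concert_policy, my_record)
print(qualified)         # True -> "I qualify to pass the gate"
print(proof_commitment)  # opaque value; reveals nothing useful by itself
```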
Now here's our novel invention, which goes with the whole authentic data economy, which is essentially, like, the mother of all digital rights management systems. If you publish data as authentic data, that data can be independently verified: where it came from, who has it, that it hasn't been modified, that it hasn't been revoked. We combine that with verifiable computation to create absolutely private transactions. There's also regulation compliance, because if I want to buy something, I can prove that I've been KYC'd without giving away any of my KYC data. The store can give me a KYC check as a verifiable computation. I have my KYC data as authentic data that I keep private. I combine the authentic data that I hold privately with this verifiable computation that I get from this gate or this company or whatever, and it produces a yes/no answer plus proofs that the inputs were authentic and that I did the calculation correctly. If I give that along with a verifiably encrypted, say, transaction ID that can be walked back to my KYC vendor, then in the happy path I have just — now, legally, and I'm not a lawyer, right, so talk to your own counsel, don't take my advice on this — but I think we can meet not only the spirit but the letter of things like KYC laws. Because what I've done is I've passed the KYC check. Like, I have been KYC'd, and you sent me some code that verified that I had been KYC'd, and then I give you the proof that I did it all correctly and that the KYC input was valid. And I give you an encrypted ID that the merchant, or whoever, can verify can be decrypted by my KYC vendor. So in the happy path, the merchant knows nothing about me, but they're able to verify that I have been KYC'd by a legitimate KYC vendor. And they now possess a little piece of encrypted data that they know the KYC vendor can decrypt if there's ever an investigation. If there's a warrant issued, the merchant can give up that data as part of the investigation, and they can walk it back to the KYC vendor and de-anonymize me. So it preserves Fourth Amendment privacy, essentially: using cryptography, we still get search and seizure, but only with a warrant.

So, for those who may not know all their TLAs, KYC is Know Your Customer.

Oh yeah, that's right — Know Your Customer.

But I'm sort of going the other way, which is know your company. Because the world that I envision is one in which I know that I am a customer of, or that I have dealings with, these five companies, or these 50 companies, whatever they are. In other words, what I'm keeping on my side is no longer a roster of logins and passwords; I'm carrying what, in the SSI world, they're calling verifiable credentials. But you've got a twist on this, which is verifiable encryption. Has that been done in some way? Architecturally it's pretty similar, is it not? Or maybe this is a radical departure from what the SSI community is doing.

The stuff that you're seeing out of the W3C does not do anything to really preserve privacy. They are doing what's called selective disclosure. But because of my years at Mozilla studying global surveillance, surveillance on the web, I know that there are empirical models out there. You know, Facebook, Google — I don't know if they have them,
I don't know exactly which companies, but I know that there are these mountains of data where they have observations — thousands, if not millions, of observations on every human — that are tied to our real identities. And so if you give away even just a little bit — like, you've been fingerprinted, whether you like it or not, and it doesn't take a lot of bits of information to identify you.

What I'm looking for, maybe, as part of the idea — I mean, in a way, what you're talking about is self-sovereign, but not necessarily with that schema. And what I've always liked about SSI — well, I don't even know what the hell the W3C is doing, so I'm not that versed on any of this — but what I'm looking for instead is: okay, I am a customer of Budget Rent a Car. I want to let them know that I am a customer of you guys, or they want to know, or I just want to rent a car. I'm just thinking of the types of interactions that might happen that are confined in the way that you're talking about, that don't involve me having to spill a whole bunch of shit about myself that is irrelevant and can be used for fingerprinting. What is it that they need to know?

Okay, right. If you want to rent a car, what do they need to know? That you're a licensed driver, that you have insurance or that you're going to buy insurance, and that you're old enough. That's really what they need to know. They can encapsulate that as a cryptographic qualification — some verifiable computation that they give to me — and I run it over my authentic data that has been issued to me from the societal institutions that are going to do this: it'll be enterprises, it'll be governments, it'll be nonprofits, whatever. We're already talking about mobile driver's licenses. I saw that the credit rating agencies are very interested in doing this — issuing all of that data to us as authentic data. I'm saying: give it to me, don't give it to anybody else, because it's about me. It's my data. We will possess all of this data about us. And if you encapsulate the policy in a verifiable computation, in these cryptographic qualifications, they can send it to me and I run it over my private data. So their policy would be: are you old enough, are you a licensed driver, do you have insurance or have you purchased insurance for this trip? And what goes back to them is a yes or no. It is literally one bit — do I qualify or do I not qualify — plus the proofs that I did the computation correctly and that all the inputs were correct.

Now, if they need a way to unwind my anonymity — to de-anonymize me, or to know who my insurance company is or whatever — I can also give them this verifiably encrypted data. They can't decrypt it, but they can verify who can decrypt it. So I could say, look, I am a customer of USAA bank and I have USAA auto insurance. I can give them my insurance information, verifiably encrypted for USAA. So if I never have an accident and there's no need to go to my insurance, my privacy is maintained absolutely. The only bit of data they get from me is that I'm a customer of USAA. Right? And we have to be careful about how many of those we give, because that starts to narrow things down. Like, let's say I am a customer of USAA and I'm also a customer of somewhere else, right?
The combination of these verifiable encryptions could be a fingerprint in and of itself. But I'm comfortable — you know, this allows people to dial in their comfort level for their privacy. I'm comfortable telling someone that I'm a customer of USAA, because they have millions and millions of customers, and I haven't given them any other data — just one bit: yes, I meet your qualifications to rent your car.

Right. So I guess my biggest question about all of this — and you've sort of unlocked part of the answer anyway — is: what is the incentive for adopting this way of interacting?

Oh, that's the perfect question. I mean, for those of us who care about our privacy there are obviously great incentives, but — sorry for cutting you off — I have two answers. One of them is regulation compliance and how much money it will save the business. The other one should scare the hell out of everybody else.

So the first answer — we'll start with that one. Let me tell you a story. I have a friend who works for a really great organization, and I don't want to out them, but this organization funnels corporate money to the employees of corporations for the purposes of disaster relief. So these large corporations — and I'm talking about companies that employ tens of thousands, hundreds of thousands of people, with employees all over the world — do what's essentially kind of a profit sharing, where they say: here is a pool of money, say $20 million, that we want to make available to any of our current or former employees — sometimes it's just current, but it could be current and former employees — to apply for a grant to help them rebuild after a hurricane or an earthquake or whatever. Okay. The organization that runs this is a nonprofit, and they are doing the Lord's work. They have to field these requests and get the money out to the people. My buddy loves it. He's like, I just give away money to people who really need money, every day, and it feels amazing.

Their problem, though, is that they have to gather a bunch of personal information to process these requests. People come in and they say: I am an employee of corporation X, I live in this place that was just declared a federal disaster area, or I live in this country where we just had this really bad earthquake, and I need money to help me rebuild my life, my house. So they have to collect all this data, and that makes them subject to GDPR requests and CCPA regulations and all that stuff. And they have a significant burden — not only collecting the data, but storing it securely and being able to answer these access requests and delete requests. It was to the tune of thousands and thousands of dollars to do a full audit of their databases, to gather all of the GDPR data so they could respond to a request. Right?

What we can do is, I could build that policy as a verifiable computation. And as long as these corporations issued employment data to their employees as verifiable — as authentic data, which is trivial, by the way, it's just a little bit of crypto that makes it verifiably authentic — then when employees came to this organization for help, the organization would send the code to them to run, in their wallet or whatever, over their private employment data. And it would verify that they are an employee, and that their home address is such-and-such,
and that they live in a federal disaster area, and that they qualify. Now that organization gets the answer they need, and the paper trail they need, without collecting any of that private information. And it essentially eliminates — down to zero — their compliance burden for GDPR and CCPA, because they are literally not collecting any data on any of these people. So someone says, hey, I'm making a GDPR request, what data do you have on me? And the company goes: nothing. Because they don't have any. They haven't collected it in the first place. You don't have to collect the information to make that kind of business work. And that's true for the vast majority of businesses. That's the cool thing about this.

So when I sell this to companies, it's like: how much do you spend on your GDPR compliance? How much do you spend trying to avoid FTC fines because someone broke into your database of credit card numbers? All of that information — how much is that overhead costing you? We could restructure your e-commerce, your electronic transaction systems, so that you can still function as you do today without ever collecting any private information at all. And they just nod at me, like: if you could make that happen, that would be amazing. Right? And we're seeing significant interest from industry; we have a number of projects underway right now.

We're essentially flipping the arrow on digital transactions. The current state of the art for digital transactions right now is: how do we secure the data going to the code? And there are a lot of cool companies out there. Like Evervault, out of Europe, is a really cool startup. They intercept data coming off the page, they put it into secure enclaves, they keep it encrypted, and then the companies that use them send code over, and it gets decrypted inside the vault and processed there. It's all about still sending the data to the code, and they have to jump through all these crazy hoops, but it's very innovative, and that's what I would consider the current state of the art. What we're talking about is reversing the arrow. We're not going to send any data anywhere. We're going to send the code to the data itself.

I think the reason this hasn't been done before is because, one, we didn't have the authentic data economy — we didn't know how to do it because we couldn't do revocation at scale, but we've solved that now. And there have been some advances in what are called untrusted-setup verifiable computations over the last two years that make this actually feasible. You can do small verifiable computations in a reasonable amount of time on mobile devices. And even if they're more complicated, one of the areas my company is moving into is an acceleration platform for verifiable computation, so we'll be able to do that as well. Anyway, this is really critical: we've realized that we should stop sending data anywhere. You should get the data about you from the organizations that create data about you — your DMV, your doctor, your credit rating agency, your bank — and you should keep it private, and you should never give it away, because the second you give it away, you lose all power on the internet, because there will be copies, and it'll get sent everywhere and monetized and analyzed and whatever.
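Here is a small, hypothetical sketch of that "send the code to the data" flow for the disaster-relief example, using ordinary Ed25519 signatures from the pyca/cryptography library as a stand-in for a full authentic-data and verifiable-computation stack. The field names, the policy, and the disaster-area list are all invented for illustration.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# --- Issuer (the employer) signs an employment record and gives it to the employee ---
employer_key = Ed25519PrivateKey.generate()
employer_pub = employer_key.public_key()

record = {"employee": True, "employer": "Corporation X", "home_zip": "70112"}
record_bytes = json.dumps(record, sort_keys=True).encode()
signature = employer_key.sign(record_bytes)     # this signed record is the "authentic data"

# --- Relief org side: the policy it ships to the employee's wallet ---
DISASTER_ZIPS = {"70112", "70113"}              # hypothetical declared disaster area

def relief_policy(rec: dict) -> bool:
    return rec["employee"] and rec["home_zip"] in DISASTER_ZIPS

# --- Employee's wallet: verify the issuer's signature, run the policy locally ---
def evaluate(policy, rec_bytes: bytes, sig: bytes, issuer_pub) -> bool:
    try:
        issuer_pub.verify(sig, rec_bytes)       # raises if the record was tampered with
    except InvalidSignature:
        return False
    return policy(json.loads(rec_bytes))

# Only this boolean (plus, in a real system, a proof) goes back to the relief org;
# the employment record itself never leaves the employee's device.
print(evaluate(relief_policy, record_bytes, signature, employer_pub))  # True
```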
Doc, you had something you wanted to say?

Oh, I've got a bunch of things. So, the defaults — I just put this in the chat, because nobody's listening to the chat, but if you look up GDPR compliance, you're going to get, depending on where you are and what Google is doing at the moment, over 200 million results. Almost all of those are companies selling ways to obey the letter of the GDPR while completely screwing its spirit, by continuing to track. There are very, very few companies — at least if you're looking at them through their websites — that are not trying to harvest as much personal information as they possibly can, no matter what. The GDPR and the CCPA have done nothing to slow this down, but they've done a lot to completely screw with our experience of using the web, because it makes it harder, and we're even more exposed than before, because most people just say, screw it, I'm going to accept the thing, and there you've got a cookie that says go ahead and harvest all you want about me.

And I'm wondering what kind of companies are going to — I mean, in my experience, it's really, really hard to sell against the defaults here. The defaults are: you're a company, you want to do KYC, know your customer, you've got to harvest everything you can on the open web and in every interaction you have with a customer. You're going to go to — forgive me, my phone's ringing, I'm ignoring it — you're going to go to Acxiom and, hell, Experian, I don't know, all these companies, and they're going to give you data about your customers. Your hunting license, your DMV — all these people are also busy selling personal data about you. There's a vast market in personal data. How do you argue against that? I don't know. I'm sure you can find some companies that are buying your service, but it seems to me like it's an uphill battle.

Oh, yeah. Well, first of all, you can't be doing zero trust architecture and still be gathering data like that. And so, in many ways, the federal government saying we're going to do this is going to force all of the companies that interface with them to do it. The second thing is: that marketing information your company is selling — does that really cover your bottom line, your clients, your risks? It's a horrible market.

I mean, in fact — okay, here's — okay. Tim Hwang, a good friend, wrote this two years ago: Subprime Attention Crisis. It's about the bubble that is the entire advertising fecosystem, I call it, because it's fecal.

I've never heard that. That's good.

Yeah. And, you know, he said in 2020 the thing's going to crash. I said in 2008 the thing's going to crash, because it actually does not work, for the most part. Now, ignore Facebook — let's take Facebook and Google off the table, because they do work, in their own very different ways, and they're examples only of themselves. But every website you go to is busy harvesting a zillion things about you. And that's the default. But I'm wondering, now that the feds have come out with this zero trust imperative, is there a subset of all the companies out there saying, yeah, we're on board for zero trust? That's your market, obviously. And you're telling me —
Are you telling us that there is — I mean — a subset of the multitude of companies that are busy trying to screw you, some that are saying: you know what, this zero trust thing is really a great place to restart the economy, the reaction you need to have? Also, it gets us out of a lot of trouble that we're never going to have, because we don't even care what the GDPR says, because we're not collecting any of that crap.

Right, that's exactly right. And I think, in the end, there's a bit of a double network effect, because once we get one company that is producing their outbound data as authentic data — so that any company that consumes that outbound data, whether it's credit reports or insurance or bills of lading, all that kind of stuff — the increase in trust reduces the risk to any company that is ingesting that data. And then if you combine that with cryptographic qualifications, so that you avoid the regulatory burden and the risk of collecting personal information, and you avoid the straight bottom-line overhead of having to do annual audits for your SOC 2 compliance and your PCI compliance and your GDPR and all that stuff — it's actually, I argue, a competitive advantage. The first companies in will see huge gains in their bottom line, because it'll eliminate a lot of their liabilities, and it will also increase the efficiencies between B2B and B2C and C2B, that kind of stuff, because they'll be able to issue data to their customers as authentic data, which will then be the preferred data taken as inputs by other companies.

And I'm hoping there's a double network effect here. Once one company gets on board, and they're making a better profit and are more competitive in the market, you'll see all the other companies being like, holy crap, we need this. And in many ways it's like, once I'm publishing authentic data, I'm going to call my business partners, the other businesses I work with, and say: hey, if you have this, you can verify that all the data we're giving you is authentic. And that alone should significantly incentivize companies to switch over, at least to be able to verify that the data they're taking in, or selling, is authentic.

So there's a subset of companies out there that want what I just put in the thread, which is zero bullshit overhead, right? ZBO — zero bullshit overhead. I just made that up. But I was thinking — I mean, one reason that Trader Joe's sells good food for a low price is that they have absolutely no interest in customer information. None. There's no loyalty program, there are no gimmicks, they don't even have multiple prices. There's no "you're a loyal customer, you get a different price." They have a price; that's the end of the price. Nothing is on sale, ever. So all that overhead that was basically gaming the customer goes away. And I've been waiting a long time for that sensibility to hit, but maybe this zero trust thing is one of the bullets that can hit a target here, because that overhead is massive. Marketing overhead is massive. And when the CMO kind of rose up the corporate ladder — 30 years ago there was no such thing
as a chief marketing officer. There was a VP of sales and marketing, and it was a salesperson, and the marketing person had no power. That was 30 years ago. Now the chief marketing officer, for the last 15 years, has been sucking budget away from every other corporate function, in order to — you have to have data on the customer, there's data out there, suck in all the data you can, you need big data at all costs. And I'm waiting for that tide to turn, and maybe this will help turn it. Especially if you actually have customers — then the tide is turning, I would say.

Yes. And I'm glad groceries came up, because the second answer — the one that I think should legitimately scare everybody out there — is the topic of my most recent article, called The Theory of Digital Gates.

I was waiting for you to get to that one. This is the good, controversial stuff, probably, I think.

Right. So here's the theory — and maybe I didn't do a very good job in this article; having re-read it, I almost want to rewrite it, because it maybe doesn't get to the kernel of the idea up front, or soon enough; I kind of buried the lede a little bit. But if you go outside, which I know a lot of us don't do anymore, and you drive down the street where your supermarket is, and there are some retail stores, all of those stores are what we would call open to the public. Which means anybody can come in, anybody can buy something, anybody can leave. And the important thing to realize about "open to the public" is that the corporate leaders who run the companies that own those spaces have plausible deniability about who comes and goes from their stores. Right?

Yeah. Okay.

Like, if you went to an executive from Kroger, or any grocery store, and you said: I'm mad because so-and-so uses your grocery store, and I don't like so-and-so because I don't like their politics, or I don't like what they say on Twitter or whatever —

How dare you let Ted Cruz into your restaurant.

Exactly. Right. The people who run those businesses would say: I don't know who uses my grocery store. I don't know who uses my Subway franchise. And it remains open to the public — anybody can come and go — because of that plausible deniability.

Now, I want to point out who doesn't have plausible deniability. The companies that do not have plausible deniability are online services, primarily social platforms such as Twitter and Facebook, and PayPal, Square, Stripe, all the payment apps that we have, like Cash App and stuff like that. They do not have plausible deniability. They know exactly who uses their platforms. I mean, we're ignoring all the sock puppets on Twitter, but they strive — they put a lot of effort into knowing exactly who uses their systems. And how do those systems operate when someone who is persona non grata — popularly unpopular people — tries to use, say, Twitter? They get removed. And why? Because the internet mob comes after them, and it's like, how dare you give a platform to this person who's popularly unpopular.

That's one of my favorite labels. I assume they're worried about liability, you know, at some level, or in the future, or — I don't know.

Yeah, it has a name. It's called reputational risk.
This is what financial institutions use to deny processing electronic payments for companies and individuals. They say: if we were to do business with you, that would present such a substantial reputational risk to our organization, and we have a fiduciary responsibility to preserve the value of our company by minimizing reputational risk. This is the justification that Visa uses to prevent processing payments, and that PayPal uses to ban perfectly legal companies — I need to point out, perfectly legal companies — like Defense Distributed in Texas. They follow every law, they do all the right things, they pay their taxes. That's Cody Wilson's company that sells the machines that turn 80% lowers into functioning firearms.

Oh, the 3D-printed guns. Yes, I remember.

Okay. Defense Distributed manufactures and sells a home manufacturing box, and they run DEFCAD, which has all these plans for firearms. They are not allowed by the financial system to process any payments, because they're so wildly unpopular — because of the reputational risk. That's the claim, anyway. But that company is, I assure you, 100% legal, because if there were even a tiny fraction of that company that was illegal, it would have been shut down years and years ago. Right?

And this applies to, fill in the blank, any company that's unpopular, any company that runs a web page with a subscription service. There are social personalities who talk about controversial topics — COVID has been a big one, and vaccination has been a big one these days — and companies are blocking their ability to have, like, Patreon subscribers, that kind of stuff, if they don't toe the line on whichever way the wind is blowing. I'm not casting any opinion here; I have no opinion. I just want to observe the system as a whole, and how it's functioning or not functioning. And in a country that enshrines the right to free speech and the freedom of the press in one of its most cherished founding documents, as a founding principle of this country — to know that if you say the wrong thing, you can't collect money for it, because there's some corporate board that says that's a significant reputational risk to the organization — that seems like a dysfunction, especially in this country. And the only reason that exists, I think, is because they don't have plausible deniability. They can't say, well, we just process payments for everybody, we don't discriminate at all. We know they can't, because of financial regulations — they have to know who it is and deal with it.

What were you going to say?

Oh, I was only going to say, OnlyFans is the one that's always the example that comes to mind. And it's frankly a more fun example, because it's possibly controversial, but more amusing, because they reversed the decision — which is interesting, because, I don't know, I guess it depends on the social climate. But the idea of a platform banning the only possible moneymaker that they could ever have — because is it used for anything else? I don't know, I've never actually seen it.

Well, I think they started off as a Twitch competitor. It was going to be people streaming video games or something like that. But I've not seen OnlyFans — full disclosure.
I have no idea, but, you know, I know by internet reputation that it has a certain purpose, and if they eliminated that certain purpose, I'm not sure why it would exist anymore. But the pressure, like you say, was coming from payment processors, and then I guess they backed off. I don't know — who knows? I don't know the story behind why they reversed. That would be really interesting to find out.

Find out next episode. Maybe we'll have an answer.

So I'm building to a climax here, because what I want to say now — no pun intended —

Yeah, no pun intended.

So now that I've explained plausible deniability and reputational risk, this is the scary part. There is, right now, a huge push to roll out vaccine passports. The VCI — the Vaccination Credential Initiative — just put out a piece the other day bragging about how 12 states are on board and something like 200 million people can have access to their app to carry their digital COVID vaccine credential, blah, blah, blah. And they're encouraging the adoption of their technology. Okay. They're putting up digital gates in front of spaces that are currently open to the public, with the idea that to get into a grocery store, you're going to have to flash something — do some QR code, electronic thing to prove that you're immune or vaccinated or whatever. Okay.

The problem with this is that they claim it's privacy preserving, but there's biometric data involved to link it to you. So it's not really privacy preserving — it's not absolutely private, as I would define it. And that means there's identity data available to the digital gate at the time of presentation. So I want to go into my grocery store, and I have to verify my COVID status, and the data I give that gate includes biometric data about me. It may not say my name, but it gives enough information —

Fingerprinting.

Yes. These big empirical models could identify me, like, immediately. And then they know you need a new coffee machine.

Well, I don't even think it's a marketing thing, because my argument in the article on the theory of digital gates is that whoever runs a digital gate is going to optimize it for the value to their organization. And the inputs to that optimization function are maximizing profit — which is what you just alluded to, right, can we do marketing and all that stuff — but it also has to minimize reputational risk to the organization. They're going to use every piece of data available to them to optimize the value of that gate to the organization, because they need to justify the cost and the compliance and whatever. Which means that if that gate knows who I am — whether explicitly, because I told it, or implicitly, because it was able to fingerprint me and access some database in the backend — then it's going to apply reputational risk. So even if it says, oh, we're just checking your COVID status, and I'm very popularly unpopular, when I go through that gate it's also going to be tacitly dishonest and do a reputational risk check as well. So even if I am immune, even if I have been vaccinated, it could still deny me entry into a grocery store because I am someone who's unpopular — I said the wrong thing on Twitter — and now they don't have plausible deniability that I'm accessing their grocery store.
And so they're optimizing — they're minimizing the reputational risk. So the thesis of my article is that if we build digital gates in front of all of these spaces that are open to the public today, and those digital gates have any identifying information available to them — identity information, or a fingerprint, or otherwise — the net result will be the construction of a social credit system, without any conspiracy.

It does, to a certain degree. So this reminds me — I understand what you're saying about the reputational risk and people who are popularly unpopular, but I think the more real threat is something like a story I remember from, oh gosh, I think it was in the past year, where a young Black woman tried to go to, I think it was a skating rink, and was denied entry because they had some kind of, frankly, half-assed facial recognition software or something at the door that identified her as looking like somebody who was flagged for — I don't know if it was violent behavior or whatever it was they were looking for — and the girl was not let in to the skating rink. It wasn't her; she was not the person she was identified as. And whether or not she was is beside the point, right? I mean, what if grocery stores could flip a switch and block entry to anyone who shoplifted when they were a teenager? You know what I mean? There is a slippery slope argument there that maybe people haven't thought of. I think people are afraid of a social credit system and, you know, the pervasive surveillance of physical spaces — and rightly so. I think it is a valid concern. I mean, everybody's seen the Black Mirror episode, and they've seen what's happening in other countries of the world.

Yeah. And people are rightly saying, like — you're going to have to bleep me here, but that's fucked up. Right? Nobody wants to live in a system like that. That's global tyranny. Okay. And what I'm arguing is that no gate, no matter what it says it's doing, can have identity or identifying information made available to it, because if it does, it will by definition be using that as part of its check, and it will enforce a social credit system, whether it was designed to or not. The organization that operates it is going to say: well, we no longer have plausible deniability, so, because of our fiduciary responsibility, we have to minimize reputational risk. So, you know, Defense Distributed can't process payments, and Cody Wilson won't be able to go to the grocery store. Yeah, that's what we're doing.

And so the social credit system, I think, is going to get built without any conspiracy. There's not going to be a smoky room. Right now I can still buy my groceries, I can still go to a baseball game, and I can still do most of the things I need to live without having to prove my COVID status or anything like that. What I'm worried about is that this COVID digital vaccine check is being used as the justification to get these gates in place. And I'm arguing that they will come out of the box saying all we're going to do is verify your COVID status, but they will be used to enforce social credit, because they'll have identity available to them.
And that's why my company is open-sourcing and pushing this technology of cryptographic qualifications, which can apply a policy while denying the gate any identifying information. That's the whole thesis here. If we have to have digital gates, then they have to be using cryptographic qualifications. That's the only way we're going to avoid a social credit system.

Right. I understand your argument. I think a lot of people would react that it sounds far-fetched or whatever, but I think the important thing to remind people is that these things already exist. There are consumer scores that are based on the sort of data that is gathered about us, and they do exist. And it's actually very difficult, as I understand it, to have your data removed from these rather weird and shadowy organizations. It's similar to a credit score. I'll post a link to some more info — we talked about it in a previous episode too — but these came out in the last year or so, and it was scandalous at the time that these companies exist and track this information. But anyway, I think where this is going is that as long as the data's available, it's going to get used for something you don't want it to be used for, period. That is really all we need to know, because it's already being used for things we would not be happy about, whether we know about them or not.

Totally. I agree with that completely. And one of the cool things is that if this technology is successful in the market and starts to displace some of the old-school digital transaction systems, we can basically draw a line in the sands of time where we start to deny the internet as a whole — I mean, surveillance capitalism is universal now — where we deny the internet our personal information. So there's a line in time where it's like: okay, from now on, you don't get any of my data. We're still going to do business, but we're going to use cryptographic qualifications, and I have my own private data as verifiable data, as authentic data, and I'm never going to let it leave.

The cool thing about personal information is that it has a shelf life. It goes bad. It is fresh fruit; it is fresh-caught fish. We all change as we live: we change jobs, we change where we live, we change our incomes, we change our family status, we change our health status — we change everything. Humans are dynamic creatures, and if you took a snapshot of me 15 years ago, I'm hardly recognizable. Obviously there's some macro data that doesn't change — my race, my gender, my height now that I'm an adult — but all of the other stuff, how much money I make, where I work, what my political leanings are, all those things are in flux, and they change. And so once we get that line in the sand drawn, and we start starving the system of data, we could actually kill surveillance capitalism. I don't think it will take much. Doc, you were saying, hey, I hope this collapses — I don't think it's going to take much of a push. I think it's a house of cards. And once companies realize that they can make up any loss
from not collecting customer data with the savings they gain from not having to do all these audits and regulatory compliance and risk insurance and everything around having that data lying around — I mean, personal data is like toxic waste. Nobody wants to collect and store that stuff.

It's the radon gas of business.

Right, exactly right. And so I think once businesses start realizing that monetizing customer data doesn't actually make sense, and that we now have an easy technological system that can replace their existing one — one that allows them to trade a loss in their income for a savings in their overhead, and also increase the trust in the data they're producing and the data they're consuming, which further reduces risk and makes things easier, you know, it increases the overall trust level — I think we could see the entire structure collapse almost overnight, the entire surveillance capitalism business model. I mean, we all know that click fraud is a thing, and advertising is, like, 30% fraud. It's not going to take much.

Yeah. There's, uh, Augustine Fou — Dr. Fou has been on this show several times, like you have, talking about this, and that's his stock in trade: ad fraud. It's massive. The numbers he throws out are always kind of mind-blowing — like the fact that there aren't even enough humans alive and using the internet to generate the ad clicks.

I know, I know. You can only do this by faking it up, and yet it's so easy. I think we've probably all, if we took any stats course, read How to Lie with Statistics, which came out back in 1954, and this has been going on forever — it's only now that it's all automated, right? The language of statistics is: I throw a lot of numbers at you, and that justifies it. Yeah.

But I mean, from your mouth to God's ears, in the sense that it is a house of cards — I hope it's a house of cards. I tell you, though, I've been shaking that house of cards for a long time, and it has not budged. But you — you've actually got code, and I just have words. So that makes a difference, and you have a business. So what is the name of your business, if you feel okay about sharing it?

Cryptid Technologies. C-R-Y-P-T-I-D. You know, cryptids are mythical beasts, like Bigfoot and the Loch Ness Monster. And so we thought it was apropos, because we're hunting the things that everybody says exist but nobody's ever seen. We're hunting the truly private digital transaction. We're hunting true personal privacy. We're hunting meaningful reductions in fraud. We're hunting all of these things with our technology. And, you know, we are leaders in applied cryptography; we have many years —

That's not nothing.

No. We are cryptid.tech, and our website is absolute crap at the moment. I think it's probably, like, a default WordPress.

We've all been there. Okay, that's awesome. Yeah, I don't feel so bad about some of mine.

We've been heads-down on the crypto code rather than the website, you know what I'm saying? We're planning on making some real splashes — headline grabbers — here in the next few months.
One of the cool things about the technology we've done is that we've worked out a way to do universal NFTs. Because what we've invented with the authentic data economy is the mother of all digital rights management platforms — universal NFTs. And NFTs are getting so beat up right now.

So tell us — what is the pony in NFTs that nobody else is seeing? Because right now —

Well, it's proof. This goes back to what we started talking about. Everybody's looking for authenticity — provable authenticity. Okay? So when people buy NFTs — there's a lot of speculation, right, this is like the new toy that all the crypto rich are playing with — but one of the things that supposedly attracts people is that when you buy an NFT, you have provable provenance that you own it. It's authentic that I own it.

It's like when I go to buy a da Vinci painting — a physical painting. I can go and get one that is almost an exact replica of, say, the Mona Lisa — oil on canvas that someone else painted. Okay, not da Vinci, not the Mona Lisa, but a copy. And that's going to cost me a couple grand. If I were to go buy the Mona Lisa, it would be hundreds and hundreds of millions of dollars, maybe even billions of dollars. What am I actually getting? If I can get a copy of the Mona Lisa from an artist down the street, what am I getting if I buy the original one? I'm actually getting that stack of parchment letters that proves the provenance of that painting. That's what I'm really buying: the letter that says the Louvre got it from this gallery, and that gallery got it from this noble family, and that noble family got it from this church, and that church got it from this other noble family, who literally paid da Vinci as a patron to paint the painting. There is a stack of letters that documents an unbroken chain of custody on that 400-year-old painting. And that's the thing that's different. There are two oil paintings that look identical. One has a stack of provenance letters and one does not. The one with the stack of provenance letters — the provenance log, essentially — is worth uncounted, ungodly amounts of money, billions of dollars. The one that doesn't, I paid a couple grand for. Okay?

And NFTs — that's why this whole joke of "I right-clicked and saved and stole your NFT" — it's like, well, I can buy a copy of the Mona Lisa; that's not the Mona Lisa.

No, I understand. I'm not saying they are.

I'm just — my point is that that's not the point. The point is you're paying for the provenance of ownership. And what I find really funny is that Moxie, being an applied cryptographer himself, finally pointed out that the emperor has no clothes — which is what all of us knew for a long time when we looked at how these things actually work under the covers: that they're not recording hashes or content-addressable pointers to the actual data, they're just recording URLs. And we all know that URLs can point to servers that can come and go, and that you can program a web server to serve up any data for any URL. And that's what Moxie did. He made an NFT, and he made it so that it would show a different image depending on who was looking at it. And he rightly pointed out the technical limitations of what these companies are doing. And they banned it — he sold that NFT,
He sold that NFT, and then they took it down off their platform, and he was like, wait a minute, I thought I bought it, or someone bought it. I thought they owned it. How are you able to take it down if it's supposedly on the blockchain? Anyway, it's a really fun thing, and that's why NFTs are getting beat up. But what we're talking about here is true ownership. I own the data and I own the provenance log, and it uses applied cryptography. If I need people to trust this provenance log, I can anchor it using cryptographic accumulators that amortize the cost of big blockchain transactions. I can anchor it anywhere: I can put it in Bitcoin, Ethereum, whatever. And what I put there is an actual cryptographic proof of ownership that links back to the off-chain provenance log. So we're back to the stack of letters, but it's digital. It's essentially the same thing as what we already do today for real property. The deed for my home or the title for my car can be registered at a county clerk's office, right? If I write a book, I can go to the copyright office and register it, and I can assert my ownership rights. That's what we mean by the authentic data economy: every piece of data has an associated provenance log with it. And some of the cryptographic innovations we've made, cryptographic accumulators being the most notable one, allow us to amortize the cost of that. Cryptographic accumulators can hold any number of provenance log updates in them. It could be a billion billion billion in a single proof that's only 32 bytes long, and you can store that in a Bitcoin transaction. So essentially, if you want to look at it from a Bitcoin or blockchain standpoint, we can make the transaction throughput of any blockchain effectively infinite by storing the transactions off chain in a provenance log and storing a proof of state in a cryptographic accumulator that gets recorded on chain. And this allows us to achieve scales of authentic data, essentially universal NFTs, you know, ten to the eighteenth: billions and billions and billions of people and devices creating billions and billions of pieces of authentic data that is not only provably authentic but can also be revoked. That's the part for e-commerce. It's like, sure, I'm Equifax, I'm going to give you, Doc, your credit history as authentic data. If we later find out that you misrepresented yourself somehow, we can revoke it. And part of verifying those cryptographic qualifications is verifying that your authentic data is not revoked. Right? So it all flows together.
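As a rough stand-in for the accumulator idea Dave just described: their actual construction isn't specified here, so this sketch substitutes a plain Merkle tree. The only point it illustrates is that any number of off-chain provenance-log updates compress into one 32-byte value that fits in a single on-chain transaction.

```python
# Rough sketch only; a Merkle root stands in for whatever accumulator
# their system actually uses.
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold any number of update hashes into a single 32-byte commitment."""
    level = [h(leaf) for leaf in leaves] or [h(b"")]
    while len(level) > 1:
        if len(level) % 2:                       # duplicate an odd leaf
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Off chain: a huge pile of provenance-log updates from many people and devices.
updates = [f"provenance update {i}".encode() for i in range(100_000)]

anchor = merkle_root(updates)   # 32 bytes, no matter how many updates
assert len(anchor) == 32
# On chain: only `anchor` needs to be written, so the base chain's
# transaction throughput no longer limits how much authentic data gets anchored.
```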
And so, I'm sorry, Doc, I'm going to finish this. I worked out a way to "own" an NFT on OpenSea. Well, I use air quotes, right, because Moxie showed we don't really own it. If I download the NFT, it's an image, and I build a provenance log for it using our system, which is open source by the way, and I anchor it in Bitcoin, then the hash of the provenance log, which is also the ID of my ownership, the cryptographic proof that I own this document, gives me a big random number. Now, if I go into OpenSea and quote-unquote burn the token by moving it to an address that is that big random number, it makes the token unspendable while at the same time recording in their blockchain the cryptographic identity of the external provenance log. So I can liberate NFTs from these platforms and take ownership of them myself, and I can choose whatever blockchain I want, because I've got the external provenance log. Yes. Okay, so that's kind of fun. So what do we have for muggles? A muggle listening to this says, okay, I want my own external provenance log and I want my authentic data production machine. Can I get one? Yeah. We're in the process of building that right now. We can make everybody their own NFT marketplace. So if you write poems, or let's say you have a small following on Twitter because you tell jokes on Twitter or whatever, you could monetize your tweets. You could take the data of your tweets, create a provenance log, anchor it, and then put it up for sale on your website using PayPal or whatever. We can transfer ownership using the provenance log. But what if Twitter says, ah, I don't like you anymore, I'm kicking all your stuff off of here? Now the thing I own points to a suspended account, right? Well, you don't use the URL. You grab the content of the tweet, because that's the data you own, right? We're not worried about bytes on a blockchain anymore, which is why OpenSea and these others record a URL and not the data itself: they're worried about the number of bytes being stored in those blockchain transactions. We accumulate the proofs of state in these accumulators, and the accumulator is only 32 bytes, and it can hold any number of proofs of state. That's what gets recorded in the anchor, whether it's Bitcoin or Ethereum or whatever; it doesn't matter. And so we can amortize the cost. This could potentially even help with the energy problem for Bitcoin, for instance. One thing I should point out is that because it's fully decentralized, it has relaxed double-spend protection, so I would not say to use this for cryptocurrencies, although maybe there is a way using an L2 protocol or something. But that's not what we're trying to solve here. We're trying to solve the positive assertion of ownership rights, and that is congruent with how things work, and always have worked, in the United States for intellectual property, and in Europe as well. If I'm the first person to come up with an invention and I file for the patent first, I get the patent, not you, Doc. You were two minutes behind me. We had the same idea, but I filed first, so it's mine. Same thing for copyrights, same thing for the whole intellectual property legal regime in this country. As I understand it, whoever comes first owns it. And all we're doing is leveraging the fact that these public blockchains are really cool in the sense that they're clocks. Bitcoin is a clock that ticks roughly every ten minutes, and nobody can stop it from ticking. More importantly, nobody can go back and change the fact that it ticked, or what the data associated with that tick was.
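Circling back to the OpenSea move Dave described a moment ago, here is a sketch of deriving the burn address from the provenance log's hash. The helper names are hypothetical and the address encoding is made up; real encodings are chain-specific and include versions and checksums omitted here.

```python
# Sketch of "burn the token to my provenance log's ID" (hypothetical names,
# made-up "burn:" encoding).
import hashlib
import json

def provenance_log_id(log: list[dict]) -> bytes:
    """Hash a canonical serialization of the off-chain provenance log;
    the 32-byte digest doubles as the log's identifier."""
    canonical = json.dumps(log, sort_keys=True).encode()
    return hashlib.sha256(canonical).digest()

log = [
    {"event": "created", "content_id": "sha256:<image hash>", "by": "dave"},
    {"event": "anchored", "chain": "bitcoin", "txid": "<anchor txid>"},
]

log_id = provenance_log_id(log)

# Send the platform's token to an address built from this ID. Because the ID
# was not derived from any known private key, nothing can ever move the token
# again, yet the chain now permanently records which external provenance log
# the token refers to.
burn_address = "burn:" + log_id.hex()   # hypothetical encoding
print(burn_address)
```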
And so, picking up that clock point: by putting these proofs of state in at a certain time, we can establish total global ordering. So I can assert that I was the first person to take that photo, or the first person to write that poem. I can prove it because I made a provenance log and I anchored it two years ago. You claim to own it, but your proof is from two weeks ago. Well, here's a proof from two years ago, so I win: I own it. That's what we're getting at here. And I can't wait to liberate my NFT from OpenSea, write a whole article, and send out a press release. I think it's going to be really fun, and I hope more people do it, especially in light of Moxie's piece. So, Katherine, we're at an hour. We've covered it. Thank you so much. Thanks. This is good. It's always a learning experience. And yeah, I hope we've uncovered some stuff, and I hope we did it without sounding like our tinfoil hats are too tight, because seriously, this is real. Yeah. What really scares me is the eradication of plausible deniability for open-to-the-public spaces. That really scares me, because I think it applies even to the skating rink. I think a global social credit system would form itself spontaneously because of this combination of perverse incentives, right? Reputational risk and fiduciary responsibility, plus the lack of plausible deniability. That immediately turns every grocery store, every gas station, every rental office into something like Twitter, in the sense that they have to police who uses their systems. That really scares me. That, I think, is the strongest argument for absolute privacy I've ever come up with. You know what scares me the most, as long as we're on the topic? All the things I haven't thought of that people might do with all of this. No kidding. That's true. Yeah. Let me send you a bunch of links to my articles; I've got six to share with you. The very first one I ever wrote was the principles of user sovereignty, followed by unified through decentralization. That's our philosophical foundation for our entire company and everything we build. The more recent stuff from the last few months was on achieving absolute privacy and why zero trust architecture is the way forward, which describes cryptographic qualifications. And then my latest one, the barn burner, is the theory of digital gates. I'll send you guys all of those. I'll probably have to do it over email, because we're going to get off of here real quick, and I don't think I can scare them up while we're on here live. That's totally fine. Um, yeah, so, well, cool. I'm sure our listeners will look forward to me including them alongside this discussion. Hey, we're looking for angel investors. I shouldn't have to say that, but that's where we're at. I'm the CEO, so we are looking for angel investors, the right investors. If anybody in your audience wants to get a piece of our company, let me know. You've got to work on that website though, just saying. You know, I think it might become a joke, our website. You know, this is your NFT right here. Yeah, you need to NFT that. Take a picture of that and sell it. This is your zygote that is not yet a blastocyst. Yeah, and I think what's going to be really funny...
I can't figure out what it is yet, and I'm open to suggestions, but there is something we have created that might be valuable to somebody, and I want to sell it as the first universal NFT for sale on our website, and sell it for like a million bucks. And if you bought it, that money would build this company. Right? And I don't know, maybe you'd get the ownership, the rights to the principles of user sovereignty or something. Like, imagine if JPB had sold the Declaration of the Independence of Cyberspace as an NFT. He could have funded the entire EFF for years, if this had even existed back then. Yeah, it barely existed. Right. Anyway, it'd be fun. I admit our website sucks, so if anybody wants to volunteer and help us... but it's great work. We're code guys, and we're crypto code guys, not HTML guys. Well, yeah, we're fixing the website, slowly apparently. Well, cool. Well, thank you so much for joining us again, and thank you to everyone who has made it this far. And thank you so much for having me on. You're wonderful people, and I love the opportunity to talk with you.