Anna (00:00:05): Welcome to Zero Knowledge. I'm your host, Anna Rose. In this podcast, we will be exploring the latest in zero knowledge research and the decentralized web, as well as new paradigms that promise to change the way we interact and transact online. This week, I chat with Harry Halpin, the CEO of Nym, which is a privacy infrastructure project. We talk about how his early activism brought him into contact with the privacy sphere and later with the blockchain world, and how he understands the evolving tech privacy landscape. We then catch up about Nym and how this project aims to solve for privacy at a particular place in the blockchain stack. Anna (00:00:48): But before we start in, I want to highlight our ZK Podcast grant over on Gitcoin. CLR matching should be kicking off by the time this episode airs. So if you're interested in supporting the show, now would be a great time to make a donation. Over on Gitcoin, right now, all donations are matched. Previous Gitcoin donations went towards funding the development of the new ZK website, and future donations will give us a chance to build even more cool stuff for the ZK community. So, yeah, please head over there now, I'm adding the link in the show notes, and help us grow this thing even more! Anna (00:01:19): I also want to thank this week's sponsor, Mina Protocol. Mina is the world's lightest blockchain, powered by participants. It's a layer one protocol working to connect crypto to the real world. This means developers can leverage private verified real-world data from any website to build decentralized apps. Mina's decentralized apps, called Snapps, also allow users to access on-chain services without sacrificing personal data privacy. Mina replaces the traditional blockchain with a zero knowledge proof, which ensures a super light chain that stays around 22 kb and allows every participant to act as a full node.
As an aside, I am both an advisor to the project and a validator, and I can definitely recommend you check it out. Especially now, as the tools to build Snapps are coming online. Mina's mainnet has been live for almost 6 months and the ecosystem is growing fast. So do join the community and find out more by visiting minaprotocol.com. So thank you again, Mina Protocol! Now here is my interview with Harry Halpin. Anna (00:02:22): So I want to welcome Harry Halpin to the show. He's the CEO of Nym, a privacy infrastructure project, and he used to work on crypto standards in Web 2.0 and at MIT. Welcome to the show! Harry (00:02:31): Hi, Anna. I'm a big fan of the show and thanks for inviting me on. Anna (00:02:37): Cool. I think it was over a year ago, I had your colleague, Claudia Diaz, on the show. And in that episode, we actually went pretty deep into Nym. With this episode, I'm curious to hear a little bit more about privacy, and actually have that conversation about where your thinking comes from. And then I definitely want to, later on in the episode, revisit Nym, hear what's happened since I had you guys on last, and find out how maybe Nym is solving some of these concerns or issues that we explore. Harry (00:03:05): Yeah. So, from my perspective, what Nym is really doing is trying to solve one smaller part of a larger privacy problem, or even what you could call a privacy crisis. So Nym's focus is on the network level of privacy. And I think Claudia Diaz, our chief scientist, is a world expert on that particular part of the privacy puzzle. And what Nym hopes to do is to enable privacy for essentially other privacy projects, but on that network level. And so I think it'd be really good, though, to review and think about what it means to be private in a more holistic and even social sense, because privacy is really different from security insofar as it's not so easily, I think, reducible to essentially mathematical concepts or cryptographic concepts.
And it's a more socially embedded and ultimately holistic concept. Anna (00:04:11): Do you feel like this idea of privacy is you either have it or you don't have it? Or do you feel like there is a spectrum? What you were just describing, if you need holistic privacy, you need end-to-end privacy, is that what actual privacy is? Or are there degrees of privacy? I don't know. Harry (00:04:31): Yeah, I think it's a spectrum, absolute privacy is very hard to even think about properly. So for example, it could mean that it's undetectable that you're even communicating with someone, that it's unobservable to an outside party when you're communicating. And some form of absolute anonymity could be considered one extreme. On the other hand, privacy can also be thought of as selective disclosure. I may want to reveal to someone that I have a COVID vaccination, or I don't have a COVID vaccination or test result, but I probably don't want that same person, a border guard, for example, unnecessarily knowing what restaurants I've been going to, or my entire flight itinerary, or even my name and address; maybe they only need the photo to verify that information. So I think privacy is effectively a spectrum, and when I say "holistic privacy", I mean that, given a particular privacy property a particular system is attempting to achieve, that property somehow maps to the expectations of the users, which are also very social expectations. When you talk about achieving those privacy properties, it's important to achieve them on different levels of the system. And so when I talk about holistic privacy, I don't mean you have to be absolutely anonymous, but I do mean something close to end-to-end, insofar as the property is upheld on all levels of the protocol, because otherwise an adversary could easily violate your privacy by kind of looking under the hood.
For example, looking at the network traffic, at a blockchain transaction, or at some other level in the system. Anna (00:06:29): Harry, I want to keep going into this, but I was just thinking before we do that, I do want to actually explore your background a little bit more. What led you to working even on this topic? As I mentioned in the first line, you had been working on crypto standards in Web 2.0. Can you share a little bit about your journey towards this topic? Were you always into privacy? Harry (00:06:51): Yeah, I've been into privacy for a very long time, but I always considered it to be, honestly, up until the Snowden revelations, very secondary to whatever else I happened to be doing at the time. So, as a young person, I was engaged in lots of different kinds of activism, or even revolutionary activity, around the anti-globalization movement and around climate change. And we had to deal with a lot of very real police oppression. So for example, due to my climate change activism, I actually had a personal undercover cop assigned to me at some point. So that obviously was a kind of eye-opener. I've always been interested in computing, and like many young people in the late '90s and early 2000s, the main problem that faced the internet was the lack of the ability to essentially get information out there. So I worked a lot on open publishing systems, including one called Indymedia, which was basically there to broadcast news of things like police brutality or protests that were otherwise being censored by the mainstream media. And at a certain point I decided to get a PhD and I ended up going to Scotland, to the University of Edinburgh, where I did a PhD in artificial intelligence, not in cryptography, under a fellow called Andy Clark, who is also a philosopher of artificial intelligence. And that's when I got into machine learning.
During my PhD, I realized that AI — this was around 2010 and 2011 — was going to allow increasingly powerful big data processing and inference. And I said, "Well, look, we need to create systems that are not just allowing this big number crunching and surveillance of actors, but we need to create systems which can basically help secure people". And so that's when, after I graduated, I took a job with Tim Berners-Lee, the inventor of the web, to look into standardizing crypto across browsers. We were doing very basic stuff, compared to what people are doing now in the blockchain space. We were trying to understand how we could even do very standard digital signatures in the browser, because at the time people were programming crazy stuff in Javascript and had all sorts of side-channel attacks. Even though I wasn't a cryptographic expert, I did take some courses under Ron Rivest and friends at MIT. We didn't really have a clear path to allow people to even create cryptographically enhanced web applications. However, two things happened that pushed me into privacy. One is that the Snowden revelations happened. And the Snowden revelations made me realize that effectively my worst nightmares about big data and AI were true, that these kinds of surveillance systems could be used by governments and could be used to target activists. And I don't think activists who are working on things like climate change, or even revolutionaries, should be targeted. The Arab Spring had a large effect on me. I visited North Africa, had friends imprisoned, tortured and killed. And so this threat of automating these systems of oppression really concerned me. And then I got very burnt out on Web 2.0. I was very skeptical of blockchain technology when it first appeared. I didn't believe, honestly, that it was a giant breakthrough.
The cryptographically verifiable log seemed to be just a great concept, but what I misunderstood, and it took me a few years to understand, is that the tech monopolies had essentially created a disaster. And this came home to me at my job at MIT, when they asked my organization to work on digital rights management. So this is basically taking key material from users and hiding it from them on purpose, in order to make them pay for streaming media services. And so I decided this wasn't really going anywhere. And also I started looking very closely at the technology behind mass surveillance, and I felt we needed more powerful anonymity tools. I wasn't particularly interested in zero knowledge proofs, I was looking more at things like Tor and Signal, and it occurred to me that we could build a mixnet. So myself and some of my academic friends got together: Aggelos Kiayias, who eventually helped launch a lot of Cardano, George Danezis, who eventually worked at Libra, Claudia Diaz, who you've interviewed and who now works with me at Nym. We all got together and we basically said, "Let's try to build a real-world mixnet, this could be our contribution to fighting mass surveillance, and not just on the cryptographic level, but on the big data surveillance level". And that's what we did. At a certain point, we hit, I would say, a dead end, because ultimately research only takes you so far. At some point you really have to implement. And we were essentially an anti-mass surveillance project looking for deployment, but we didn't have a sustainable economic model. And it felt like government research grants, ours was funded by the European Commission, didn't seem like the way forward. And so, strangely enough, I ran into people from Binance and other places, and they were really into solving these same kinds of problems. And so I was pleasantly surprised.
We had a lot of support from diverse characters, everyone from Adam Back to Vlad Zamfir to Amir Taaki, a lot of people were very encouraging. And so we decided to launch a company, and that's how I got here today. And I really do believe that at this point, even though I was very skeptical, and I remember even saying to Vitalik, "I don't feel like Ethereum can solve all the world's problems", when essentially, to me, it looked like the equivalent of Python scripts and Solidity, I do believe that this tremendous wave of creativity and innovation that we've seen, particularly in the zero knowledge space, around zero knowledge proofs, is something that can basically save us from mass surveillance and create new social and economic models. And so that very much excites me. And that's what gets me going everywhere. Anna (00:13:57): What year were you talking about, building this mixnet? What year are we thinking? Harry (00:14:02): Oh, I mean, mixnets had been in the back of our heads for a while. There had been running mixnets, people don't really remember them, called Mixminion, and before that Mixmaster, that were used by the cypherpunks, who were the early cryptographic revolutionaries, to basically escape censorship and to do anonymous email posting. So for example, there's pretty decent evidence that Nakamoto probably knew about these technologies. The cypherpunks mailing list was helped by Len Sassaman, who was actually a PhD student, I think, with Claudia Diaz, and who worked on one of the first real-world mixnets, Mixmaster, back in 2003. So these are some really old technologies. And we had friends running email servers for journalists, where they really needed to disguise their identity from the mafia and from the police. And there was old software called Mixminion that actually came out before Tor, again, in 2003, and people had been running that software for 10 years without an upgrade.
And so we knew about mixnets, but we didn't think really seriously about them, because we weren't aware of the threat models. So Snowden was the breakthrough, in 2013. And that's when I started talking to George Danezis and other people about trying to make mixnets more practical and more generic. And from that line of conversation started our R&D work in 2014. And now, I think, Nym kicked off towards the end of 2018. So it's been going on a very long time. I mean, what can we say? We feel very lucky, because it's just pure historical accident that our work on mixnets and Snowden's mass surveillance revelations all occurred right before this huge boom around Bitcoin and cryptocurrency, which has led to this whole other wave of energy around zero knowledge proofs. And so we feel very lucky to be alive at this particular juncture. Anna (00:16:11): I want to hear, you mentioned something just before, this idea that you didn't get it right away, or you didn't fully understand the power of a Bitcoin or cryptocurrency system. What changed? Harry (00:16:21): I remember when the Bitcoin whitepaper first came out, I was good friends with this fellow called Ben Laurie, who currently works at Google, running secure enclave projects, but was an advisor to WikiLeaks and generally an all-around cool cypherpunk, inventor of OpenSSL, that little-lock-in-your-browser-window library. And when the whitepaper first came out, I think Ben gave me the worst financial advice I've ever received. I said, "Wow! You think this Bitcoin thing is for real?" And he was like, "Probably not. The mining thing seems sort of wasteful. And we know proof of work doesn't work, I wrote a paper about it a few years ago". So I didn't really take it too seriously, there were so many weird things being thrown around the internet at the time that it just seemed to be yet another weird concept with some cool code.
But then what started me taking it seriously is that I went to some conferences in London and I saw all of this energy from young people. And while many of us were also very concerned around privacy, we saw not just the work of Zcash, but the earlier work on, for example, CoinJoin and stealth addresses. And while it all seemed a little bit shaky, let's say, it seemed like there was a whole lot of energy put into dealing with the privacy problems. And I think Zcash was one kind of breakthrough, insofar as it was, I think, the first serious academic work, Zerocoin and then Zerocash after that, to apply zero knowledge proofs to blockchain-based systems. So the first of the two criticisms I had, that these economic incentives would not work, was ameliorated by the fact that indeed it was drawing people to these systems. And the second criticism I had, which is that there was no privacy, was answered by, I would say, even earlier work than Zerocash, like the Liquid sidechain's confidential transactions, essentially a form of homomorphic encryption, which showed that there were ways to hack privacy on top of Bitcoin, and then by whole new cryptocurrencies. And I was still, honestly, a bit skeptical, but the rate of progress of the community convinced me that it was the right place to be. And I think we've seen that play out, particularly in the zero knowledge proof space, but also in other spaces: scalability, incentives, governance. And yeah, so that's what convinced me. It took a number of years, to be honest. Anna (00:18:50): Yeah, me too. I was also late to the party. I've known about this; as a tech founder, I was running around to lots of conferences. There was always a Fintech or blockchain track, and I knew about it, but I definitely had not... I didn't jump in until 2017. Harry (00:19:06): I was very skeptical of founding companies, to be honest.
At the World Wide Web Consortium at MIT, we considered ourselves the Knights of the Round Table, trying to work for the common good by putting cryptography in browsers. We were very skeptical of startups in the blockchain space, because we had seen so many failed and scammy startups in the Web 2.0 space. And we felt that it was unlikely any of these Bitcoin or blockchain-based startups would ever succeed. And then what I remember really changed my mind was Blockstream, which at that point had raised, I think, a fairly small amount, 20 million, and I said, "Wow! That's Adam Back! I know Adam Back, I've read his papers, he's a great guy! And he's put together what looks like a good team. And they look like they're serious. I think they might actually be able to get some real work done!" And then I was also excited by the Brave token sale and then the Zcash launch, and all of a sudden we started seeing, I think, more and more serious teams. And from the perspective of privacy tech, we couldn't put our software into real code, because we just didn't have the funds and we couldn't attract the right people. And so we said, "Well, look, if these other teams can do this for, for example, zero knowledge proofs, like Electric Coin Co, which I think is now bootstrapped, we can do this too, for mixnets!" That's really what led to Nym. Anna (00:20:39): This brings up something interesting though, this idea of startups in an activism space. And I think you pointed this out, blockchain technology, it's coming from such activist roots. And yet there are companies that were founded around this. So how do you think about that connection point, startups and activism? Can they actually co-exist? Harry (00:21:00): Yeah, I've changed my mind. I used to think they were completely antithetical to each other. I thought that startups were essentially get-rich-quick scams, and when I looked at things, they did not seem very revolutionary.
Uber is interesting, honestly I'm glad it exists, it lets me get around easily, but I felt it was just ripping off its drivers. And I didn't feel like it was even very complicated, and as a researcher, I was very interested in complex hard problems. And I didn't feel there was too much going on under the hood. I remember visiting Moxie [Marlinspike] at Twitter, and Twitter, honestly, at that point, wasn't very complicated. There wasn't too much going on from an R&D perspective, and all the action was at Google. And it was clear that actually it was very asymmetric: in order to solve all of these hard problems, you need big data, and only a few companies had that data. So I felt it was also pointless to do a startup, because your exit was just to be bought by a Silicon Valley tech company. And you couldn't even really be successful, because they had an asymmetric advantage over you due to their monopoly on big data. So I felt that, as activists, we should pursue a more nonprofit route. And we were really into the concept that everyone should run their own server: people should run their own email servers even, run their own, at that point, Jabber servers, run Linux, and use what people were mostly using at that point, OTR and Jabber and PGP. And then what happened, what really changed my mind, was actually getting away from European and American circles and going to places like the Middle East, where it was clear that no one could use any of these tools, and furthermore, the tools that people were relying on, Skype in Syria, Gmail, were actually often pretty good, if they could escape the censorship, or if there were no backdoors in them targeting these particular activists. And so what became clear to me was that usability had to be a big focus, and all of the tools that we had built in the last 10 - 20 years as cypherpunks, let's say, and activists, were not only unusable, but exceedingly unusable.
And this lack of usability was making these tools inaccessible to the people that actually needed them. In order to make usable tools, or to make tools that scale, you do need funds and you need talented people. And the problem is that venture capital, up until the advent of blockchain, was not interested in privacy. I remember giving a pitch at the Googleplex in New York on privacy. We couldn't get any interest, pre-cryptocurrency, this was in 2016, from any VC, despite winning some MIT pitch contests. In 2017 I gave a similar pitch, if I remember correctly, in California, in Silicon Valley: very little interest. It was cryptocurrency that changed all that, because we saw a unique historical convergence between capital and cryptography, with the goal of making this stuff economically sustainable and usable, so that we wouldn't be dependent on government grants, which could be perverted. So for example, the US government really wanted to resist censorship in places like Iran or Venezuela, but was not particularly interested in resisting censorship inside the United States, or in activist technologies. And this to me seems to be an ideal combination, that we can make startups that serve not just activists; most of the world is in a situation where they need privacy. Most of the world does not live under anything resembling a fair democratic government. And most fair democratic governments are not actually so in practice, they're slipping more and more towards authoritarianism. And if we take this as a universal principle, these are also huge markets and these are users. And that's why I think privacy needs focus. We need to focus, maybe, I would say, even move a little bit away from abstract research and development, and move towards figuring out who our real users are, what their threat models are, and serving those users.
And I think startups, by nature of being lean, by nature of being financially autonomous, ideally not dependent on outside forces as much as, say, nonprofits or universities, are in a better position to do that kind of work. And that work, I do believe, is the only way we can realistically counter mass surveillance. Anna (00:25:49): Do you believe that you are still working with an activist mindset, or do you feel like that's evolved a little bit? And the reason I ask this is, when you receive funding, I get the sense there are going to be some compromises, even if there are maybe fewer compromises than at a university or an NGO. Do you still see yourself full on in the activist camp? Harry (00:26:10): Well, I had a personal police agent, so I can no longer go to climate change demonstrations. I consider this just another way of acting. I almost would prefer to say revolutionary than activist, because I think what we're trying to aim for is systematic social change. The problems that are facing us: mass surveillance is a symptom of larger problems of mass inequality, social instability, lack of economic opportunity that affect most of the world. That's what leads to things like the Taliban in Afghanistan taking over, or US imperialism in the Middle East. And we want people to be democratically governed, autonomous and technologically literate, basically. And that, we think, is a fundamental shift. And I don't think the compromises, I mean, there are always compromises, but I don't think the compromises we've had to take, in terms of venture capital, are worse than the compromises that one would have to take at a university or a nonprofit. I actually think they're on some level healthy, because what venture capital is interested in is growth. We built early versions of the mixnet, which came out as the Katzenpost code base, under the Panoramix project for the European Commission.
There was no real desire by the programmers to see growth; a lot of the researchers were more interested in publishing papers than in seeing users use the system. But if you talk to funders like Polychain, who are our primary backer right now, their main concern is user growth. They want to see people running nodes, they want to see people actually using the system, and that's what excites them. And that's also, I think, a nice corrective to the, I would say, more insular activist nonprofit world and the academic paper-publishing world. And I hope, honestly, I hope more people who are currently in the academic and nonprofit space try to do startups, because we can create the best systems in the world, but if no one's using them... I do feel like it's not pointless, it lays the foundation for future generations to build on, possibly use stuff, even if no one uses your particular technique. So we support all sorts of R&D. But that being said, there's no better R&D than R&D that's actually used by someone, and that can, for example, save a life or help reshape our economic system. And I honestly think the startup venture capital model is currently the only way to do that. Anna (00:29:02): Years ago, when I was studying, I did a course on the dynamics of power, I think it was. And I remember writing this very short paper about the reformer versus the revolutionary, and the relationship between those two things. So the revolutionary is maybe more the activist, the protester, the one pushing the new ideas. And then there are the reformers, who are in the system. And I wonder what your take is on that. Do you feel like there needs to be a strong connection between those two? Or do you feel like it should actually break apart? Harry (00:29:35): I think that the real breaking apart will happen naturally within any given system in decline, if there's no change.
Whether that change comes more from the inside or the outside is hard to predict, and may not even matter in the long run. So I think our goal is actually to prevent a breaking apart that harms society and individuals, and the traditional pre-internet divide between protesters in the streets versus, let's say, reformers in the government or in the bank is falling apart. A lot of people who consider themselves maybe more reformist are actually quite revolutionary in what they're doing, but I do appreciate seeing this diverse amount of people looking at different ways to change society. And again, our job is to provide tools; people from these different groups can use them. And the great thing about privacy technology is that the more kinds of different people use a system, the larger your real privacy is. If the only users of Signal were dreadlocked anarchists like Moxie, or if the only people that used Tor were Middle East activists under attack, this would all be terrible, because you have to hide in a crowd. But if Tor is being used by governments, if it's being used by activists from the Middle East, if it's being used for Ethereum and Bitcoin transactions, that makes Tor more powerful, because everyone's transactions are going to hide everyone else's transactions, and privacy becomes a common or public good. And I think we can see that kind of model also with Nym, where we're hoping to get as many diverse kinds of user bases together. And then what people do politically with this technology, by design, we don't know, but I do bet that it will be for the greater good. Anna (00:31:34): And this is where, you've mentioned this a couple of times, this idea of usability, this is where this comes in. As far as I understand it, the way you've said it, the reason these tools haven't been used is because they're unusable in a way. They're just not usable by most people.
By making it usable, then at least you offer it as a real choice to a majority of people. Harry (00:31:56): Yeah. So I think it's good to look at, for example, PGP as an example of an unusable technology. I don't know if you've ever tried to use it. Anna (00:32:05): Yes, it's awkward. Harry (00:32:06): Yeah, it's awkward at best. You install these weird plug-ins, there's this key signing ceremony, there's no authentication, so you don't really know who you're sending your key to. People have bizarre behavior on signatures, people lose their keys. Most cryptographers I know don't use PGP. And the technology stack, I have a small paper on this, is essentially unusable and insecure by design. And we noticed this more and more because we actually did a study, published at the Security Standardization Research Conference in 2019, where we basically interviewed developers, we interviewed users, and we interviewed users in different countries, not just the United States and Europe, but also places like Syria and Egypt and China. And what we found was that users actually had pretty well-defined threat models. They often understood what threats they were up against; they weren't really confused. Where the confusion was, it was with the developers, who were focusing on things very different from what their users wanted to be doing, and who were confused, essentially, about the way they believed the user was going to use the tool. And the weirdest thing is that, for many years, people felt that the very concept of a key was the big problem in usability studies. There's this famous paper by Alma Whitten, who I think is currently at Google, called "Why Johnny Can't Encrypt", and the general result in that paper is that no one can ever possibly do key management. But what you discover is that, when you essentially connect money to that key, people can be pretty good about keeping it. Anna (00:34:08): They'll figure it out.
Harry (00:34:09): They'll figure it out, if there's thousands or hundreds of thousands of dollars attached to it, or millions. And I think that's a huge breakthrough. And very non-technical people will write down their seed phrase and try not to lose it. And of course, a lot of people lose it, but I'm sure the usage of cryptocurrencies is much higher than the usage of PGP. And that's, I think, something that privacy technologies have to take to heart. We always thought that people would never use privacy technologies, because privacy technologies involve fundamental trade-offs between, essentially, latency, computation and privacy. It's always more expensive and more difficult to make a privacy-enhanced system than a non-privacy-enhanced one. However, users are increasingly motivated to use these kinds of systems. And so the question is how do we connect the systems we have with the people that need them. And that's where I think startups come into play. Anna (00:35:11): And I think what you're tapping into here is this idea of what incentivizes the use. And I also feel, if you look at the privacy space, historically, it often presents itself as "it will protect you from", and not "it will bring you towards". And the Bitcoin blockchain, anything cryptocurrency-focused, is incentivizing you towards something. It's like, "Join this, because it will give you benefit, it takes you up from where you are now". Whereas with privacy, it's often like, "Protect what you have". I always got that sense. And I do think the combination of the incentives of blockchain with privacy tech would definitely push people more forward to want it, not only to try to regain what has been lost, but rather to go towards something better. Harry (00:35:59): Yeah. I completely believe that incentives are the way to go. I was shocked, when we launched Nym, how many nodes we got, just because they were getting a valueless testnet token, and how global the phenomenon was.
It is true, under the conditions of late capitalism, the one thing that will unite many people across the world... Anna (00:36:21): ...is greed. Harry (00:36:22): ...is greed, is the desire to get rich, but it is a very, I would say, Puritan standpoint to say this is necessarily bad. There's been a tremendous profusion of basic systems administration skills throughout the world, and there are not so many jobs available. So if you want to work at Google and you're from Ghana or Indonesia, it's going to be much harder. You have a lot of things stacked against you: language, credentials, location, visas. And the great thing about cryptocurrencies is they take away a lot of that by being permissionless. People who just have some technical knowledge can start a node and then provide services for other people. And that's completely amazing: you can have situations where the same technology, for example meshnets, which we're seeing projects like Helium incentivize, is useful both for revolutionaries in situations where their internet starts falling apart, and can make people a lot of money in countries where they wouldn't otherwise have opportunity. So I think this is a real breakthrough, and a huge breakthrough in particular for privacy-enhancing technologies, which have had a very hard time getting adopted. But I do believe that now we're seeing something which we have never seen before, which is that, despite their recent mistakes with client-side scanning, if you just look at their advertising, Apple is now advertising itself as a privacy platform. They've even created their own version of Tor, iCloud Private Relay. So Apple, their user base is not activists or revolutionaries or people who are concerned about mass surveillance.
Their user base is somewhat bourgeois, richer consumers, but they believe that these people are interested in privacy and will prefer Apple products over, say, Google products, if Apple has better privacy. And so what we're seeing is the mainstreaming of privacy, and this is good, this is a win for everyone. It's a win for people that want to make money by producing and maintaining privacy-enhanced technologies. And it's a win for people who might get killed if the privacy technology doesn't work. Anna (00:38:45): What is interesting though, in what you described, and this is a little off the privacy tip, is the idea of capitalism and the free market as this engine... I want to go back [to] your early story of being an anti-globalization climate change activist. Were you at that time pro-capitalism? Because I remember that time, or at least those movements, as being quite anti-capitalist as well. So has there been an evolution for you on that front? Harry (00:39:14): Yeah. I'm still anti-capitalist, insofar as I don't believe that capitalism can sustain itself indefinitely. Obviously capitalism is not necessarily pre-ordained to solve the climate crisis, which is of its own creation. You cannot have infinite growth on a finite planet. And maybe space exploration will work out, maybe it won't. I'm quite skeptical, because the one thing we know about outer space is that most things there are dead. But that being said, capitalism is also a tremendous force for growth. And that is inarguable. And so I think that the question is, if you really want fundamental social change, you have to use any tool that's at your disposal. And idealism will not help you when it's a do-or-die moment, or when you're trying to build real software.
And so while I'm skeptical that capitalism is an ideal system, I do think there are lots of parts of capitalism, markets, for example, lots of elements of community banking, even lots of elements of Austrian economics, which are necessary in order to create a new kind of society. And I don't think that authoritarian centralized societies, even if they consider themselves anti-capitalist or socialist, are the future; in fact, I can guarantee they're not. What I think is more likely is that we're going to see a fusion between different kinds of more network-based and decentralized forms of governance and sustainability in society. And this is a shift which at this point I believe is almost inevitable, but there will be a lot of resistance against it from centralized, powerful actors, of which many are quite deeply embedded in global capital. And our job should be to level that playing field, and level it through technology. Anna (00:41:32): Also a little tangent here, but I've more and more come to think of climate change action, of actually fixing these things, through that lens of incentives: how do you direct all of those brains at this problem? And I think scaring people is one way, but I don't know if it's working very well. And I think using the carrot, in any way possible, to incentivize people towards dealing with that problem and coming up with solutions is better. So, yeah, I don't know. I actually used to be more on the side of "all markets are sketchy and dangerous, and money is bad and it's going to bring everything down". And I would say there's been an evolution over my lifetime. Harry (00:42:15): Well, I would say collective intelligence is one way to think about it. And a lot of these effects are primarily social. So there's a wonderful study by Alex Pentland, where they were looking at carbon emissions.
And they said, "Well, if you basically give people a carbon budget and say 'you have to stick to that', people tend not to stick to it". But if you give people access to view their neighbors' carbon budgets, then they actually get more competitive about emitting less carbon than their neighbors. And that was a pretty astounding finding. And so I think, for me, it's about using every tool in the toolbox, including market-based tools, in order to solve these huge, crisis-level, species-level problems that are facing the planet. And on some level I consider the fight for privacy against mass surveillance to be a rearguard battle. It's a battle of self-defense, because if you don't have privacy, then it's going to be very hard to do anything creative. It's a bit like trading: if you ever tried to do a trade and you have no privacy, it's pretty easy to get front-run. If you are a social movement and you want to have an interesting discussion about changing society, and you're under total surveillance, what if that's flagged as extremism? What if you're prevented from having that conversation? This, I think, would lead society to a place of stasis, downhill stagnation, and eventual crisis. In fact, to be a healthy society, there have to be these private spaces for discussion and growth and the exercise of human freedom and creativity. And so privacy-enhancing technologies just allow us to build that space in a way that maps to our natural intuitions. When I'm having a conversation with someone in a room, I don't assume someone 1000 km away is listening, I assume it's between you and me. And there's a whole set of cultural and social and evolutionary assumptions that come into that. But on the internet we no longer have those assumptions: any conversation we're having could be monitored and could be weaponized against us.
And while it seems paranoid, we are seeing that happen in countries that are falling apart, and in countries that want to stifle internal dissent. And you need to have internal dissent and creativity in order for society to regenerate itself. And if anything, we need more and more of it, and we need a truly immense collection of brains, as you put it, a collective intelligence, to deal with these really large problems like climate change. So I personally consider climate change to be larger than what I know, or am maybe even vaguely qualified to deal with. But I do think that privacy-enhancing technologies can play a role in preventing the people that want to tackle this problem from being personally spied on and blacklisted and arrested, as I was when I didn't have access to these technologies 10 years ago. Anna (00:45:36): Yeah. A few months ago you actually gave a talk at an event that we did with the Zero Knowledge Validator, focused on Cosmos. In that presentation, and I'll link to it in the show notes, you talked about a scale of privacy, and I want to dig back into that. Maybe we can explain it on the show. And I also want to understand where Nym and the work that you're doing fits into that. We talked about this earlier in the episode, that it targets a very specific part of the privacy spectrum, or privacy stack, let's say. So, yeah, let's first talk about the spectrum. What are the edges of this? Harry (00:46:11): Yeah, so I think on some level you can imagine that privacy means to be absolutely anonymous, so that it is undetectable and unobservable that one even exists, much less communicates, and that's a form of absolute anonymity.
And then, on the other side of the spectrum, you could imagine total identifiability, where every aspect of your very being is recorded and collected, as a collection of predicates or a collection of statistical approximations, and there is no way to communicate without being absolutely known to everyone. And in between, there's this vast spectrum of what I call privacy, which is a form of selective disclosure, where you say, "I would like to be known, but in this particular way, and for these particular purposes". And this is a very important concept. This allows you to reveal different sides of yourself to different people, and prevents the abuse of your data by revealing too much to those who don't really need to know. And in between, you can imagine all sorts of gradations. For example, to be pseudonymous means that you don't need to know my real name, but you have an identifier, which, for example, can reappear in a chat or could reappear on Twitter. Anna (00:47:48): So you could see the behavior of this identity, but not necessarily know who it's attached to. Harry (00:47:54): Yes. And the pseudonym can achieve its own reputation, can have its own personality, Satoshi Nakamoto being one example of a great pseudonym. But being pseudonymous is not being anonymous; it's simply a kind of selective disclosure. And so the way to think about this was originally theorized by an absolutely wonderful privacy scientist called Andreas Pfitzmann, who, in a paper called "Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management", produced what he considered a consolidated proposal for terminology. And he essentially based all this on a concept called unlinkability. I think unlinkability is very key to understanding the multi-sided nature of privacy. So unlinkability, to quote Pfitzmann, means that within this system, there are many possible items.
From the perspective of an outsider, an attacker, these items of interest that the attacker is looking at are no more and no less related after the observation than they are related concerning the attacker's a priori knowledge. And that's important, because it basically says that I can't link these two items. I can't link, for example, an email that I sent with who sent it; that's sender unlinkability. Likewise, I might want to send an email, I want to be a whistleblower, I want to send an email to all sorts of people, but I don't want anyone to know which journalists received it, as I might endanger them. That's another kind of unlinkability: receiver unlinkability. And these are different concepts from things like unobservability, where the states of items of interest are indistinguishable from any others of the same type. So this is, for example, what we do in Nym, where we add fake traffic, so it's not clear what's your real message and what's your fake message. And then there's an even harder question, which is often addressed by obfuscation technologies: undetectability. When you use Nym or Tor, it's usually pretty detectable that you're using it; your traffic has a particular signature. And so undetectability means that it's impossible to even tell if someone's using a given system at all. So I think this is a good way to think about it: for any given system, there's a range between being identified and being private, and the way you handle that selective disclosure and maintain privacy is that there are parts of your activity, or your behavior, or even yourself, that you want to unlink, to de-link, from other parts. Anna (00:50:43): And with Nym, can you observe Nym action, but not link it? Or can you not see it? Harry (00:50:51): So what Nym is trying to do, this comes back to this notion of holistic privacy, where privacy is not reducible to any single layer of abstraction.
And so a leak at any layer, for example, at the blockchain layer, the application layer, or the network layer, can eliminate the privacy properties of the system as a whole. And we've seen this even with Zcash. So if you just look at the chain, we have a copy of the Zcash chain, the shielded pool looks great. But if you're observing transactions being sent to wallets, as we saw in the side-channel attack by Tramèr and Paterson, and I think even Dan Boneh was involved, you can use these network-level timing characteristics to de-anonymize, to link in other words, particular anonymous transactions. And so they're no longer anonymous. Anna (00:51:44): With the unshielded accounts, I guess. Harry (00:51:47): So that's essentially what a side-channel attack is. You can see this with other kinds of systems that have nothing to do with Zcash, like SGX, for example, which has many, many different side-channel attacks. And what Nym does is say, "We don't care or even know what blockchain you're using. We're just going to help you defend the network-level transactions. We're going to help you make sure that those side-channels, those kinds of attacks that depend on the timing and the volume of packets sent through the peer-to-peer broadcast, or the communication of your light Zcash wallet with your full node, can't be used to de-anonymize you by a powerful attacker". And these attackers don't need to be all that powerful; there are not that many Zcash full nodes, I think there are [about] 400 or 500. And we already know that, way back in 2015, Chainalysis was doing these kinds of attacks on the Bitcoin network, so we would assume that they were possible on the Zcash network. And so Nym basically says, "We're not gonna try to replace Zcash, just as we're not gonna try to replace Bitcoin or replace Lightning or replace any other blockchain. But we're going to provide network-level privacy for those chains".
And we hope that they provide some on-chain privacy as well, because it's equally pointless the other way around. If you're using Tor to communicate, or Nym, you may think, "Oh, I'm hiding my network-level transactions, no one may know my IP address, no one may know when I sent that transaction". But if the blockchain is recording the sender and the receiver in a transparent way, the transactions, of course, are easily identifiable, no matter what you do on the network level. Anna (00:53:33): In that side-channel attack you just mentioned, the Zcash one, if the users had also been using Nym at the same time, would that have prevented it? Harry (00:53:41): I believe so. I had emailed Kenny [Paterson] about this, and it's in the paper that, yes, the timing information... One thing that Nym does, which Tor doesn't do, because Tor is built for web browsing and not for cryptocurrency transactions, so Tor has to be high-speed, while Nym can be a bit slower, is that Nym changes the order of the packets and, therefore, obfuscates the time the packets are sent. So attacks on Zcash, or any other blockchain, that really rely on timing, we would help ameliorate. But there are other ways to fix it than using Nym. I think that's what the Zcash team did: they basically prevented wallets from behaving a certain way, which also, for that particular attack, eliminated the problem. Now we have future research we're doing with EPFL and Carmela Troncoso, who is mostly well-known for doing all the privacy-preserving COVID tracing work, but also happens to work on mixnets and machine learning, where we're trying to understand how widespread and how bad these kinds of network-level side-channel attacks are on cryptocurrencies like Zcash, because the timing and volume, not just the IP address, but the timing and volume of the packets themselves, we think, is a pretty powerful signal to de-anonymize transactions.
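The packet reordering Harry describes, holding each packet for an independent random delay so that output order no longer reveals input order, can be sketched in a few lines of Python. This is an illustrative sketch only, not Nym's implementation; the exponential delay and the `mix_packets` helper are invented for the example.

```python
import heapq
import random

def mix_packets(arrivals, mean_delay, seed=0):
    """Hold each packet for an independent exponential delay, then
    emit packets in order of departure time rather than arrival order,
    which decouples when a packet leaves from when it arrived."""
    rng = random.Random(seed)
    departures = []
    for arrival_time, packet in arrivals:
        delay = rng.expovariate(1.0 / mean_delay)
        heapq.heappush(departures, (arrival_time + delay, packet))
    # Popping the heap yields packets sorted by departure time.
    return [heapq.heappop(departures) for _ in range(len(departures))]

# Packets arriving in a known order; after mixing, an observer who sees
# only departure times can no longer line them up with arrivals.
arrivals = [(0.0, "a"), (0.1, "b"), (0.2, "c"), (0.3, "d")]
mixed = mix_packets(arrivals, mean_delay=1.0)
```

With a mean delay comparable to the gap between arrivals, the departure order is frequently a permutation of the arrival order, which is exactly the timing signal such side-channel attacks rely on.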
And there's been excellent work by the folks at the University of Luxembourg on this issue, but we're gonna keep going with it, particularly by seeing [if] a mixnet can help solve these problems. Anna (00:55:15): Sounds good. So we are a little bit over time, but one of the things I definitely wanted to get to in this episode was the update on Nym since I last had someone from your team on the show. What's so cool is, I think we've found, through this interview, ways in which it can be used and what it's doing, but where is it at as a project, actually? Harry (00:55:36): Yeah. So where Nym is at is that we are still in what's called testnet mode. And testnet mode effectively means that we do not have all the full functionality of the Nym whitepaper implemented, but we do hope to have all that full functionality implemented, I think possibly by the end of the year. As for the parts that Nym has implemented: we've implemented the Sphinx packet format, which disguises network traffic by making every packet the same size; we have mixnodes, which, as I said earlier, mix the packets, which obscures timing information, and that could be very useful for zero knowledge proof-based blockchains. And we also have validators. These decentralized validators maintain the state of the nodes, the mixnodes, that are in and out of the system. They maintain the nodes' reputation, and this reputation is currently tracked with a testnet token, so that if you fail to mix, if you go offline, or you just don't have a high enough bandwidth connection, we don't increase your reputation, while if you do your job, your reputation increases. And that's where we're at, we're in the testnet. And we expect to, basically over the next few months, add more decentralized measurement of quality of service, to make sure that the mixnodes are mixing properly and the validators are online.
We're working on anonymous credentials, because some people that use our network... If you were, for example, using our network with Ethereum and you were moving ETH around, the very movement of ETH would probably de-anonymize you. So we built a lightweight anonymous credentials system, which uses a little bit of zero knowledge proofs, though not full zk-SNARKs, in order to help systems which are naturally transparent to work with us. And I think the real problem we have, where we would like to reach out to folks who are working on zero knowledge proof-based blockchains, is that blockchains, in terms of privacy, are a very odd paradigm, which is why things like zero knowledge proofs are so important. Traditionally in privacy, the government, or Chainalysis, or a cybercriminal is watching your traffic, making a record of it, and using that to identify your transactions. And what happens in blockchain systems is that we make that record ourselves, because it's useful. Anna (00:58:18): It's super public for everybody. Harry (00:58:20): For everybody! And so we feel, at Nym, quite comfortable with network-level privacy. That's what we've been working on for a number of years, that's what we understand. And we'd like to work closer with other teams and other projects to really make sure that we can offer our users something pretty simple, where users can do privacy-enhanced transactions out of the box. And that's, to go back to usability, why even before mainnet, we would like usable wallets that really... Just like you see what Signal [does] today: they give people the ability to do private transactions without jumping through too many hoops or thinking too much about privacy. And they can rest assured that their transaction is private. And we're seeing all this development on very different blockchains, Zcash moving towards Halo 2, maybe moving towards proof of stake.
Monero has [faced] pretty brutal attacks, but the community is giant and has long expressed interest in network privacy through other technologies like I2P. We're seeing, I don't know, I mean, there are these private smart contract platforms coming. There's a lot of stuff coming, and we would just like to integrate against it. And we would like chains that don't have privacy, for example Polkadot, to add these technologies, or, to be honest, to see better privacy-enhancing tech built on top of Bitcoin. Because we can build the most perfect network-level privacy system for cryptocurrency ever, or for any message-based traffic, and it is pointless, unless the blockchain itself has usable wallets that have privacy on the chain level. Anna (01:00:10): I'm curious, just to understand what Nym is right now a little better: is it a standalone blockchain? Or is it built on something? Is it going to have its own, from-the-ground-up validator set? It sounds like it's proof of stake, since you mentioned validators. Is it proof of stake? Harry (01:00:28): So Nym is complex insofar as it has two kinds of components, which are effectively proof of work components. But the kind of work we're doing is not the classic work that everyone's concerned about or mostly knowledgeable about, which is... Anna (01:00:44): It's not the grinding through computation and burning energy. Harry (01:00:51): Yeah, we're not grinding through hash puzzles. What we're doing is we mix packets, and mixing packets is work. It takes a lot of computation, not as much as grinding through hash puzzles, but it's substantial. And so we try to reward people based on successfully accomplishing that work. And that's the mixnet itself. And that's what actually provisions the privacy. Now, in terms of a mixnet, we're not a pure peer-to-peer network, because there are lots of attacks on peer-to-peer networks. Ania Piotrowska, from Nym, has some good work on this.
Claudia Diaz, who was on the Zero Knowledge Podcast earlier, did truly great work on this 10 years ago, showing that, in order to provide strong privacy guarantees, a pure peer-to-peer network is not suitable. But the problem is you have to somehow be able to discover the network and route packets through its topology. It's like this: I send packets to one mixnode, they get mixed, I send packets to another mixnode, they get mixed some more, I send packets to a third mixnode, they get mixed a bit more, then they get shipped out. And how do I make sure that every packet goes through at least 3 mixnodes, and the packets aren't being routed, for example, in circles or outside the network before they're properly mixed? It's kind of like every packet must be treated equally. In order to do this, historically, the answer is a directory authority, and this is simply a group of computers: you call them and they give you a map of the network. That's what Tor has. We've also decentralized that directory authority, and that directory authority is currently built, I think, on top of Tendermint. It was built originally on top of Liquid, but we needed the ability to do the reward sharing in a more complex fashion than Liquid allowed. So we moved to the Cosmos and Tendermint ecosystem for that. In particular, we're running CosmWasm, in Rust, for, essentially, the reward sharing. That being said, we're not a pure proof of stake system, because these validators are rewarded not just by being online and having a certain amount of stake. They also, when someone uses the network, have to create what's called "a little anonymous credential" for the user. And for that "little anonymous credential", Tor has a centralized version of this, which they use with Cloudflare; we have a decentralized version, based on a system called Coconut credentials.
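The directory-plus-topology idea described above, every packet passing through exactly one mixnode in each of three layers, can be sketched as follows. The directory contents and names here are hypothetical, not Nym's actual directory format.

```python
import random

# Hypothetical network map, as a client might fetch it from the
# (decentralized) directory authority: mixnodes grouped into 3 layers.
DIRECTORY = {
    1: ["mix1a", "mix1b", "mix1c"],
    2: ["mix2a", "mix2b", "mix2c"],
    3: ["mix3a", "mix3b", "mix3c"],
}

def build_route(directory, rng):
    """Pick one mixnode per layer, in layer order, so every packet
    traverses exactly 3 mixnodes and can't loop or skip a layer."""
    return [rng.choice(directory[layer]) for layer in sorted(directory)]

route = build_route(DIRECTORY, random.Random(7))
```

Because the route is drawn layer by layer, packets can never be routed in circles: each hop strictly advances to the next layer.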
Anonymous credentials, you can consider them like the simplest possible zero knowledge proof scheme, which is just an array of values. And one of those values is "Yes, I am authorized to use this network". And we basically measure validator performance by how many of these credentials they're minting, which also is some cryptographic work. So we're a bit of an odd system, I think, insofar as we don't want our blockchain to be used for financial transactions. There are enough blockchains out there that want to be used for financial transactions; I personally feel like there are too many. I'd actually like to see more focused work on improving Bitcoin, even improving Zcash, improving Monero, improving any of these systems, rather than building yet another system. Although there are huge bits of functionality, like private smart contracts, which are just missing, and which I'd love to see someone do. And so we use our blockchain not as a general-purpose blockchain, like many other chains. We use it for a very singular purpose, which is just to tell users how to get on board and find those mixnodes. And so it's essentially a list of IP addresses and keys, what you need to create these mixed packets, and the reputations of those mixnodes. Anna (01:04:42): But is it going to be a zone then? It's built using Tendermint, are you using the Cosmos SDK to build it? Harry (01:04:48): We are using parts of the Cosmos SDK, but we have not yet even thought about being a zone. That's something I'd be very interested in. However, to be honest, we want to deal with one problem at a time. So we currently just want to get our validators fully permissionless, fully decentralized, and working just to maintain those mixnodes. I don't know if we really want or need different assets coming in and out of our ecosystem at this point. The way we would prefer to interact with other ecosystems is to do what we want to do, which is provide privacy on the network level.
So we're hoping that other ecosystems can interact with us, and they wouldn't use our chain to replace their chain, or even to transfer assets around different chains. That's kind of the Zcash strategy, where Zcash is hoping that user-defined assets from other chains can come in. And I think that'd be awesome, but that's not what our strategy is. Our strategy is that you would be using Zcash, but you'd also want network-level privacy, and you would be using Zcash with us as a network-level transport layer for your Zcash transactions. And there are many different places that could happen: it could happen in the wallet, it could be between a light wallet and full nodes, it could happen between the full nodes. We think we could also be useful for other ecosystems in that way. For example, Liquid is the main other ecosystem we've been working with. Liquid is used mostly for Bitcoin trading and exchanges, and for a very large volume, so it's very sensitive data. We could help defend the network-level traffic of exchanges and traders and all of that, as well. Anna (01:06:41): I'm assuming... I did read about the raise that you just did. Is there a Nym token? And what does that do in this case? Harry (01:06:47): Yeah, so we have a token right now. We've actually had 3 for the testnet. And we probably will have, at some point, a final Nym token. And that token does something; again, it's not a general-purpose virtual currency, let's say. It's just to basically figure out the reputation of your node in the system, which for mixnodes is how much mixing you've done successfully. And we do allow delegated staking. So if people believe you're going to be successful in mixing, let's say you're the EFF and people really like you, they will delegate to your node. But we don't see that as a general-purpose financial instrument. And that's very different from, let's say, Zcash, where what Zcash is trying to be is private money.
Or what Cosmos is trying to be, with staking on the Hub with ATOM and other people running all these different chains that are doing all sorts of different things. And on that level, we're in line with the Cosmos vision, because we're just doing this very small thing that we want to do well, which is maintain network-level privacy. But our token essentially is a reputation token. And it can be considered a kind of staking token, although staking might not be the right word, because we're a proof of work system and not a proof of stake system, so these are a little bit different. You could also consider it a deposit or a bond, but primarily deposits or bonds are supposed to eventually converge on your actual reputation for mixing, for providing privacy. And the real question we have, something Chris Burniske, a VC, asked, is: how much is a privacy-enhanced byte worth? Would, for example, a Zcash user pay a small privacy transaction fee for greater network-level privacy? And could that transaction fee then be used to reward mixnodes, converted over to Nym and given to mixnodes? We don't know, but that's why, over the next few months, we hope to be working closer with wallets and doing usability studies. Now, what we're seeing, to be honest, makes us think that's possible. People pay lots for transactions. I'm shocked at how much people pay for Ethereum transaction fees, for example. And those transaction fees, it's just you paying to speed up your traffic. Now with Nym, it's a bit more complicated, as we're going to slow your traffic down a little bit. So maybe we're not particularly good for high-frequency trading. But, at the same time, there are users that would want a bit of privacy, and maybe those users would pay a bit more. We haven't totally figured out the economics, but we are currently working out some of those details. It will all be public by the end of the year.
And we're submitting them to various wonderful conferences like Financial Cryptography. Anna (01:09:35): Cool. But is there a value attached to it? This is the part I still don't fully get. Is the Nym token going to be valuable in a trading sense? Would people be able to move it somewhere to sell it? Or is it really just this internal utility thing? Harry (01:09:50): Well, currently it's an internal utility token. Now, the problem with blockchain is it's very hard to prevent people from trading things. We keep telling people, "They're valueless, please don't trade them!", and then we see people trying to trade them already. It's just what people are used to in this space. The Nym token, though, should have a value. So the way to think about how Nym works, from the technical side, and this is all new stuff since, I think, you talked to Claudia, is that at the beginning of every round or epoch, an amount of time, let's say a week, a month, a day, the Nym network runs a bandwidth auction, an auction on privacy-enhanced bandwidth, where various mixnodes say, "I can provide this much", and the users say, "I would like this much". And [the network] then chooses the highest quality mixnodes to be part of the mixnet for that epoch. And if those mixnodes do their job right, they're rewarded in Nym tokens. And if they don't do that job right, they're kicked off. But you still need those Nym tokens to have some value in a monetary sense to let you get into this game. And that's really important, because in an early version of our testnet, which was pre-Cosmos, we just gave tokens to anyone that did their job right. But what we found was truly tremendous: we literally got Sybil-attacked and went from 1,000 nodes to 12,000 nodes in a few hours. And that's what drove us more and more towards using this more complex bonding technique that's enabled by Cosmos.
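The per-epoch bandwidth auction just described, mixnodes offer capacity, users declare demand, and the highest-quality nodes are admitted, could be sketched like this; the scoring, the tuple format, and the greedy cut-off are all invented for illustration and are not Nym's actual mechanism.

```python
def select_mixnodes(offers, demand):
    """Admit the highest-quality offers first, until the users'
    requested bandwidth is covered; remaining nodes sit out the epoch."""
    selected, covered = [], 0
    for node, bandwidth, quality in sorted(offers, key=lambda o: -o[2]):
        if covered >= demand:
            break
        selected.append(node)
        covered += bandwidth
    return selected

# Three hypothetical offers of (name, bandwidth, quality score):
offers = [("n1", 100, 0.9), ("n2", 100, 0.5), ("n3", 100, 0.8)]
chosen = select_mixnodes(offers, demand=150)  # picks "n1" then "n3"
```

Nodes that are selected and mix correctly would then earn that epoch's rewards; the rest wait for the next auction.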
Because essentially we're saying that one really good way to prevent Sybil attacks, I mean, it won't work with governments, but it'll work with a lot of ordinary people, is to say that in order to jump into the network, you need 1000 or 10,000 Swiss francs' worth of Nym tokens. So that naturally makes it harder to do these 10,000-nodes-on-AWS Sybil attacks. Now we have other techniques, which we think will also make it harder for even government-level adversaries. And I'll just mention 3 of them really quickly, cause they're all really cool. So there's 3 kinds of technologies that will help us prevent Sybil attacks that don't rely purely on money. One is verifiable locations, which we already have implemented. This uses a verifiable random function to select a set of nodes that then ping other nodes and record the time of those pings, and, based on the speed of light, uses that to tell what the actual location of that server node is. And that makes it harder for everyone to cluster Sybil attacks on a few servers, and also can basically make sure the network is decentralized, not just in the number of nodes, but across different geographical locations. Anna (01:12:52): Does that matter if they're just using AWS servers from the 3 centers, the 3 main centers in the world? Harry (01:12:57): No, we can choose not to reward those 3 centers. We can say, "Hey, we want some servers in Africa. We want some servers in Tunisia, which is actually a great IXP. We want some servers in Amsterdam". We can disincentivize and incentivize particular locations and servers. And verified locations make it hard to fake that. But a powerful adversary can still produce a lot of servers in different locations. A second technique we use is a technique published by George Danezis, a Nym advisor, called SybilQuorum. And this technique is really cool. 
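The speed-of-light check behind verifiable locations can be illustrated with a small sketch. This is only the core physical bound, under my own simplifying assumptions: real systems have to account for routing detours and processing delays, and the function names here are hypothetical. A round-trip time gives a hard upper bound on how far away a node can physically be, so a claimed location beyond that bound is impossible.

```python
# Hypothetical sketch of the speed-of-light distance bound used by
# verifiable locations: a verifier pings a node, and the round-trip time
# caps how far away the node can physically be. Real deployments must
# also account for routing and processing delays.

SPEED_OF_LIGHT_KM_PER_MS = 299.792  # kilometres light travels per millisecond

def max_distance_km(rtt_ms):
    """Upper bound on verifier-node distance implied by a round-trip time."""
    # The signal travels out and back, so the one-way distance is at most rtt/2.
    return (rtt_ms / 2) * SPEED_OF_LIGHT_KM_PER_MS

def consistent_with_claim(rtt_ms, claimed_distance_km):
    """A claimed location farther than the physical bound must be fake."""
    return claimed_distance_km <= max_distance_km(rtt_ms)

# A 40 ms round trip bounds the node to within roughly 5996 km.
print(round(max_distance_km(40)))        # 5996
print(consistent_with_claim(40, 8000))   # False: claimed location too far away
```

With several randomly selected verifiers pinging from different vantage points, the intersection of these bounds narrows down where the node can actually be, which is what makes clustering fake nodes on a few servers detectable.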
It's based on a previous technique called SybilInfer, which basically uses social network analysis of who's delegating to whom to figure out if there's a Sybil attack going on. And it's highly effective. And we think this technique could also be used by other blockchains. And then the last technique we use, of course, is the very basic formation of stake pools, or as we call them, mix pools, where we actually, interestingly enough, took technology that was inspired by the work of Cardano. Sybil attacks work, to some extent, if you keep the reward the same no matter how many nodes there are, but we say, "Well, if your mixnodes are too big or you have too many mixnodes, we're going to lower your reward". So the bigger you are, not necessarily the more rewards you get, and that's very different from Bitcoin. But also if you're too small, we might not want nodes that are low quality, so we might also lower those rewards. We try to aim for a median size. Anna (01:14:41): Interesting. So it's not quadratic, it's not like small equals more, but it's like middle equals ideal. Harry (01:14:49): Yeah. And then we fine-tune that middle, using that auction mechanism that we described earlier. So these are the 3 kinds of techniques we use, but they're also kind of general purpose. And while they have nothing to do with zero knowledge proofs or privacy, per se, I do think other blockchains should look at these papers. And they're all online. And our Rust code is on our GitHub. And people can check out our code and use that as well. So I'm overall really excited. And, honestly, the way I feel about Nym is that we are just one small part of a larger puzzle. From a revolutionary perspective, you can consider us a finger that is part of a large fist. We're just one finger, but we need to work more closely with privacy-enhancing technologies on the blockchain level. And we've seen all this wonderful progress of zero knowledge proofs. 
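The "middle equals ideal" reward shape can be sketched as a curve that peaks at a target node size and falls off in both directions. The quadratic falloff and every name below are my own illustrative assumptions, not Nym's actual reward function; the point is only that, unlike Bitcoin's strictly bigger-is-better rewards, being far from the target in either direction lowers what you earn.

```python
# Hypothetical sketch of a median-targeting reward curve: rewards peak at
# a target ("middle") node size and decay for nodes that are much larger
# or much smaller. The quadratic shape is illustrative only.

def reward(node_size, target_size, base_reward=100.0):
    """Reward is highest at target_size and decays with relative deviation."""
    deviation = abs(node_size - target_size) / target_size
    # Quadratic falloff, floored at zero so huge deviations earn nothing.
    return base_reward * max(0.0, 1.0 - deviation ** 2)

target = 1000
print(reward(1000, target))  # 100.0: the ideal "middle" size
print(reward(500, target))   # 75.0: too small, reward lowered
print(reward(2000, target))  # 0.0: far too big, reward floored at zero
```

The target itself need not be fixed; as the conversation notes, it can be re-tuned each epoch from the outcome of the bandwidth auction.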
You've seen PLONK, TurboPLONK, Halo 2. And we just want to work more with researchers in that space, so that we can provide network-level privacy and they can provide blockchain-level privacy. And then the last thing we all really need together is privacy-enhanced apps, because it's great to have blockchains, it's great to have wallets, but there's a whole space of web browsers, messaging apps, calendars, video conferencing, teleconferencing, where we need not just the network packets, the TCP/IP and UDP packets, not just the blockchain, but the actual app itself to be privacy preserving. We think that's the next big frontier for all of the people out there. And I really can't wait to see people start building these new kinds of privacy-enhanced apps. Anna (01:16:33): Very cool. I think they're on their way. I feel like there's rumblings about things like this being built, but they do need the underlying infrastructure in place, in a way, to be able to be successful. Harry (01:16:46): But we're still in infrastructure mode, we're like the web in the late '90s. And once the infrastructure is in place, I think we're going to see the big kickoff. So the way we look at ourselves is similar to OpenSSL. OpenSSL just allowed people to encrypt connections to web servers. And then after OpenSSL, you got PayPal, eBay, all sorts of wonderful stuff. And I think that's what's going to happen next with privacy-enhancing technologies. Anna (01:17:16): Very cool. Harry, thank you so much for coming on the show and sharing this journey through privacy and why it's important. We got to explore quite a lot through that. I appreciate it. And also where Nym fits into the stack. Harry (01:17:29): Okay. Thank you so much. And I look forward to talking to you again. It's been a great show and I do listen to these podcasts regularly, so it's definitely an honor to be invited. Anna (01:17:40): Cool. Oh, I'm so happy to hear that. Cool. 
So I want to say thank you to the podcast producer, Andrey, the podcast editor, Henrik. And to our listeners, thanks for listening.