Today's show is sponsored by strongDM. Transitioning your team to work from home? Managing a gazillion SSH keys, database passwords, and Kubernetes certs? Meet strongDM. Manage and audit access to servers, databases, and Kubernetes clusters, no matter where your employees are. With strongDM, easily extend your identity provider to manage infrastructure access. Automate onboarding, offboarding, and moving people within roles. Grant temporary access that automatically expires to on-call teams. Admins get full auditability into anything anyone does: when they connect, what queries they run, what commands are typed. It's full visibility into everything. For SSH, RDP, and Kubernetes, that means video replays. For databases, it's a single unified query log across all database management systems. strongDM is used by companies like Hearst, Peloton, Betterment, Greenhouse, and SoFi to manage access. It's more control and less hassle. strongDM. Manage and audit remote access to infrastructure. Start your free 14-day trial today at strongdm.com/GTC.

JACOB: Hello and welcome to Greater Than Code Episode 180. I'm with my colleague, Carina C. Zona.

CARINA: Hello. And I am here with Coraline Ada Ehmke.

CORALINE: Hey, everybody. We have a great guest today. I'm super excited to introduce Tobie Langel. Tobie is the principal and founder of UnlockOpen, a boutique consulting firm that helps organizations think strategically about open source. His clients include top companies like Google, Microsoft, Intel, Mozilla, and Airtable. Tobie is a voting member of the OpenJS Foundation Cross Project Council and sits on AMP's Advisory Committee and on the Advisory Board of OASIS Open Projects. Previously, Tobie served on Facebook's Open Source and Web Standards team, was Facebook's Advisory Committee representative at the W3C, and was a W3C fellow supported by Facebook. Tobie lives in Geneva, Switzerland. Tobie, welcome to the show.

TOBIE: Thank you for having me. I'm super excited.

CORALINE: Awesome. Our tradition on this show is to start off with the question of what is your superpower and how did you develop it?

TOBIE: I think that my biggest power -- I probably wouldn't call it a superpower -- but my biggest power is that I tend to bridge groups of people, or tribes, if you will. I'm not super sure how I got that, except I've sort of always been somewhat of a misfit and never really liked being identified with a group, sort of like labeled. As a teenager, I was playing a lot of music and also doing rock climbing. And I would tend to wear my rock climbing t-shirts to band practice and my band practice t-shirts to rock climbing, just to not be labeled.

CORALINE: Yeah. Labels can reduce us to preconceived concepts, and I can clearly see not wanting to be lumped in with someone's conception of what a musician is, what a rock climber is, what a consultant is, what a standards body member is. Makes sense.

TOBIE: I would add to that, that that creates a bunch of superpowers. As a result, you're like the best musician among rock climbers. That's a superpower in that group, and so on. Being sort of like an open source person in standards bodies is a superpower. Knowing how to code in non-coding environments is a superpower, et cetera. So it sort of generates all of these scoped superpowers, if you will.

JACOB: Coraline, before we started the call, described you as a co-conspirator of ethical open source. I'm just very fascinated by the concept.
And just as a person who's less familiar with the concept, I would like to know what that means to you and what that means to you on a day-to-day basis.

TOBIE: I think that I've spent most of my life chasing a place where I would fit, and that would sort of fit my own conception of the world and morals and ethics. And I really found that space when I discovered music when I was a teenager. And then I sort of saw the limits of that space and kind of stumbled upon programming and software development, which just clicked as this sort of new world that I could get into. And as I got into this world, right at the very beginning, I sort of stumbled into open source and what that movement meant. And it sort of filled this gap of meaning for me, of actually feeling that what I was doing was contributing to the greater good. And as I grew, and as tech grew around me much faster and at a much bigger scale than what I could have even imagined when I first started being in that field, and open source grew and changed, I started to have trouble finding the same meaning and same set of values that I originally had found exciting. And as a result, I have started trying to find either meaning elsewhere or deeper meaning in it -- a different perspective, a different outlook -- to try to get back to that deeper meaning.

CORALINE: What is the deeper meaning of open source that is attractive to you?

TOBIE: The deeper meaning of open source and software and tech in general, to me, is whether it actually makes society better, whether it improves people's lives. I don't see a lot of meaning in my personal life if I'm not working in a space that is connected to that. Yeah, open source did feel like that for a while -- I would say probably close to a decade -- where I felt like I was contributing to something that was basically moving all of this knowledge, all of these assets that were being built by people, into a place where they could be used by everyone, where they belonged to everyone, and where they were not sort of aggregating wealth and power in the hands of a few. That really, for me, was a convincing argument to be involved with that for a long time. And then the other aspect is, I feel very privileged -- and frankly, I am very privileged, for a number of reasons. And as a result, I sort of feel like I have a moral obligation to be involved in these areas. And so part of it is, yes, something that I feel like I have to do. You can even hear it in my tone. Sometimes it's like, "Ugh!" I wish that I could look at life in a lighter way sometimes and not feel a responsibility around these issues.

CORALINE: Edmund Berkeley, an early computer pioneer, actually worked with Grace Hopper during World War II, before she was an admiral. He was a major force for bringing ethical consideration to computing. And for his entire life, people ignored him. But he talked about a concept of people involved in computing having a greater than average responsibility for the impact of their work on society, because computing can be such a powerful tool for either reinforcing the status quo or battling the status quo. And it sounds like your disillusionment with open source might be derived from that feeling of extra responsibility. Do you think that's fair?

TOBIE: I'm going to try to answer this question by sort of sidetracking into something that I learned from my standards work.
I spent a lot of time working in web standardization, so basically writing the specifications, the documents, the technologies that are used to make web browsers. And there's a really interesting concept in the W3C, which is one of the biggest standardization bodies around those specifications, that is called the Priority of Constituencies. It implies that if you look at all the different constituencies that are touched by -- in that case -- web standards, there are constituencies that are more fragile, or much larger, and that have to be especially accounted for. And this is really one of the core principles, one of the core values that govern the way decisions are made in standards bodies. And so basically, the idea is that you focus first on the end user, the person that's actually using the software. And then, and only then, you can focus on the web developer, the person that's actually building the application. And then once you've actually fulfilled their needs, you can focus on the implementer, the developer that's actually building the rendering engine, whether it's inside of Firefox or Chrome. And once you've fulfilled their requirements, you can actually look at the requirements and the needs of the editors of those specifications. And once you've fulfilled the requirements of the editors of those specifications, then you can actually look at technical purity. So it completely reverses the sort of very engineering-driven mindset of really focusing on technical purity, and puts that as the last thing you want to care about. And what you really want to focus on is people.

I think that having spent a lot of time in standards bodies, and especially at W3C and the related group of bodies around this, [inaudible] and ECMA, that has sort of really bled into the way I see tech in general. And it's only recently that I've been really able to, first of all, notice that not everyone was thinking that way, which was very surprising to me. And secondly, realize that this actually mapped really closely to my worldview. And so, this has sort of become a tool to explain what you were describing, Coraline: this responsibility that we have as practitioners towards the people we impact with our work but who don't have a say in it. And this is where this Priority of Constituencies comes from. The end user has very little say and can do very little to influence the standards bodies or what's in the browser. And so that's why, when there's a conflict, when there are conflicting needs around a specific part of the technology, part of the stack, you want to put them first. And there's also just the question of the impact: maybe for one editor, you have 10 or 100 implementers in the browsers, and maybe 10,000 to 15,000 developers, and billions of end users. So you also want to just respect everyone's time by making sure that you focus on the broader set of people.

CORALINE: And do you think that open source falls down here, that open source prioritizes the implementers over the end users?

TOBIE: Yes, that's absolutely something that, in my mind, is a transition that open source hasn't thought about and hasn't made yet. When you think about the four freedoms that define free software, freedom zero is entirely focused on being able to run software yourself.
And the truth is, the reality of what human beings live today is that they're constantly being impacted and influenced by software that they're not running, software that they might not even be aware is running. If you're, I don't know, picked in the lottery for something by software, you only see the end result. You don't know that you were, at some point, a data point in a computer running software about you. And when you're using software as an end user, most end users today are using software that they're not really running themselves, in the sense that you would have been running software two or three decades ago. If you're using a SaaS online, it is not the same thing as if you were running this same software on your laptop. If you're using WhatsApp to send a message to your family, it is completely different from running a command line tool on your own computer. And open source is still reasoning about this very much as if nothing has changed in three decades, or two decades, or a long time. We've seen an explosion of software. Everything is software. Everywhere. And we haven't changed the way that we think about how software impacts the world. It's very different when software impacts the person actually running the software and when it impacts a slew of people sort of downstream.

I want to go back to the concept of Priority of Constituencies. The Priority of Constituencies includes end users, and it includes people building on top of the web platform, and it includes implementers, people actually building the browsers. And the open source movement only thinks about the implementers. It doesn't think about anyone that's downstream of that. And because it doesn't think about that -- I mean, it does, but it's not integrated in the model or framework it uses to think about stuff -- it sort of has this weird thing where you can think either about technical purity or about the actual person running the software. And those are the only two constituencies that exist in that field. And yeah, I think that in 2020, that's a terrible problem.

JACOB: I'm thinking about someone like my mom and my dad. I think if you talk to my dad, he would definitely have plenty to say about what's wrong with corporations in this country. But telling him that, "Actually, you can run Linux on your laptop and then you don't have to use any Microsoft products at all" -- that's completely meaningless to him. Because maybe he could, according to these freedoms that you're talking about, or that Stallman's talking about. But he doesn't have the ability to do that. He doesn't have the knowledge to do that. I mean, are there a set of freedoms for the rest of us?

TOBIE: Right. It's a super privileged position to have. It's like, if you want freedom, you have to have this extremely deep understanding of how software works and the time to actually run your own stack. This is not the reality. And because we ignore the reality, we basically ignore the reality of like 95% of the people. And that's not fair and that's dangerous. We can and should do better.

One example that I would like to give about this: this attachment to technical purity that we, as engineers, tend to have all the time -- it's sort of something that we fall back into if we don't pay attention -- has real consequences and real impact on people, at levels that are unexpected but still very impactful.
And one example that comes to mind is, from a security perspective, if you look at software, it is very common to think -- or at least it was for a long time; this is starting to change -- it was very common to think about security from a very binary perspective: either someone has access to a device or they don't. And one of the examples that I saw about that when I was at Facebook was, I was very concerned about the fact that a lot of what Facebook [was enabling] was spouses spying on each other's accounts. Because it was fairly easy to do when things were improperly encrypted inside the web browser's local storage. And that was perceived by the people I brought it up with as a non-problem, because from a security perspective, from a technical purity aspect, if you have access to the device, everything's gone anyway. There's nothing you can do about this. And if you think about it from that perspective, that's true. But in practice, a spouse trying to spy on their spouse is something that's actually very common. Back then, I think I had done just a Google search to see if this was a common thing. And the number of pages that were actually linked to answering the question of how can I spy on my spouse on Facebook -- there were like 50 million of them. So it was a thing people were doing. Actively doing. And it was just not thinkable to address that from a security perspective, because that perspective was driven by the technical purity aspect. There was no thinking that the attacker was not a professional security person, was not a software developer -- they were just doing a number of very basic steps that they would find online. And because we were looking at this from a purity angle, there was no room to address that problem. It's been addressed since, because people have sort of grown out of that perspective. But at the time, it wasn't something that we could consider.

Recently, we've seen a huge amount of critiques around the security and the safety of using Zoom. As people have moved really quickly to remote work and lockdown, they've relied on these tools a lot more than before. And a number of security concerns have popped up. An answer that I've seen to these security concerns has been, "Well, just use open source tools." It turns out that I have kids, and they're just at the age where they start to dig into things a bit more than I probably would want them to. And so, I double-checked what the school wanted them to use, which was an open source solution. I was like, "An open source solution is great. That sounds great." And that open source solution had a ton of safety issues. None of them were security issues. It was end-to-end encrypted. Wonderful. But you could have people jump into your call if you hadn't picked a long enough character string to make your room secure. It was full of issues like this that were really problematic for end users. I actually filed a bug against the software close to two weeks ago now, and nothing has happened. And whenever I've actually talked about this with people that were proponents of open source over Zoom, the reaction has always been, "Well, use open source software because it's better. Use open source software because it's not proprietary. It's actually end-to-end encrypted and it has all of these exciting feature sets." But there was no deeper thinking around the actual safety aspect for the end user. It's as if the fact that it was open source was, by itself, sort of a get-out-of-jail card.
And in itself, that's enough. And whatever is not open source is rejected. And so it creates this really weird conversation where, of course, I would rather have an open source and safe tool. But because we have this sort of weird perception of open source as ethically sufficient by itself, we no longer can think about broader ramifications of the software that we use. And we end up with safe but potentially surveilling proprietary tools on one side, and tools for techies on the other. And that's a silly place to be in. And it doesn't need to be that way.

CARINA: I think it's not always silly, but it's a danger. When we talk about security versus safety, I think we also have to talk about sort of flavors of safety. You brought up Zoom. And what we really discovered pretty fast, as all of the world suddenly was using it at once and exploring what it is -- there was talk about children and classrooms using it. And even if the teacher was in some way locking down access, I think the sort of psychological safety is an aspect there as well, as children are not used to having their mistakes along the way, the process of their learning, seen and recorded outside of their classroom. It's not just their teacher and their fellow students. What is the safety in that? Even if they're safe from active harassment, there's this really big change in what they can trust in, including: can I trust the teacher?

I saw an interesting example that's not with children, but people finding out that classroom software for, say, adults often allows the instructors, the TAs, to see every keystroke happening in real time, or to do this so-called attention tracking, where they can see your eyes are not looking directly at the screen for several seconds. And there's an imposed meaning on these things. There is the assumption that if you're not looking at the screen, it means you don't care, you're not hearing, you're mentally not engaged. Personally, when I'm thinking real hard, oftentimes I'm looking up. I am mentally engaged. But all of these things are imposing meaning, and that is unsafe, I think. And when you talk about technical purity versus security versus safety, I think we even have to go one beyond that: the difference between physical safety and mental safety. And what does that mean? And how that involves self-censorship, which circles back to the Four Freedoms. If you're not free to express yourself in a context where you're used to having that freedom, or should have it, then shouldn't this be like a fifth freedom? Freedom from self-censorship. You talked about the freedoms really being framed from a technical standpoint and leaving out the users, and I think this is an example of how something that is decades old, created by people who are relatively free from harassment and none of whom are children, didn't anticipate where we are today.

TOBIE: There's a lot to unpack here. My answer to this is part of a deeper problem that arises very often when you talk about this, and to which I have utterly and completely failed to find a conclusion so far, which is that there is a temporality to things which makes everything very complicated. An example is Covid-19. There have been a number of health players that have decided to open source, during the epidemic, for example, designs for respirators. Depending on what your perspective is, this is really good or really bad. It's very complicated to actually decide which one it is. And I don't have the framework to be able to do that.
But let me explain why it's good. Obviously, this lets a number of other players step in right away and build ventilators using a tried and true system, and ship dearly needed ventilators to people who literally cannot breathe. So, this is really good. Now, another way of looking at that is, this could also be a strategy from a company to prevent a truly open source solution -- one that is not scoped in time -- from existing and being adopted by the overall industry, and from making sure that ventilators as a whole are things that belong to the commons and not to a company in particular. And so, you have this tension between: are we trying to kind of put Scotch tape around the current problem, or are we taking sort of a longer view? And to some degree, both perspectives have grounds. There is sense in both of these perspectives.

The safety issues around open source that you were just bringing up fall in that same category for me. Right now, we have all of these kids that need to be able to continue going to school. And so, my first reaction here is to make sure that they're in an environment safe from immediate harassment when they do that. But at the same time, the impact of sort of saying, "Yeah, you're safer using Zoom right now" than using this open source solution that's not really ready to be deployed at that scale is also going to make Zoom sort of the norm and that open source solution less successful. And as a result, you're creating more long-term problems around privacy and surveillance and all of that. And frankly, the only reaction I have to this is that, that is somewhat exhausting.

CARINA: Yeah, it's definitely wider scope than simply: does someone have access to this information, yes or no? There are a lot more questions implied when we think about who are the users, who are the potential users. Zoom talked about the fact, "Well, we only anticipated this being used for corporate purposes. Those are the only problems we were looking to solve. And we expected that there'd be IT departments who would deal with all of that, not end users." They did exactly that. They were thinking about one kind of usage, and then suddenly it was being used for a lot of other things.

Going back for a moment to children: think about how it's now being used for things like playgroups or the equivalent of going out to recess, places where normally the parent and the teacher weren't listening to those conversations at all. And suddenly, not only are your kids having that exposed, but their conversations are unusually intimate. People talking about death. Certainly in New York, a whole lot of kids are being impacted very directly, their family or a family of their friends, and having to have those conversations. Around 9/11, there was that same conversation -- without the technology so much -- of kids needing to stop learning for a while and just deal with how they're feeling about all this stuff. So we had a use case that we already knew about from 9/11, and we've had enough time and distance to anticipate those things. And yet we still haven't. There's still a lot of assumptions about: we have one kind of user and that's okay, or we don't need to think beyond this. We don't need to ask questions.
And when you talk about having a hard time at Facebook with getting people to address questions, even when you very specifically raised questions -- not just, hey, we should think about other things, but can we talk about this very specific thing of spouses being able to observe each other and spy on and stalk each other? -- I see other examples of that, like software that's being designed to track children on their phones and being installed to track spouses on their phones. Almost every software these days has open source somewhere underlying it, even proprietary software. Somewhere down deep in there, so much of the industry has been fueled by open source. So there's this [inaudible] that we're not necessarily aware of, and likely not aware of in developing open source, that it's going to end up far, far out there. Decisions being made about it by people who don't share the philosophies of open source at all. They're not even asking these questions because it would not occur to them. And they don't see it as necessary and certainly don't see it as an underlying premise. How do we reach those people at all, let alone get buy-in on the underlying principles of the Four Freedoms, or even just one of the four?

TOBIE: The first thing that I would like to address that you brought up is that Zoom was designed with corporate usage in mind. As a business owner, as someone who has very rarely in my life actually worked for employers, and who has people that work for me, I have a very hard time accepting the concept that surveillance inside of a company is okay, but not okay outside. I find that very shocking. And the part that you brought up about children being increasingly surveilled by parents, by teachers -- there are apps, and there are now universities implementing, with sort of good intent in mind, tracking of where students spend their time, because spending no time in the library is actually a leading indicator of failing courses. And so the sort of idea behind this is that you could then step in early and help them make sure that they don't fail their courses. So, lots of seemingly good excuses to surveil people and get them in the habit of being surveilled, from when they first leave the house to when they're in college or in high school studying. I mean, this is so scary and so dystopian in so many ways. And we're just watching this unfold without even blinking, like this is normal. "Yeah, I feel safer." I mean, now we're talking about figuring out how to track people that have had Covid and that haven't, because of safety. And so we have all of these really good, reasonable things that we do that create a society where surveillance is normal. Everyone is surveilling all the time, and that's acceptable. And yeah, this is not going to end well. This is scary. This is really concerning and really scary.

The idea that open source practitioners necessarily think about issues of how their software is used, or that engineers in companies actually think about this, is, unfortunately, misguided. This is most of the time not the case. Most of the time, people just do stuff because it's fun. I mean, I've seen that so many times: "Oh, I'm trying to solve this problem. This is an interesting problem." What exact problem is it? How is that going to be used? Do you realize what you're actually building there? It's often the case that we either don't think about it, don't want to think about it, or have heard really good stories.
I mean, corporations are really good at telling really good stories to people about why they're building stuff and why what they're building is actually useful. And so clearly, the education problem is not only with the decision makers using the open source software that we've built. It starts with the engineers themselves, with the people building the software.

CORALINE: In my mind, Tobie, in large part, I think there are two factors here. The majority of software is written by very privileged people who don't have the life experience they need to understand things like safety for a child, or safety for a spouse in a bad situation, or safety for people whose experiences they have no idea about. But I think that sometimes it can come down to a lack of empathy for the end user, like not thinking about the end user at all. But going back to that principle you talked about of technical purity: I can do this because I'm smart enough to do it, and I don't spend any time thinking about the ramifications of what I'm doing because I'm not able to put myself in other people's shoes.

TOBIE: Yeah, absolutely. It's not surprising that a lot of people building software today are very young. I don't mean to say that in an ageist way at all. I've met a lot of very young people that [inaudible]. But the experience that you talked about is certainly a factor. I have this image that is probably weird and potentially, people would be upset about this. [Chuckles] But there is a reason that soldiers are young men. And when you look at how the software industry is today, it's VCs finding really young men to go build software. And there is some form of parallel here, in my opinion. It doesn't say anything about the character of these young men. I mean, I was a young man at a point in my life, too. And there were lots of things that I ignored. Lots of things that I didn't really care about. Lots of things that I glossed over. Lots of voices of people whose thoughts and warnings I ignored, because that's what young, privileged people do. So yeah, I think that that is certainly a part of the equation.

The other aspect that I think a lot about is comfort. It is not comfortable to be questioning whether what you're spending your day working on has a negative or a positive impact on the world. Questioning the impact of what you do is not fun. It's not enjoyable. It's way easier to sort of drink the Kool-Aid -- I know that's a horrible metaphor, too -- but it's way easier to go with what you're being told, and adopt the reason you're being told this is useful as your own reason, rather than to question it. Because questioning takes energy. It puts you in a sad place. It's difficult. It's risky. Our natural tendency is not to do that.

And finally, there's another aspect, which is that software is a lever. It makes pretty much everything that you do an order of magnitude or two more powerful, or faster, or any of these characteristics. And the corporations that today own the processes and the systems and the know-how and the knowledge to leverage software have become so powerful that pretty much whatever we do, it actually helps them be more powerful. Consider GDPR, the European law whose idea was, "Let's make sure that the Silicon Valley giants are more respectful of privacy." It was clearly sort of fought by those giants for a little while, and then they just embraced it.
And it has made them a lot more powerful in the process, because now you have to comply with all of these things. It's very complicated to be GDPR compliant if you're handling user data. So it makes it very, very difficult for other, potentially more privacy-aware players to come and compete against these giants. And so, to add to the fact that it's not comfortable to be questioning what you're doing: it turns out that if you're contributing to open source, it's helpful to large corporations like Palantir. And if you're not doing open source, then they're just going to build that as closed-source software anyway. It's like there's no way to win, to some degree. So it's understandable that questioning this is not something you want to do. This is very pessimistic. I'm sorry.

JACOB: I think this is getting at an idea that it's all well and good to have engineers who are empathetic and thoughtful. And it's necessary; I won't dismiss that. But if their questions fall on deaf ears, the deaf ears of management, or if they're brought into the conversation when it's too late to change the product based on those concerns, it seems like it wouldn't matter how ethically minded your technical workers are.

CARINA: I see this also as an internationalization issue. Europeans' perspective on what privacy is, is very different than, say, Americans'. And even within America, we've got states that have really different privacy policies. And then you think of China. And going back once again to Zoom -- sorry, Zoom, but we've all been thinking about you this week -- most of the engineering department for Zoom is located in China. So the kind of questions that engineers are going to raise are fundamentally different than the ones you would expect them to raise, say, if the engineers were based in Europe, where they have to, as a matter of law, think about privacy. As a matter of law, Chinese engineers kind of have to avoid talking about privacy. It's a very different cultural issue. It's a very different legal standpoint. It's a very different set of assumptions. You're not thinking that way because it's so taken for granted that, as an individual and as a society, surveillance is -- as you say, you start accepting things on that slippery slope as normal. Well, for them, that slippery slope already happened and continues to happen [inaudible] at the beginning of something that for them is much farther along and has much more momentum. So, I see here internationalization issues that are not solvable by either engineers or users in some other country. We don't have the power, even as engineers, to look at that code, to rewrite that code with end users in mind any more than the original engineers did. So we're talking about: somewhere under there is open source, but it doesn't matter, from even an engineering standpoint, let alone a user standpoint. We're at the mercy of whatever cultural basis the software was written under. So, if we're starting to think of these as internationalization issues, what can we do? To what extent can we change our impact as engineers, in order to change impact for end users? It's possible the answer here is Jitsi.

TOBIE: Is what? Sorry.

CARINA: Jitsi, the open source conferencing project.

TOBIE: Oh. Yeah, that's the one I was referring to as not addressing safety concerns. There are different ways to look at this. We can look at this from an engineering perspective, as engineers, and our responsibility as engineers and as open source practitioners.
We can look at this as a consumer choice, what we decide to use as consumers. And I think we can also consider what are the tools that we have, what are the existing frameworks that we have, that sort of span these internationalization questions, as you call them. And I actually think that there is a common set of values across different cultures, even if we have slightly different interpretations of them and sit on different parts of a spectrum. I think essentially, broadly speaking, what is good and what is evil is fairly well understood by human beings across different cultures. And we do have an existing set of tools for that: it's human rights. A lot of what we're talking about now, a lot of the surveillance questions, the privacy questions, are taken into account by human rights. That framework for thinking about this exists. So I think that's an important angle to consider, what we can do sort of beyond engineering.

Then the other aspect is, as individuals in this space, there are times when we can push for things and times when it's harder for us to do so, because it puts our own ability to get work, our own safety, at risk. And that's also something that we need to consider as individuals, and also be mindful and respectful of people that are in that situation. And then I think that as consumers -- privileged and knowledgeable consumers as we are -- we can make decisions and we can help others make decisions. But ultimately, I really do think that it's the combination of these different factors, and really, education at a broader level than just engineers, that needs to impact policy and law and politics. It's broader than just what we can do at the level of strictly tech, in my opinion.

CORALINE: Tobie, we've been talking about some very serious topics and some very nuanced topics. And I really appreciate your perspective on these things. But the overall tone has been one of helplessness: how can we do the right thing in this complicated, interconnected web of corporations and governments, end users with privilege and users without privilege, technologists with privilege? Is there something in all of this that gives you hope?

TOBIE: What gives me a lot of hope is that these conversations are happening and people are actually starting to talk about this, think about this, organize around this, and do things around this. If you think about technology as sort of an accelerator, it can be an accelerator of good as much as it can be an accelerator of surveillance and sort of bad things. And I just think that we have to be more wary of looking at it as something neutral, and be more conscious that it has all these different facets, and just be careful to always push it and always drive it towards good and ethical outcomes, and be mindful of that.

JACOB: Well, we've come to the end of our show. I'd like to thank Tobie for coming on. Thank you very much.

TOBIE: Thank you. I was very honored to be here and I really enjoyed this conversation.

JACOB: Likewise. For all those listening out there, we do have a Patreon, which helps sustain the labor that goes into making this podcast great. If you are able, you can visit us at Patreon.com/GreaterThanCode and give any amount that you feel you're able to give. And in exchange, you'll receive an invite to our patrons-only Slack channel, where we have low-volume, high-quality conversations about the same things that we talk about on this podcast. And it's quite a great community.

CARINA: In addition, we really appreciate companies who step up and sponsor episodes. Each time that we have a sponsor, it means that we can ensure that the content is well edited and, most importantly, that there is a transcript, so that you can come back to this and others can search and read this stuff. As much as possible, we want to give you every opportunity to revisit these conversations, and sponsorship is how we do that. So please, if you're interested, ask your company to sponsor Greater Than Code.