JACOB: Hello and welcome to Greater Than Code. My name is Jacob Stoebel and I'm here with Coraline Ada Ehmke. CORALINE: Hey everybody. I am very happy to introduce our guest today, Don Goodman-Wilson. Don is a philosopher-engineer, currently employed as a developer advocate at GitHub. In previous lives, Don has worked in such diverse fields as webapp security, chatbots, streaming media, embedded hardware, and the model railroad industry. He also holds a Ph.D. in philosophy from Washington University in St. Louis. Don enjoys statically typed languages, sailing, and Japanese model trains, and lives with his wife and daughter in Amsterdam. Welcome, Don. We're so happy to have you on the show today. DON: I'm very happy to be here. Thanks for having me. CORALINE: We always open the show trying to get to know our guests a little bit better by asking you the question, what is your superpower and how did you develop it? DON: I think my superpower is that I'm a generalist. I don't have any particular superpower, but I have lots of little superpowers -- that's going over the top a little bit. But I have lots of skills that I can bring together in one place. And I think it gives me the opportunity to bridge cultures or bridge disparate themes that otherwise seem unbridgeable. So at the moment, I work as a developer advocate, and that role involves bridging developers with product people or marketing people and ensuring that they know how to talk to each other. So I'm the interface there. As you can guess from the large list of skills in my biography, I have familiarity with a lot of these areas and I like to find ways to bring them together. Today, we're bringing together engineering and philosophy, which I think is going to be a lot of fun. CORALINE: And a great topic for Greater Than Code. We have a resident philosopher on the panel who is unfortunately not with us today, but I'm very interested to hear your thoughts and hear how you connect that up.
In particular, I wanted to have you on the show today because of an article that you wrote and published, I think on your personal blog as well as on Medium, about ethics and open source. And this is a very timely topic. Recently, there's been a big push with #NoTechForICE, for developers to reclaim control over how their software is used. We saw Seth Vargo pull code from RubyGems and GitHub when he found out that his code was being used by Chef to support a contract with ICE, and that turned into a big media event. Also, me personally, I released the Hippocratic License, which is an ethical open source license, and published the ethical source definition. So there's been a lot of talk, I'd say over the past month, about open source and ethics. And I thought your article did a really great job of covering some of the key bases for using an ethical lens to examine open source. So for people who haven't read it yet, the link will be in the show notes, can you kind of summarize what you were talking about in your article? DON: Sure. I'd be happy to, Coraline. In my article, I tackled two somewhat disparate topics that are quite current, I think, in the conversation around open source. And I try to bring them together under a single heading. And that heading is essentially that Open Source, and that's capital O capital S trademark sign, is failing us as a collaborative development method. It leaves open, indeed encourages, certain ways of being that many of us are beginning to feel very uncomfortable with. So for me, my unease began about a year ago when I had a conversation with a fellow engineer about one of the many interesting alternative licenses, I'll just call them that, floating around now. Alternative to OSI-approved licenses is what I mean by that.
Licenses like the Server Side Public License that MongoDB tried floating past the OSI, or some of the licenses coming out of License Zero that offer, for example, paywalls for developers who don't want to see their projects commercialized. And there are increasing amounts of talk around the way that, for example, machine learning code gets used, and the possibility for abuse of human rights with that kind of code. The report that came back to me was, they're not OSI approved, and that was the totality of the argument. And that doesn't sit well with me as a philosopher. That sounds very much like an appeal to authority. And so what I hear is a set of assumptions that the OSI knows what's best and we should defer to them. I'm not willing to accept the OSI's authority unconditionally. CORALINE: OSI is the Open Source Initiative. Twenty years ago, I think it's been about 20 years, they published the open source definition, which is the canonical definition of open source. To your point, Open Source is an endeavor spearheaded by the OSI, and they preserve the spirit of open source, from their perspective, through the open source definition. And that's very license-centric. One of the things that occurs to me is that although licenses are an important factor in open source, licensing isn't the totality of the open source experience for millions of developers around the world. DON: That's exactly right. There's much more to it than licenses, and there's much more to it than their goal of ensuring unmitigated, absolute openness. And they seek to ensure that through legal means, through the enforcement of licenses that meet these goals of openness. But I think that the current atmosphere in many sectors of open source development is that openness is not the primary goal. Yes, of course, we would like to get as many people as possible in our community helping out. We'd like to help as many people as we can by writing code that is useful for others to use.
But at the same time, we're finding ourselves wanting to exercise a little bit of control over that. And that means taking a step back from full, unmitigated openness and starting to think about the ways that we can build community without committing to full openness, the ways that we can enter into contracts with users of our code without just giving it to them a priori, but to have a conversation instead about what a world looks like where we can sleep at night, knowing that our code is not being used for violating human rights, for example, which is a thing that's happening. People's code is being used for that. This isn't a hypothetical or an academic exercise. And very few of us want to see that happen. Very few of us want to see our code weaponized. And that means stepping back from the OSI's tenets and rethinking what it is that we're committing to. JACOB: One thing that came to mind as I was starting to read the article, and I wasn't able to finish it just yet, is that there've been a lot of discussions, especially recently, about the concept that free speech, or unregulated speech, or I guess you could extend this to complete openness in terms of being able to access code, is, just like you said, the only thing that seems to be important, and that if speech or code is made completely open, then hand wave, hand wave, hand wave, any other problem can be remedied by that. And it seems like you were taking issue with that. DON: Yes, I was definitely taking very strong issue with that. One way to put it is that we are, as a community, taking a step back from a consequentialist perspective of morality. Consequentialism says you should choose the action that has the best outcome for the most people. In some ways, [inaudible] the most happiness among the most people, but it doesn't have to be happiness. Just value in some hand-wavy, broad sense of the word.
And the open source principles of openness seem, it's not explicit, I'm reading a lot into this, but they seem to reflect a consequentialist ethic. If we make the code maximally open, if we maximize the opportunities for people to use our code, yes, some bad things are going to happen, but there's going to be a lot of really good stuff that comes out of it that's going to offset it. At this particular moment in time, watching the way that people use our code for evil, it becomes very visceral. We can feel it. We see the images of the ICE detainees, for example, and that hits us in a particular place. And I think that demonstrates to those of us who feel that strong pull that we are not consequentialists. In fact, we have a very different mode of ethical decision making. One that resonates with me is a mode of thinking called contractualism. And we can talk more about that as well. But the basis of it is a rejection of many things, and in particular a rejection of consequentialism. It says it's not just the outcomes; there's something that's fundamentally more valuable than having good outcomes. And in the case of the theory that I talked about in my paper, the thing that is deemed most valuable is other human beings as rational agents. The recognition of them as rational agents, and the respect that we have for them as fellow rational agents in the world with us, is the most valuable thing. And sometimes that means maximizing happiness. But sometimes that means recognizing that there are certain actions that are just full stop wrong. Separating children from their parents: that's full stop wrong. And it really doesn't matter how the consequentialist calculus comes out; there's no offsetting that kind of wrong. And so I tried to apply this way of thinking in my essay to this notion of openness, to demonstrate that there are other ways of thinking about the good that we create in the world through the code that we make.
And insofar as we are committed as code authors to helping people, to building communities that are useful and supportive and open and welcoming, then maybe it's okay if we take a step back from the OSI. We don't have a moral commitment to the same things that they do. That's actually pretty clear from the things that people are saying and the actions that they're taking. And so the way to resolve this conflict between, "Well, I want my code to be open and out there, but I don't want my code to be used for evil," is: let's just step back from the OSI's definition. Let's step back from the supremacy of openness and think about what values we really do have. And then it becomes an open question of how do we encode these into our communities? How do we encode these into the licenses that we use? Because we do want to enforce it in some sense. CORALINE: I should point out that the FAQ on the open source definition published by the OSI specifically poses the question, can my software be used for evil? The somewhat glib response that the FAQ gives is, "Well, if you're going to be open, you're open to everyone, and that means that evil people will use your software." So there's a tacit acknowledgement that as open source authors, we have no ethical rights, and I would say it also implies that we have no ethical responsibilities. DON: Yes, precisely. It's a total abdication of our responsibilities as coders. There seems to be this model of: you build it and then you throw it over a wall. And then whoever's on the other side of the wall gets it, takes it, does what they want with it. That's fine. We've washed our hands of it once we've thrown it over the wall. We no longer have any further responsibility for what happens to it there. That's obviously not true. We live in a very, very interconnected world. We have an effect on people and on outcomes on the other side. There is no wall. We're not throwing it over a wall.
We're creating a causal chain in which we're part of the cause, and the effect is something pretty awful. And I think it's incumbent on us to recognize that, to recognize that code has an impact on other people, an impact that we are responsible for in some sense. That all code is inherently political, in the sense that it affects other people, and those other people have a claim on how our code is affecting them. JACOB: I have not worked in software for very long, only about five years now. But what I know about the history and origins of open and free software is that the central problem that community of mostly white men was trying to fight against was the monolith of Microsoft and other big companies. Basically: we are people who have every privilege in the world, but one thing we don't have is the ability to get the stuff and hack on it for free however we like. And it's like, if we can just solve this one problem, then we'll all be happy. And I'm just sort of thinking back, that's the origin. It makes sense that any of these other concerns would seem like you were speaking a foreign language. Like, "Why would I care about how this relates to ICE? I don't have anything to do with that. That doesn't affect me." I'm sort of seeing why that seems so irrelevant. CORALINE: To build on that a little bit. When the Free Software Foundation was founded and they published their principles of free software, and the same with the OSI with the open source definition, the big bad, as you pointed out, Jacob, was companies like Microsoft. The big bad was corporate America. And I think what they're failing to recognize is that today, the big bad is bad actors within our own governments. They're not solving the right problem anymore. JACOB: Sure.
And when governments, and I guess corporations, have figured out a way to use and benefit from open source, what's the difference, other than that the original conceivers of open source have gotten what they needed from it? DON: I had always been under the impression, and I have not been able to find a definitive source for this, so maybe at some point after the show we can dig into this a bit, that ESR's intent in guiding the creation of the OSI to the extent that he did, and in creating the open source definition, was to make concepts taken from free software more palatable to large corporations by removing the explicitly political element from them. I have a quote somewhere in my article about this, but essentially it was about creating a money-saving activity for corporations by allowing them to tap into these open source communities. I feel like the intent was probably good, but intent doesn't count for much here. I get the impression that a lot of the people involved in these communities were libertarians. That's probably not just an impression; I know some of them are very explicit about it. And the libertarian utopia really only works when the entire community consists of a homogeneous group of people who share the same beliefs, when you're all libertarians. As it happens, that's not the reality. Is it any wonder that this libertarian ideal broke down? The ideal in this case being, "Well, if we have code, we know what it's going to be used for. We're a close community, we know how this community works, and we know the rules." In this case, [inaudible] or something like that. That's sort of [inaudible]. It's not really how the game is played. And this sort of fantasy breaks down very quickly, very rapidly, and you begin to see a lot of unintended consequences.
In many cases, the bad actions being carried out by ICE are being enabled, in fact, by the large corporations who are in a position to benefit from, and understand very, very explicitly, the benefits of open source software development. And so in fact, this very open nature does incentivize the very things that we're railing against here. Of course Palantir's going to build their systems on open source software, because it's cheap. In fact, it's literally free. So, why not? We can say they could have built it themselves, but that was just IBM's argument in the 1940s: if we didn't sell it to the Nazis, somebody else was going to. Fine, let them build it themselves. I don't want to be a part of that, quite simply. The fact of the matter is we do live in very [inaudible] communities. We participate in these, we celebrate diversity in our communities. When we do that, we find that we need to set out new rules. We begin to discover, as you had hinted at earlier, the paradox of tolerance. Full, unabashed tolerance works well in small homogeneous communities. But it doesn't work well when you have larger, more diverse communities, especially communities with interests that are often at odds with each other. In order to maintain an appropriate level of dialogue within that community, you have to set bounds on the sorts of conversations that you can have. And I feel there's a very similar analogy here to open source. There's a paradox of openness: if you're too open, you create too many possibilities for things that we don't want to see happen. And so in order to maintain open communities that are welcoming and inclusive and provide a sense of belonging, we actually have to close the doors just a little bit. We have to close the doors to people that are toxic. We have to close the doors to people who want to use our software in ways that we don't agree with.
CORALINE: So Don, when I released the Hippocratic License, about a month ago now, I saw the same sort of thing [inaudible] early on, with people saying, "It's useless because it doesn't match the OSI's definition of open source. It's not an open source license." In fact, the OSI even asked me to remove the term open source from the website. But on the other hand, there are other people who are saying that licensing is not the right mechanism for bringing ethics into open source. Do you have any thoughts on that? DON: I think it depends on the problem that you want to solve. So, for example, one of the topics I touched upon in my essay is that open source maintainers and open source contributors essentially comprise a free source of labor for large corporations, who are able to derive massive value off of the backs of that labor. To the extent that we feel that there is something dodgy going on here, something that requires correction, something that requires maybe closing down the openness a little bit, there probably are licensing means to that end. Or if there aren't licensing means to that end, then I have friends who believe trademark law may be the right way to go in that direction. But that doesn't solve all the problems. It's very difficult, because people are very fond of telling us to legislate morality. That's putting the cart before the horse. The first thing to do is to create a community, to rally around a set of norms, rather than thinking about how to legislate or use a law to our advantage. And so in that sense, yes, I strongly agree that licenses are barking up the wrong tree. That said, because we are such a license-oriented community, a license like the Hippocratic License, for example, is a useful tool, but not as a legal instrument. It's a useful tool as a rallying point, or at least as a particular expression of norms or values that you can use to attract people to your community and to regulate the behavior of your community.
But in that sense, you're not using it as a legal document; you're using the legal document as a statement of something that's a little more fundamental. The question is what we do to move forward from there. Assuming we can find a community of people, and we've normalized the kinds of behaviors and outcomes that we want to see, and we have a structure within that community, it does remain to be seen how we expand beyond that small community into a broader world and create something that approximates openness without actually being there. With speech acts, it's relatively straightforward, because speech acts are things that other people do too, and you can sort of wall those off from your community. But with code, it's something that you're giving to others that they then go and use to do something with. And it's not obvious to me what the kinds of mechanisms for enforcing those norms outside of the community look like. But as a first step, yeah, I think these sorts of licenses are a very interesting tool for creating discussion and propagating the norms that we want to see in these communities. Another case in point is the JSON.org license. That's an MIT license that adds a clause: the Software shall be used for Good, not Evil. That's got no legal force, let's be completely honest. But that's not the point of it. I don't think the point of it was to enforce any kind of legal notion of good or evil. As a legal clause, it's just kind of nonsense. But it was put there for a reason, and it was put there very deliberately. I'm sure it wasn't inserted there as a joke, although it might've been thumbing their nose at the OSI, in fact. But I get the sense, rather, and I don't know, I haven't asked these people so I can't say, but I get the sense that it was put there to make a statement to the community, to say: this is where we stand on certain issues, and we would very much like you to join us in thinking this way. Of course, nobody really noticed it until recently, I think.
So, I'm not sure how effective a tool that was, but nevertheless, my understanding, my interpretation of the situation, is that that was probably the intent. CORALINE: So Don, coming back to some of the reactions by, frankly, oligarchs in the open source community. A lot of people talk about the enforceability of ethical source licenses, which strikes me as really interesting given that most open source licenses that are OSI approved have not been tested in the courts. Do you have any thoughts around enforceability? DON: Pot, kettle, black? I think enforceability is 1990s thinking being applied to the 2010s, almost 2020s now. When the OSI definition was being formulated, when open source licenses were first being examined, the notion of enforceability was very important to them. And many of these people looked forward to legal tests, which in many cases just didn't come. But the reason for that, again, was this sense, and again, I'm interpreting history here when I say this, that making these things palatable to large corporations required a certain amount of legal safeguards for them, so that they wouldn't then turn around and be sued for using the source code or something like that. So enforceability was a necessary condition for the corporate adoption of open source software. However, I think that thinking mutated over the last 20 or 30 years, and enforceability became something seen as valuable in itself. Why have a license if you can't enforce it? But we're not thinking about, in this case, trying to make our source code palatable to large corporations. We're thinking about how people are going to end up using the code. In fact, we're trying to make it less palatable. And so in some sense, enforceability is a red herring. On one hand, there's the statement I made earlier, which is simply that we're using licenses as a method of expression.
On the other hand, by inserting crazy clauses like "the Software shall be used for Good, not Evil," no, that's not enforceable. But I guarantee you corporate lawyers are going to look at that and go, "Can we redline that?" And the answer is no. No, you can't redline that. "Okay, we'll find another solution." That's a win, in many cases. One of the things that I'm personally very interested in pursuing as a next step is finding the community of people who feel as I do and feel as you do, and getting us all in a room to talk to each other. Because I know there are a number of people out there that have similar or adjacent feelings about the topic of morality in open source. I know that a great many open source maintainers are deeply concerned that they're not being compensated for the value that they create, which may or may not be a moral concern. I've phrased it as a moral concern in my essay; that was probably more bombast than anything. I think it's important. I think it's linked. But this is something that hits maintainers at home. So, Henry Zhu, who's trying desperately to make a living writing open source software, gets paid some, but has to find ways to supplement that. So, this hits him in a very immediate way. Of course, it's a concern that he and other maintainers are going to have. For many of us, we look at our code and we think, this would never get used in a facial recognition system deployed by the military. And this is probably true. It's probably true. And on those occasions that it does get used for something like this, it's not always obvious when it actually happens. I myself have at one point written code for the military, when I was an undergraduate. I'm actually pretty sure that they rejected it outright and it never got used anywhere. But I don't know that. Even though at the time I was explicitly writing code for the military. And even now, we don't know if the military is using my own code. The point being, it's not something that's immediate.
It's not something where I might look at these bills coming in, piling up, and feel it. Rather, I see these awful things in the news and I feel very bad about them, but it's very difficult for me to see the causal connection between them and my code. And I think it would be very interesting to get people who have experiences and opinions together in the same room on this topic. But I'm a terrible community organizer, to be completely honest. I'm a terrible marketer. So I don't know how to make this sort of conversation happen, but I'm hoping in part that by being here, for example, we can spark this kind of conversation in a broader way, and a more focused and deliberate way. CORALINE: That's something I'm very interested in as well. And I'd be happy to lend whatever leverage I have to make something like that happen. I want to come back to something you mentioned in passing, Don, and that is compensation. The fifth principle of the ethical source definition reads: its creators have the right to solicit reasonable and voluntary compensation from the communities or institutions that benefit from the software. And you said that compensation was not an ethical consideration, but I want to challenge you a little bit on that. Isn't seeing a benefit from your labor, while not a human right, something that we can and should ask for? Isn't that a recognition of the value of our labor? And doesn't that have an ethical dimension? DON: This is something I've thought about and gone back and forth about, for those who are digging up my Twitter handle and looking at past conversations. I have, in public before, strongly stated the case that there is a very strong moral dimension to this. And then at other times, I feel like walking that back, saying that there's in fact not such a strong moral consideration to it.
Nevertheless, I do think that there is. To the extent that we live in a market-oriented capitalist society, there's a social contract that we ought to exchange like for like. If we benefit from somebody's labor, that labor deserves to be compensated in like terms, so that both parties walk away with something more than they had. I'm not at the moment convinced that there's a moral component to this, but nevertheless, there are obligations that we can have that don't have moral components to them. CORALINE: I know you're going back and forth on it, and I don't have a philosophy degree, but it just feels fair to me to compensate people for their labor if you are benefiting from it in a monetary fashion. It just feels like the right thing to do. DON: Yeah. That intuition is really worth pulling apart. I feel that intuition as well. And I think a lot of other people feel that intuition as well. And it's worth digging into it to understand the extent of this obligation, and whether it's a moral obligation or not. Contractual obligations are an example of an obligation that we have to another person that doesn't necessarily have a moral component to it. There's a strong legal component, because contracts are drawn up under the laws of a particular jurisdiction, and they create an obligation under the law between the two parties who signed that contract. But it's not clear that these contracts necessarily have a moral strength to them. Say I signed a contract to sell something to you, and of course, you're going to give me compensation in exchange for that item of value. If I decide I don't want to sell that item to you anymore, I've decided to break that obligation. It's not clear that we would say that I was wrong or bad for breaking that obligation. I just changed my mind. I don't want to sell it anymore. I've just decided I want to keep it. Okay, these things happen from time to time, and that's fine.
You don't judge me as being morally bad for suddenly changing my mind about selling this thing to you. The question then is: does compensation for labor fall under that sort of contractual obligation? Maybe not necessarily under a specific legal framework, [inaudible] speaking. There are kinds of contracts that you can enter into in a marketplace that have the same weight as a legal obligation. Or is there a moral component to those sorts of obligations? Philosophers, I'm sure, go back and forth on this. I've never really been party to conversations on this particular topic, so I can't say for sure. But I think on one hand, it's in the air, and I come down on different sides of this every time I think about it. On the other hand, for this particular topic, although I think it's related to the more explicitly moral obligations that we have with our code, I think the fact that it is an obligation means it's important enough for us to really be reflecting on. Because we do think, for example, that it is unethical to steal, to take something without offering compensation for it. And of course, we think slavery is unethical. Not that I want to equate open source maintenance to slavery; I'm not going to take that step at all. But these sorts of obligations that producers and consumers and purchasers and sellers have with each other have a weight. Even if it's not moral weight, they have weight. And I think this weight is connected to the idea that as creators of the code, we should have a say in what other people do with the code, or what becomes of the code, or how the code gets used. Whether that's a say in how it gets used, for the oppression of other people, or in, "Hey, you're extracting huge amounts of wealth with this code, I think it's only fair that you owe me some money, at least a reasonable amount of money, for my time in producing that code for your benefit." They're still obligations. They're still very strong.
They still stem from the same kind of source, which is control over your product. Does that help? CORALINE: Yeah. I think you're pointing out that it's complicated. That's what I [inaudible] down to. And the fact that you yourself go back and forth on the issue kind of demonstrates that it's complicated. I feel that compensation can fall into the camp of ethics. I think there probably is, or there might be, some component of that. But I think at the very least, it's about fairness, and fairness is something that I do want to see held up as a value in open source communities. DON: I agree. I just wanted to add: one common misconception that people have about philosophy is that philosophy holds answers. That's not true. Philosophy only holds more questions. CORALINE: Fair. JACOB: One question that is coming up for me now is the question of how I can quantify the value that some open source software gives to me. Just to give you an example: when I was briefly a contractor, I built into my quote a certain percentage of money that I would then turn around and donate to open source. So, I'm building this using Ruby on Rails; I want to donate to Ruby Together. It's sort of like, "This is propping up my business. I want to pay it back." But I was always really stuck on it. I'm like, "How do I quantify that?" I don't know how to do that. DON: I expect that opinions are going to differ pretty significantly around this subject. I think there are a lot of different potential metrics for this sort of thing. And I think that what I'm about to propose is perhaps not fully workable in practice. But it seems to me that one way we can do it is to look at, to the extent that you can estimate these things, how much time did you just save, how many engineer-hours did you just save, by adopting this project? Maybe it's not a lot, because you're not really using Ruby on Rails to its full potential.
Maybe you're just using a couple of features of it to get a website up because that's what you know how to do, when in fact some static HTML would've served just as well. So maybe it's not a whole lot of time. It doesn't have to be the whole amount of time that it took to make Ruby on Rails. Or maybe you're making heavy use of a lot of its features, and it saved you a great deal of time and let you get to market that much faster. Well, you can quantify engineering hours. You know exactly how much those cost at your company. And I don't think I want to say that that's exactly how much is owed, because when you make a calculation like that, then it's like, "Wow! For the same price, we could have built it ourselves. So in the future, we'll just build it ourselves." Maybe that's not what you want. Maybe you want these people participating in your community. Maybe you want the income from having built the software. So maybe price it a little cheaper than that to be competitive. I don't know. But it seems to me that's one possibility. It becomes much more complicated when you look at modern software development and its heavy use of dependencies, especially in the JavaScript world, where they strongly encourage large numbers of very small dependencies. And then there's the further question of, "Okay, so using Express saved me X engineering hours." What does that mean? Express is more than just Express. Express is also all of its dependencies, [inaudible] many, many dependencies and their dependencies and so forth. And so there's a further question, essentially, of how you divide that money up. But that's a really tricky one to answer, because it may turn out that something like [inaudible] is super foundational, as we discovered a couple of years back. Pull [inaudible] from the ecosystem, and the ecosystem breaks, full stop. Does [inaudible] get the lion's share of the money? I don't know. There are lawyers who focus on these things. 
Like how you divvy up an inheritance among generations of children. [Inaudible], this is a very similar problem. I don't have any good answer, but it is something that we need to think about. CORALINE: So, we've talked about compensation quite a bit, and now there's a question of whether that is an ethical consideration. I want to talk about ways that -- we have a lot of listeners who are participants in the open source community, either maintainers of popular libraries or people who contribute to open source projects that they find useful. What kind of tools should we be developing to empower people who participate in open source, who participate in developing in the open? What can we do? I think there's this feeling of helplessness right now, like we have no mechanism for preventing our work from being used for violations of human rights or for other nefarious purposes. What hope do we have to offer people who are ethical human beings, who do care about these sorts of things? What can we do for them? What can we say to them? How do we empower them? DON: That's a huge question with many, many different facets. I'm going to tackle, I think, just a couple of them. And I think the most important of the possible facets here is that at the moment we have a number, a small number, but more than one, of powerful institutions that define the culture around collaborative development. And I'm thinking of the OSI in particular. And as you've seen and commented upon, and I've mentioned as well, these institutions do hold a lot of sway, where people can point to what the OSI said as a rebuttal to anything that's not OSI-approved. We need to help people understand that it's okay to question authority, that it's okay to say, why should we listen to the OSI? All of this began for me with that little unease in the middle of my chest that I could not put words to, which was like, something feels very uneasy about this. It doesn't quite feel right. 
And I want to tell the people who have that little feeling in their chest, that sense that this isn't feeling quite right: it's okay to express that sense of not feeling right, even if you don't have the exact words to explain that feeling in any detail. You don't have to justify it to other people. That feeling's there for a reason. And so normalizing in developer culture that it's okay to express this unease with authority is a huge first step for us, I think. And this means having conversations like this. This means saying in the open, where the public can see, that we doubt our institutions or question the authority that we've been handed. And it also means that when other people speak up and say, "I don't know what to say. I don't have words for this," when something doesn't feel right, we amplify that voice, help give them the words that they need to express themselves, and encourage them to use those words and express themselves in that way. Because the more people are speaking out about this sort of thing, the more normalized the speaking out becomes, and the more chance we have of building the kind of culture that we want around collaborative software development. So that's the first thing I think that we have in our toolbox. The other thing is that at the moment we live in a culture where some sort of contribution to open source is almost mandated. It's very difficult for me to explain this precisely. I don't have words yet to explain my feelings around this, but I'm hoping that the more I talk about it out loud, the clearer it gets. When you apply for a job in the tech world, they want to see your GitHub profile, or they want to see a history of contributions to open source. And the more prestigious the project, the better, and more weight will be given to candidates that have a proven track record of open source contribution. CORALINE: That's probably problematic. 
It is the reality, but it's incredibly problematic, because participating in open source requires a certain degree of privilege that of course not everybody has. Sorry, I had to say that. DON: No, I mean, that's exactly right. I feel that too. You know, as a parent, I don't have free time in the evenings to go contribute to open source. So my GitHub contribution graph is not as empty as many, but it's certainly much emptier than I would like it to be were I in the market looking for a job. It's problematic for a huge number of reasons. One of which is simply that by creating a culture where people feel obligated to contribute to open source, we create incentives to look past that little voice in your chest that says maybe something about this isn't quite right. It makes people feel obligated to participate in a culture or a community that maybe they wouldn't otherwise participate in to the extent that they do. And it certainly does exclude a lot of people who I think would be much, much better brought into this community, to increase the diversity of representation that we have in collaborative software development. But that's a separate question that I'd like to set aside for just a moment. What I'd rather see, though, is a culture where there's not this expectation of participation in these sorts of things, where people don't feel compelled to do it, but are participating in these communities for internal reasons, for internally driven reasons. When you're participating in communities for internally driven reasons, you're going to pay much more attention to things like, "Is my code being used by ICE? Have we created an environment where the people who are contributing to this code are being compensated for their time and their labor? Or am I just trying to build a community so I have more people to commit free labor to this project, because I have to get a job and we need to build up the prestige of this project so that I look better?" 
It's got our ethical principles rather backwards. JACOB: Yeah. It seems to me that if an employer is so concerned with whether or not I love writing code so much that I will find a way to do it in my free time for no money, then that's really troubling. I don't know if I want to work for anyone like that. And [inaudible], it's a privilege that many don't have. DON: So I think one of the things that my friend Joe Nash has brought to my attention, and that I've tried to articulate in other places, though I don't think I've done it well enough, is along the same lines: if corporations, if hiring managers, value contributions to high-prestige projects, what are those projects? What are they going to be? Well, will it be something like TensorFlow? There's a project that's probably already been weaponized, so that feels a little icky. React is very high up there, but React is Facebook. It's Facebook's recruiting tool, essentially. It's not clear that it's a really solid moral move to be contributing to something that benefits Facebook. Almost all of the most prestigious projects out there are projects controlled by large corporations who are essentially using these projects for [inaudible] that are very dodgy at the very least. And it feels like normalizing the obligation to contribute to high-prestige projects just is normalizing contributing to problematic projects, rather than increasing justice in the world. CORALINE: So I've tried really hard, Don, to keep this conversation as general and high-level as possible, but I do have a license that I myself wrote about a month ago and released. That is my attempt at creating an ethical open source license. The OSI has commented publicly that they don't think it meets the definition. I've been working with a lawyer, and I think we have a very convincing argument to make to them. 
The particular provision in the OSD that they're pointing to says you cannot exclude a field of endeavor in terms of the use of your software. And the argument, and I don't mind giving away the strategy because I want them to think about this before they come to it, the argument we're going to make is that a human rights violation is not a field of endeavor. What we're pushing for, in the best of all possible worlds, is that the OSI would come to see the logic of that and accept the Hippocratic License as a blessed, if you will, Open Source license. Failing that, I think forcing them to make a statement is valuable, and I think that will create some ripple effects in the community. So I want to ask you to prognosticate. I want to fast forward two months into the future, where I have done my best, in conjunction with people who understand the law much better than I do. We've endeavored to make a case to the OSI that the Hippocratic License does meet the definition. They have rejected it. There is some kind of public outcry; there are some people starting to question the authority of the OSI. There's a backlash of some sort, with people expressing that they don't think the ethical dimension can be ignored. And now the first project in the Ruby ecosystem to adopt the Hippocratic License has come out publicly in favor of an ethical source license. And let's say this is coming from the Ruby community, because that's where I have the most ties. Let's say it's a library that is used by developers in the development of their software, not necessarily delivered as a product to end users. What do you think would happen? What's the impact of that? DON: I think the impact is going to unfold along a number of different dimensions. I think from a very practical standpoint, it's not going to impact the usage of the tool at all. 
I think, for example, today a fair number of people recognize, not most people, but a fair number, that the JSON license has its clause in it about not doing evil, and people just don't care. Most developers, honestly, just don't care. They don't even know what licenses their software uses. They just want to get the thing done. So, Tierney has been conducting a sort of informal survey of license usage on the npm repository, where he takes a particular project, looks at its dependencies, looks at the licenses being used, whether a license is being used at all, and whether they're compatible. And the preliminary results, and I don't want to misrepresent what he's found, because he hasn't really presented a lot of this publicly and I'm reading a bit between the lines, but it looks like a lot of people just don't care. They'll mix and match, whatever, which is funny when you listen to people talk about enforceability and the OSI definition and the importance of having a license. They're using unlicensed code in some cases, which is just a violation of copyright law, full stop, and they don't know it because they don't dig very deep. So I have a feeling that the practical [inaudible] usage of this tool, whatever it is, is probably not going to drop in any kind of measurable way. People like to talk very loudly. On another dimension of it, there's undoubtedly going to be a lot of loud talk, because there's already a lot of talk and this is just something new to talk loudly about. But that loud talk, I don't think, will be backed up by any kind of measurable action that's discernible for [inaudible] and usage. My guess is that the loud talk will go on for a month or two. There'll undoubtedly be a talk at [inaudible] in February, probably given by somebody at the OSI, decrying the situation and talking about how Coraline is ripping the open source community apart by introducing a non-OSI-compliant license into the ecosystem in this way. That seems 75% likely. 
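The kind of informal dependency-license survey described here, walking a project's dependency tree and tallying what license, if any, each package declares, can be sketched in a few lines. This is only an illustration, not Tierney's actual methodology; the package names and licenses below are invented, and a real audit would also have to handle `licenses` arrays, SPDX expressions, and license files without a manifest field.

```python
# Sketch of an informal license audit over an npm-style node_modules tree:
# read each package.json and tally the declared "license" field.
# The fake packages built below are purely illustrative.
import json
import tempfile
from collections import Counter
from pathlib import Path

def audit_licenses(node_modules: Path) -> Counter:
    """Tally the 'license' field of every package.json under node_modules."""
    tally = Counter()
    for manifest in node_modules.rglob("package.json"):
        data = json.loads(manifest.read_text())
        # A package with no declared license is still under copyright;
        # without a license grant, reuse isn't actually permitted.
        tally[data.get("license", "UNLICENSED")] += 1
    return tally

# Build a tiny fake dependency tree to run the audit against.
root = Path(tempfile.mkdtemp()) / "node_modules"
for name, license_id in [("left-pad", "MIT"), ("some-lib", "ISC"), ("mystery-pkg", None)]:
    pkg = root / name
    pkg.mkdir(parents=True)
    manifest = {"name": name}
    if license_id:
        manifest["license"] = license_id
    (pkg / "package.json").write_text(json.dumps(manifest))

print(audit_licenses(root))
```

Even a crude tally like this surfaces the "unlicensed code" problem Don mentions: any package that falls into the `UNLICENSED` bucket is one you have no legal grant to use at all.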
But the important question, one I don't really have an answer to, is: does this diminish the influence of the OSI? And I think the answer to that is actually no, I don't think it does, in the eyes of those who are already loyal. When you have a debate about a political topic like this, studies have shown people walk away more polarized than they walked in. And I think this kind of debate will increase polarization, and I think it will lead to those already loyal to the OSI doubling down on their loyalty to the OSI. This is not necessarily a bad thing, though, because it means that those who are on the fence, those who didn't understand the OSI's position, those who maybe didn't even know that the OSI existed but were feeling its effects very indirectly, will probably have their hearts opened to the truth of what the OSI stands for. And I think to the extent that we can recruit more people into this way of approaching software development, that it has a moral component and we need to pay attention to it, the better. Because that community right now is extremely small, but I think it has the potential to be very big. Does that make sense? CORALINE: Absolutely. And I think that's actually a good outcome. And I would say that the people who say licenses like these are not enforceable, or that they're not true open source, are missing the point. The point is to get a conversation started around ethics and open source development, and to make people feel like the little voice that you were talking about before is valid and is worth bringing to the surface and worth talking about. And who knows what kind of innovation arises from a situation like that? Software engineers are very creative people. We're problem solvers, and I think raising the visibility of an issue like ethics in open source will lead to surprising inventions and outcomes. And that's what I'm hoping for. 
DON: I am so ridiculously excited to see what people who may have read my article, or who listen to this, or who follow you on Twitter, or the future community to come, what they come up with. It gives me some glimmer of optimism in these otherwise fairly dark times to think that there's something coming, something coming with the force of a train behind it, and we get to witness it. CORALINE: Don, I think your voice is a very critical voice in this discussion, and I'm very happy that you are thinking about this problem and sharing your thoughts publicly. With your appearance on the podcast today, not only did I want to give you a voice and a platform for sharing your thoughts, but it's all part of trying to make this conversation a little bit more prevalent, to give some weight to it, and to get other people talking. So I would encourage you, if you feel strongly about this topic: engage with me, engage with Don, and engage with other people who are talking about this on Twitter or whatever other venue you encounter us in. And of course, we have our own Slack for Greater Than Code. If you donate at any level on our Patreon at Patreon.com/GreaterThanCode, you get access to our Slack community, and we will definitely be continuing this conversation there with some very thoughtful, very kind, very caring, and very smart people. So, think about it, talk about it, engage in the conversation, and think about that train that's coming down the tracks and where you want to be when it arrives. Thank you, everybody, and thanks, Don, for coming on the show today. DON: As I mentioned, I'm ridiculously excited to see what comes out of this. Thank you so much for having me on the show and giving me the opportunity to share my views with your audience. 
CORALINE: It's been a great conversation, Don, and thank you for the voice that you're giving to these concerns and the voice that you're giving to people who have that little feeling in their chest. Hopefully, they'll feel a little bit more confident about expressing what their hearts are telling them is right. So, thank you.