NOEL: Hello and welcome to Episode 25 of the Tech Done Right Podcast, Table XI's podcast about building better software, careers, companies and communities. I'm Noel Rappin. After you listen to the episode, you can join the conversation. Follow us on Twitter at @Tech_Done_Right, where you can get notifications of new episodes and tell us what you think. You can leave comments on our website at TechDoneRight.io and also see our full catalog of past episodes. We're really curious what you like and don't like about the show and what you'd like to hear us discuss in future episodes, so please let us know. Also, if you want to help other people find the show, leaving us a review on Apple Podcasts is a great way to do that. Thanks.

Liz Abinante is a senior software engineer at GitHub and occasionally acts as Empress of Documentation for Ruby Together. She's an enthusiastic community member and speaker who recently gave a talk on The Social Responsibility of Coding. We talk about what that means, the value of seeing the whole context around your work, and whatever else came to mind on that topic. I enjoyed talking to Liz and I hope you all enjoy our conversation. Thanks. And here's Liz.

Liz, would you like to introduce yourself to everybody?

LIZ: Hi, everyone. It's literally me. Nice to meet you.

NOEL: What are you doing these days, Liz?

LIZ: These days, I've been at GitHub for about a month. I'm a senior engineer working on the projects feature for the website, the Kanban-style planning board, doing a bunch of really cool stuff for that.

NOEL: I have used that. Not a lot, but I've used it.

LIZ: Well, thank you for giving me a job, I guess.

NOEL: Liz, we're here to talk about a presentation that you gave called 'The Social Responsibility of Coding.' Can you tell us what you talked about?

LIZ: Yeah. I have seen a lot of stories cropping up on TechCrunch and all the miscellaneous internet websites for all the miscellaneous internet people, saying things like: this random family's life has been destroyed because an anonymous internet company mapped every unmappable IP address to their house, which happens to be exactly in the middle of the United States. They had a toilet in their driveway, like, their miles-long driveway. They had SWAT teams show up because people thought they had kidnapped children. They had people thinking that there were murders and suicides and all these terrible, terrible things happening, because obviously, the only reason you hide your location on the internet is that you're a criminal. That was sarcasm, because this is not true. But I have seen articles like this, where we do terrible things by accident.

NOEL: That particular case was covered in a really good Wired article. I think Reply All also did a piece on it that was really good.

LIZ: I read both of those and I was like, "Man, this is brutal. This is pretty terrible," and then little things would happen. For example, this isn't really a little thing, but Blue Apron has some of the most atrocious safety and labor violations of any company in California's history. California, with its vibrant manufacturing and agricultural labor force, has had all these problems, but the worst offender of all time ever is Blue Apron, a company that hasn't been around very long.

NOEL: Yeah, also not a sponsor.

LIZ: That's good. That's convenient. It would be awkward if it were.
Then there are the truly little things, like Instacart switching from tips to service fees to help pay everybody equally, but that took away your ability to provide a tip, which in the United States is a pretty standard way to offset non-living wages. Just these tiny, tiny decisions that were being made, for technical reasons or for tech companies, that removed people from the equation.

NOEL: This is a little bit different from the idea that algorithms are causing people's problems.

LIZ: Yeah. This is people making technology without thinking about the ramifications of the things that they're building.

NOEL: Right. I'm not thinking about the possibility that somebody might actually live at this place that I create as a default.

LIZ: Exactly.

NOEL: As the default lat/long for IP addresses that I can't figure out. It's flyover country. Nobody is there.

LIZ: Yeah, and the thing was so silly, because they wanted a human-readable number. If they had used the true middle of the United States, it would not have been a discernible address. It was so far away from a discernible address that it was clearly a field. But when they rounded the number to make it human-readable, like, just easy to read, it was their front door. I'm like, "Really? You didn't think to check?" And like, I get it. This was probably mostly made back when four people used the internet and three of them were using it to share files locally, so it wasn't even really the internet. It was the fake internet. They didn't check. They didn't think. I kept wondering, with things like the Volkswagen case, where the engineer actually got jail time and a fine for defrauding the government about what he did: are you willing to go to jail for the code that you write? Are you willing to go to jail to ship the feature that your manager or your product manager or your VP tells you you have to ship?

NOEL: That is not a question I have ever asked in a job interview.

LIZ: No one has ever asked me, "Are you willing to go to jail for your code?"

NOEL: No, but I've been in situations. I have done work with healthcare. It's been a while, but I've done work with healthcare stuff and financial stuff, where you bump up against legal responsibilities very, very fast, especially in finance -- especially in those two areas. Certainly, as an engineer, you need to know where the limits of the law are so that you can tell when you're getting close to them.

LIZ: For sure.

NOEL: I don't know. I may or may not have done... I have definitely done things that were not PCI-compliant at a client's request. Over my objections, but still.

LIZ: There are true legal issues, where something like PCI compliance is very black and white. You know what you need to be doing. And then there are fuzzy things like data storage policies and data retention policies. Do you really, really need to be storing and tracking all this data in perpetuity? Not only does that seem a little dubious, like, how safe is that data, but also environmentally: the amount of energy you have to produce to keep those servers running, to store that information, is that worth it? Environmentally, is it responsible to keep saving all this data just because we can? The cloud has given us this ability to do all of this stuff that we couldn't do before. Honestly, the cloud is probably why I love engineering so much, because it just makes my job a lot easier, but at the same time, do you really need every single piece of data that you are storing?

NOEL: Right, and it enables us to sort of not see the cost.
I think one of the issues here generally is people not seeing the costs of their actions, or the potential costs. We can just save something to the cloud, and it's in a cloud, right?

LIZ: Uh-huh.

NOEL: But yeah, there's a real data center somewhere in a field. They could have used that as the default address for those IPs. For it to be some data center, that would be so funny. Or if it was Area 51 or something like that.

LIZ: And we're going to get to the point where the cloud has acid rain. There's going to be something that goes wrong here. Is it responsible? Do we matter? Do we care? These are all very heavy philosophical questions, but at the same time, we should be asking them. We should be more aware of what we're building and how it's being used and the side effects of what we're building.

NOEL: We have a couple of different issues here. One of which, just in terms of data storage, is being responsible about limiting data and thinking about what you need to store and how securely you need to store it. Obviously, there have been all kinds of stories, especially recently in the news, about companies that have used terribly irresponsible practices with very important data.

LIZ: But we're also not thinking about the product decisions that we make and how they actually affect end users from, I don't want to say an ethical standpoint, because that feels very heavy-handed, but from, like: are we doing good things? Are we doing bad things?

NOEL: Right. One question is, "Are we being responsible with the choices that we're making with our data?" and a whole different level is, "Is this product a good thing to be working on?"

LIZ: Yeah, exactly. There are some things that are obviously bad. If you make things that enable people to do harm to others, like you're actively choosing to make harm easier for people, that's bad. That's very clearly bad. But something like being a privileged tech worker making six figures and eliminating the ability to tip someone who's just trying to make a living wage, that doesn't necessarily seem bad on the surface, but it's pretty mean.

NOEL: Right, and then you have things like Uber. Forgetting about their business practices for a second, if you think of Uber as a business, Uber, or Lyft, or those services, is tremendously convenient, but they're also convenient and cheaper in most cases because they're undercutting government regulations in many situations. It's also true of Airbnb. Some of those regulations are potentially there just to protect existing taxi companies and probably shouldn't be there, but some of them are there to protect consumers or drivers and probably should be there. Is that a good thing or a bad thing? Sarah Mei had a recent tweet storm about trying to dig out which side of the gray area Uber was on just as a business, regardless of their terrible practices. How do you, as an engineer, approach these kinds of things? You're working on a specific tool right now; how does this come up in your day-to-day or your project-to-project existence?

LIZ: Happily, it has not come up a ton, because I've only been here for a month.

NOEL: Right, but you've also worked at other places.

LIZ: Oh, yeah. I worked at New Relic, the land where all of your data lives. If you were a New Relic customer, they have all your data. Not your customers' customers' data, but they store freakish amounts of data, and there are a lot of conversations that I can recall, like: do we really need to save this? Do we really need to have this information?
A lot of the time, it came down to, "No, we don't actually need to have or save this information." Every time we would work on a new feature or need to save a new piece of data or need to manipulate a piece of data, it was kind of like, "Is this okay?" I was always deeply concerned about having access to our customers' customers' information. We shouldn't, and we never did. New Relic did a really great job of that. But still, you have this enormous data warehouse, so you care a lot about security, and there was a great security team at New Relic. And you care a lot about the integrity of the data. You want to make sure that you're only looking at the data for the person you want to be looking at. I developed, I would like to think, a very healthy paranoia around my users' data. I have a very firm respect for the people who put those safeguards in place. People who know me are not going to be surprised by this statement: I was never afraid to stand up and say, "We shouldn't do that. It's wrong," in a feature planning meeting or road-mapping session. We would say, "We have all this data. What if we could pull this correlation information together and draw these conclusions?" Should we be doing that, though?

NOEL: A very scary place to start is: we have all this data, what can we do with it?

LIZ: Right, and that's a cool place to start, but it's also a terrifying place to start, especially when your customers put so much trust in the places they provide their data to. Think about your Gmail account, for example.

NOEL: Oh, sure.

LIZ: You put so much trust into Google to keep your Gmail account safe, but we've all seen the horror stories of people getting locked out of their Gmail accounts and not being able to get back in, because there's no customer service to speak of.

NOEL: Or just being sort of creeped out by advertising that seems to be based on the text of their emails.

LIZ: Because we have all this data, what can we do with it?

NOEL: Right, and one of the things that you can do with it is micro-target, and there have been a ton of articles in the news about using micro-targeting in ways that are probably unethical.

LIZ: Yeah. And I used to work in the ed-tech space. I made an online learning management system at one of my earlier engineering gigs, and I was always deeply concerned about the privacy of minors, but also, the legality around minors using the internet is so complicated. We would be building features, and one of the features would be related to a parent being able to have access or not have access to something, and I was always a very strong advocate of, "What about the privacy of the child?" This is important.

NOEL: There are situations where... in the school district where I live, children are given Chromebooks that have camera capabilities, and the teachers, at least in theory, have the ability to see the students' desktops, probably not the camera but definitely the desktop, even when they're not in school, which I think is a little bit creepy.

LIZ: That's creepy.

NOEL: It definitely goes to a place of not really thinking through some of the dimensions of allowing that access.

LIZ: Right, because could you imagine if your company-sponsored computer could just turn on your camera at any moment during the workday? Just during the workday, they could turn on your camera and see what you were doing. Creepy? Yes, very creepy. Violation of your privacy? Definitely, kind of, sort of.

NOEL: Probably legal.

LIZ: Right.
NOEL: Which is also disturbing, because I feel like I see that story every couple of years, where there's some company that is keylogging their employees or trying to figure out when their employees take breaks or something like that, in a way that feels super invasive.

LIZ: Because it always is super invasive, and it all comes down to that. I like to summarize this talk as, "Just because you can doesn't mean you should." That's really the emerging theme.

NOEL: When you're in the situation, and it sounds like you've been in the situation more directly than I have, of saying, "This is not a good thing to do," how does that play out? If you're an engineer and you are in a situation where you feel something disturbing or unethical or illegal is going on, what do you recommend that people be prepared to do? How do you prepare for that kind of thing, or is it just something that happens in the moment?

LIZ: For me, it was just something that I kind of fell into, but also, I feel very secure in my job. I feel very respected in my industry and in the role that I fill. I feel I'm very secure where I am, so for me, it's very easy to kick up a fuss about stuff that I think needs to have a fuss. I feel I can really get in there and make a ruckus. But other folks who may have objections may not have that same level of job security, or they may not have built up a reputation in their industry, so it's incredibly risky for them to go out on a limb and be like, "We shouldn't do this. It's pretty terrible." People like me, and there are a lot of people like me in this industry, people who are relatively secure in their positions, relatively well-respected in their area of expertise, have a little bit more leverage. I'll just say it: they have more power in these kinds of scenarios, and leveraging that power is the most important thing, because not everybody has it. For those of us who do, it's being vocal and succinct and clear about why this is terrible and how it could not be terrible. In the case of an online learning management system, where you're building features for use by minor children, where parents could potentially see their activity in some way, it's very easy to come up with an alternative there. It's very easy to say, "Instead of giving them this permission, what if we gave them this permission? What if we redacted this data?" There are so many different options there that it's very easy to have an alternative solution, and if there's been one theme throughout my career, it's 'don't be the problem-person, be the solution-person,' which is sometimes kind of garbage but sometimes kind of true. I think when you're bringing up concerns like, "Are we doing a good thing? Are we doing a responsible thing? Are we taking our customers' data and keeping it safe? Are we doing things with integrity in that scenario?" it's just different. It's easier to be the problem-person in that scenario. You don't always have to be the solution-person, because that's a really fucking hard problem. That is such a hard problem. It's so hard to have a concern about privacy, about data integrity, about even ethics, and just come up with a solution. There's a whole discipline. Ethics, it's a thing. You can study it. Really. I swear. There's a whole industry around what's right and what's wrong and how you determine that, so you don't always have to be the solution-bearer. In this scenario, when we see a problem, we've been conditioned in this industry to have a solution, and you don't have to.
But you do have to understand the problem, and it doesn't hurt to get some people on your side. Pull in all of your teammates or pals and be like, "Oh, man. We're going to build this thing and it's going to put everybody's data up as a live stream in Times Square just because we can, and this is terrible. We shouldn't do this." To be clear, no one's ever made me do that. But having people on your side always helps. If you feel that something is risky or dangerous or bad for your customers, honestly, odds are you're not the only person. Groupthink is super powerful, but you can fight it. I believe in you.

NOEL: Groupthink can work both ways. Is there a sense that you need to pick your battles and not be thought of as the person who is always raising concerns and therefore gets dismissed?

LIZ: Yeah, as a woman in tech, I feel like I always have to pick my battles. But when it comes to the safety, privacy and integrity of my customers' data, I never pick a battle. That absolutely comes first for me. I'm not going to be the person that the Equifax CEO points to and is like, "This single engineer didn't upgrade something."

NOEL: The one person at Volkswagen who knew about the problem. I'm sure, just the one.

LIZ: Just the one. Only one person in this multi-million dollar evil scheme.

NOEL: That's a piece of advice that people give you: if you are in a situation where you don't feel you have power but you see something that you are uncomfortable with, then one strategy is to try to get other people on your side so that you can build power through numbers. There's also the ability to frame the issue in such a way that other people can come to the conclusion that you've come to. I think a lot of times, what happens here is that this kind of thing winds up being a framing issue or a perspective issue. Sometimes, you can get people to come to your conclusion step-by-step --

LIZ: As opposed to getting into the debate and then winning the debate.

NOEL: Right. As opposed to starting off in a posture that might put somebody in a defensive position, you sort of walk them through it. I had a situation, actually, it was something I spoke to you about. It was not something that was affecting customers, but it was a potential policy affecting employees that I found a little bit disturbing, and I was having trouble getting people to see my perspective on it. Eventually I was able to, by role-playing with people: imagine you are X in this situation; imagine that the situation was playing out not between you and another person of goodwill, but between a hypothetical bad actor with power and a hypothetical person without power. Sometimes just exposing those scenarios to people can help them see a problem that they're not already seeing.

LIZ: Because the classic problem is, most good people think that most people are good people, and that's not true. Humans are garbage.

NOEL: And we have our pull quote for the episode.

LIZ: Well, there you go. It's just one of those things where it's really easy to think that good intentions are good enough. That's probably the story of most people's lives and honestly, that's okay. Not everything is your fault. I'm not going to hold one engineer responsible for one shitty product decision. Somebody, somewhere put their foot down and said, "We're doing this," and ultimately, people are like, "I'm not going to lose my job over not building this." Fundamentally, it could come down to that.
It could come down to you putting your foot down and saying, "No, I'm not going to build this. Put me on a different team. Put me on a different project," or you have to find a different job. That's always a possibility in these scenarios. Thankfully, I've been fortunate enough that I've never had to run into that. And when all of the Uber catastrophes were exploding in the past year (for me, it's been since Uber has been a company), everybody was tweeting about how they can't believe people actually work at Uber, or they can't believe people actually do this, or how they're never going to hire anybody who works there, and I'm sitting here thinking, not everybody has the ability to choose where they work.

NOEL: It's hard for me to fault somebody for staying in a job, even if it's a bad situation, if they don't see a way out.

LIZ: Exactly.

NOEL: At a company where the policies are bad, to me the responsibility lies higher than that: with the people who are making the decisions, with the people who are potentially hiring developers who have low leverage on purpose, so as not to have people who have the comfort level to raise concerns. Like I said, I feel empathy for engineers who are in tough situations. I hope that you can help them find some tools that they can use in those situations, but ultimately, everybody's got to make their own decisions, I think.

LIZ: Yeah. Because stuff like this is so personal, what feels wrong to me may not feel wrong to someone else. It's really tough to make a call sometimes. I would love to say that if GitHub asked me to mine Bitcoin in everyone's browsers, and then also log all their information, and then also steal their code, and then also publish it on the internet even though it was a private repo, and then also print it and mail it to people, and then also other terrible things that I can't think of, I would like to think that I would say, "No, I'm not going to do that." But at the same time, I also have a mortgage. I have a dog to feed. You know, priorities. So it's all super, super hypothetical until you're put in that terrible situation, where you're like, "Ahhh!"

NOEL: It's easy to be free with somebody else's job. I think that no job is perfect; especially a larger company might have a history or a division or something that is going to be challenging.

LIZ: I think when you work in tech, the kind of consensus I have come to over the years has been, "Everything is awful, but pick the awful that works for you and doesn't destroy you every day," which is so pessimistic when I'm sitting here being like, "You can do it. I believe in you. Build good things." I'm a glass half-full and a glass half-empty kind of lady, is what we're saying. But it's an individual decision. It's a personal decision. Sometimes, there are things that are obviously terrible, but for the most part, you have to do what is right for you. If you can't sleep at night because you think you're doing something terrible, that's probably a sign.

NOEL: On a career basis, it helps to think about the situations that you would not want to be in and try to guide your career so that you avoid those situations. I have not spent a whole lot of time working at super large companies, and one of the reasons for that is I don't like working at super large companies, and one of the reasons for that is that the possibility of weird situations that I would be powerless over seems higher.
LIZ: I can confirm, having worked mostly at large companies. But at the same time, I love large companies, because I feel like they're very stable and very secure. As a single income earner, that is super attractive to me. This company is not going to go anywhere, because they employ hundreds, if not thousands, of people.

NOEL: Sure. I worked at Motorola, which, I guess stable might not be the right word, but that was certainly part of the appeal. They had processes in place. You knew what you were going to be doing for the next six months and you knew what your next step on the career ladder was, and that was all very reassuring. That's on a career basis. On a project basis, I think it just helps to think through. It's terrible. You have to think like a terrible person.

LIZ: You do. You have to think, what is the worst person going to do with this?

NOEL: And it's amazing to me how often I see things go out where it's clear that nobody has ever thought about that, even now, even after almost a decade of social media, which pretty much lives on ignoring "what would a terrible person do with this?"

LIZ: I legitimately don't understand it. Like, how has Twitter not figured it out?

NOEL: Yeah. Twitter is baffling.

LIZ: Yeah, that's a good word -- baffling -- I like that one.

NOEL: As I was coming into work today, I was reading about Amazon Key, which is allowing Amazon delivery people to get access into your house. I have not read anything other than the tweet-length version of that, but that sure seems like --

LIZ: Oh, my God. I have not heard of that, and I'm like, absolutely not. "This is how you get rape-murdered in your sleep."

NOEL: It sounds like a terrible idea and yet --

LIZ: What would that do to your homeowners insurance prices and your renters insurance prices, that this random, faceless company, with an enormous customer service machine where I can't speak to the right person about the right thing unless I'm on hold for two hours, has a key to my house?

NOEL: Yeah, I assume it's some sort of one-time, fancy digital key that you can enable.

LIZ: That's super not hackable, totally legit.

NOEL: Because that kind of Internet of Things thing, they have a sterling security record.

LIZ: Oh, my God. I'm living in fear now. What have you done? Why did you tell me about this?

NOEL: I don't think they're going to make you put one in your house, Liz. This is also another argument for diversity and inclusivity in teams: the more people you have on a team who have had a different range of experiences and different ranges of terrible people in their lives, the more likely you are to have people come up with objections to things that should be objected to.

LIZ: Uh-huh, and it's like, how would you know that this technology can be used to stalk someone unless you have either, A, stalked someone or, B, been stalked? Who do you think in that scenario is more likely to be concerned? And who is more likely to actually take advantage of that feature? It's the saddest thing that having lived a life and had bad things happen to you makes it easier for you to see when bad things might happen to other people, but at the same time, that's just the world that we live in. I want to protect everybody and hug them and keep them safe and keep their data safe and keep everything private.

NOEL: I think the other important thing there is, when somebody on your team comes in with a different perspective and says, "I think this could be misused," you need to listen to them.

LIZ: Oh, for sure.
You've got to hear it out and you've got to ask questions and follow up and make sure you really understand where they're coming from.

NOEL: Because you don't necessarily need to be the solution-person to raise the problem, but eventually, somebody needs to come up with a solution, even if sometimes the solution is, we just don't do this.

LIZ: Yeah, and sometimes the solution comes out of that conversation. The best teams I've worked on have been the teams I could just have conversations with about what we were building, without being concerned about the underlying technology.

NOEL: I think the one thing I would recommend for people is to just be aware of the context that they're coding in: what the actual business goals are, and if you're at a product company, what actually is going to affect and not affect the business, to the extent that you can determine that. For me, at a consulting company, a lot of times that's being aware of the client's business and being able to ask questions like, "Is this really going to help your business?" I've certainly had discussions with clients on a less intensive level, but definitely on the level of clients collecting demographic data that I didn't think they were going to need. In this case, the consequence for them was just a longer checkout. It wasn't even so much an ethical discussion, although it was a little bit. My concerns were both ethical and technological, but you have to be able to make the case that you think their business goals are this, and this is actually not going to help them. To do that, you need to have an awareness of your surroundings.

LIZ: You have your individual customers, and then you have how your customers are related to one another. Like you said, that demographic data: do you really need that, and is it good for business? Is it really worth the effort? That's your individual customer. But once you've collected that data, how do you make sure that you don't accidentally expose it to the wrong customer, and then how do you keep all of your customers' data safe from outside forces? You have all these layers, and you're like, "Liz, I'm so tired. I don't want to do this," and then I would just say, "Hire a really good security team and listen to them and collaborate with them and don't put them in a corner by themselves."

NOEL: Yeah, listen to the people who know what they're doing.

LIZ: Just listen to people. I mean, maybe not the bad ones. Listen to people and let people feel like they're heard. If someone is sitting in the corner and is like, "This might be a bad idea," instead of being like, "Well, we don't really have time to discuss that in this meeting right now," maybe be like, "Oh, interesting. Could you briefly summarize that for me?" and let them do their summary. Then if you truly don't have time, you make the effort to do the follow-up. You make the effort to make sure that voice, that opinion, that perspective is heard and that it is validated in some way, because there's nothing worse than finally getting the nerve to raise your hand and somebody being like, "Nope," without even hearing you out, just being completely dismissive. It's so heartbreaking.

NOEL: Right, and from that person's perspective, that of course dramatically reduces the chance they're going to bring up another concern, and weakens the possibility that you're going to find things.
How many of these terrible ad campaigns and terrible products and things like that would be stopped if three people felt they had the agency within the company to say, "This is a terrible idea. Don't do it"?

LIZ: Humans have the capacity for such wonderful things, but we're also complete idiots. We are so internally focused on ourselves, and we think about ourselves first, and you know what? This is exactly the advice I give to people negotiating salary. I say, "Put yourself first." It's a double-edged sword with pros and cons.

NOEL: But what I think is interesting about this is that allowing everybody to put themselves first in a meaningful way actually is helpful here. If you're the person who has identified an issue, your ability to raise that issue is part of putting your own concerns forward. You can build a process and a structure where people putting themselves and their issues forward is beneficial to the process, and you can put together one where only the terrible people push themselves forward.

LIZ: Ultimately, the goal is that people aren't just raising concerns. People are also bringing ideas to the table. It's super hard to be like, "This is a bad idea." It's also super hard to be like, "I have a good idea."

NOEL: Yeah. To rephrase what I just said terribly in a better way: people basically are people, and it's process and structure that enable you to put people together in a way that is really effective and empowering, or bad process and structure let you put people together in a way that is not empowering, which is harmful, both to the people involved and to people externally. I was just listening to a podcast that was making a distinction between hospitals that treat incidents as car crashes and hospitals that treat incidents as plane crashes. At a car crash hospital, something goes wrong and they're just like, "It's a cost of doing business." At a plane crash hospital, if something goes wrong, they treat it like a plane crash: they do a full investigation of what happened, ending with a recommendation for how to prevent it from happening again. It's the same people, doctors and nurses and things like that, but one has a process in place oriented toward continuous improvement and toward people raising concerns and having those concerns addressed, versus one in which nobody thinks their concerns can be addressed and therefore nothing gets better.

LIZ: Yeah. I have been there. I've been in a situation where I felt like nothing I said would be heard, or if it was heard, it would be dismissed immediately. I felt like I had no power to speak up for users, let alone myself. I found throughout my career that once I started speaking up for users, it was a lot easier for me to start speaking up for myself. For some people, that's not as big a concern, but for me, throughout my career, that has been something that has been difficult, despite being a very outspoken and, I like to think, brave person at times. It's really hard to stick up for myself, and you know what? If I can throw down a gauntlet for rando users who don't know me from Eve and don't care who I am, just that I build a thing that works, then I should also be able to stand up for myself. They're related skills, really.

NOEL: Coming back to the social responsibility talk itself: who were you trying to reach with the talk, and do you think you got to them?

LIZ: I kind of hope I did.
I had the topic rattling around in my head for a while but hadn't found what I thought was a good audience. Then there was this amazing conference in Canada. It's like the Canadian university students in computing or something. It's CUSEC; I can never remember what it stands for. It's a conference that all computer science students in Canada can go to, and it's really super affordable. They typically get travel grants, and they were like, "We would love to have you as a keynote," and I was like, "I have the best topic for you," and I told them what it was and they were like, "This is fantastic." It was in January in Montreal, the week that the 45th president was inaugurated. It was a few days before that, so it was an especially apt time to try to bring people up from being a little bit down. I kind of ended the talk with, "I know that a lot of you have internships. I know that a lot of you are starting to work in this industry. I know that a lot of you want to work in this industry, and I expect every single one of you to make this industry not so terrible, so help us make better things." I had a lot of students come up to me afterwards and they were like, "Oh my God, I never realized that I could object to what I was being told to build," because they're so used to being in school, where it's like, "Here's the goal that we want you to learn; build something to learn this goal." They're very much in build this, do this, learn this, study this mode, and while there is discourse, they don't necessarily get input into their curriculum. For them, it was like, "I can go into a job and ask, 'Why are we building this? Is it a good idea to build this?'" It also brought up a lot of, "What if I get into a situation where this is the only job and I can't leave?" I gave them the not-literal but emotional hug, and I was like, "It's okay. We've all been there. You have to ultimately take care of yourself, but move forward and do good things. I believe in you," and I did the little motivational penguin dance on stage: "Yeah, I believe in you. You can do it. Go do it. Woo-hoo!" and they're just looking at you like you're not a cool lady, and I'm like, "Yes, I am."

NOEL: I am so sad that there's no video of this yet.

LIZ: There's a video somewhere. My lovely Canadian friends, I'm sure, are working on it. It was a delightful conference. If you ever get invited to speak at this conference, or if you ever have an opportunity to go to it, it's a really good one. I really enjoyed it.

NOEL: It does sound really neat.

LIZ: I normally don't advocate going and meeting a bunch of college students, having been a college student myself, but they were bright and eager and excited to meet you, in January, in Canada.

NOEL: That sounds like a lot of fun, honestly.

LIZ: It's cold, but it's fun.

NOEL: It was cold and not fun where I was that week. You had empathy or ethics training as part of your initial bootcamp experience into the tech industry, right? Was that useful?

LIZ: Yeah, to a certain extent, I did. I think for all the dudes that were in my class, it was useful, but for me, it was a bunch of people in a room staring at me, expecting me to model the behavior better than them, so it was this really tedious exercise in "you should just listen to other people." No, really. All you have to do is just listen. You should listen more than you talk, and as a very chatty person, this is a very difficult thing for me to do.
I think it was useful for some folks, but for me, I have done a lot of things throughout my life. I've tutored autistic children, and if you want to learn empathy, tutor autistic children. That will sure as shit teach you empathy. I have worked with animals. I have worked with literally every population of people I could possibly think of. I worked with a domestic violence coalition, where I took survivor calls and resource calls and lobbied for funding and did training. I was a nanny. I've done so many different things. The overarching theme was I helped people, so I had to listen to people. A lot of the people that I met when I was new to engineering had not worked with people. They had worked with computers, for people. I think that, honestly, the best empathy training that I ever received was working in positions where I was serving people. Not literally serving, although I did actually waitress for many years. I had to put the needs of other people as my top priority. It was my job to make sure their needs were met.

NOEL: Yeah, which is not the way most computer technology professionals approach their jobs.

LIZ: Eh...

NOEL: Stereotypically, that's my point. Listen to people; your users' priorities, or your customers' users' priorities, come before your own; and look around and understand the context of the work you're trying to do. Is there another takeaway that you would recommend for people?

LIZ: I think my takeaway is really that it's super easy to get carried away with all the cool shit that we can do, and it only takes a second to stop and ask, "Should we be doing this?" That's all I'm asking for. Just take a quick sec and be like, "Should we be doing this? Is this a good idea?"

NOEL: Thanks for being here, Liz. I hope this was a good idea. I had fun.

LIZ: I had fun. I think it was delightful. It was lovely to chat with you and talk about this depressing but also uplifting topic, where we will all become better people, right?

NOEL: If people want to find you on the internet, where can they find you?

LIZ: The best place to find me is on Twitter. I'm @Feministy. You'll see my big old grinny face and then some really cool sheep. I strive for that high-quality branding, but I'm Feministy literally everywhere on the internet. That's just where I live.

NOEL: Great. Thanks for being here, and we'll be back in a couple of weeks with another episode.

LIZ: Bye, everybody.

NOEL: Tech Done Right is a production of Table XI and is hosted by me, Noel Rappin. I'm at @NoelRap on Twitter, and Table XI is at @TableXI. The podcast is edited by Mandy Moore. You can reach her on Twitter at @TheRubyRep. Tech Done Right can be found at TechDoneRight.io, or you can download it wherever you get your podcasts. You can send us feedback or ideas on Twitter at @Tech_Done_Right. Table XI is a UX design and software development company in Chicago with a 15-year history of building websites, mobile applications and custom digital experiences for everyone from startups to storied brands. Find us at TableXI.com, where you can learn more about working with us or working for us. As I record this, we have three positions open, for software developers, frontend developers and designers. You can find those at the TableXI.com website. We will be back in a couple of weeks for the next episode of Tech Done Right.