SAM: Hello and welcome to Episode 52 of Greater Than Code. I'm Sam Livingston-Gray and I'm here to introduce my co-panelist, Jamey Hampton.
JAMEY: Thanks, Sam, and I'd like to introduce our other co-panelist, Coraline Ada Ehmke.
CORALINE: Hey, everybody, and welcome to Episode 52, as Sam said. We've been on the air for just over a year now, which I'm super excited about. We have a special guest with us today, Emily Dresner. Emily is the CTO of Upside Travel. Prior to joining Upside, Emily was a product engineering lead for Luminal, and web team lead and principal architect on The Elder Scrolls Online at ZeniMax Online Studios. Emily has a Master's in Computer Science and Engineering from the University of Michigan. She's also a certified scrum master and a fanatical Michigan football fan. Welcome, Emily.
EMILY: Hi. Thank you so much for having me today.
CORALINE: Emily, we always start the podcast with the same question, and that is: what is your superpower, and when and how did you discover it?
EMILY: My superpower is my ability to know a guy. I have an uncanny ability to network. I always seem to have somebody in my pocket from college or from previous work or from online or through some networking facility who happens to know some weird and obscure fact. I've noticed this more and more as tech has gotten more specialized and I need to reach out to friends and get an answer on something very obscure and strange very quickly. That's my superpower.
SAM: That is nothing to sneeze at. Networking is super useful.
CORALINE: Were you always that way? Did you develop that skill at some point in your professional career?
EMILY: More or less. I don't let people just go when I move on. I usually maintain a relationship, especially with my old college friends, who have all moved all over the place. A lot of them have PhDs in very specialized fields these days. I keep up with a lot of people on Twitter especially, not so much on Facebook. Mostly Twitter is my jam, but just over time, I know people who happen to know a thing and I can usually reach out to know a guy to get an answer for something.
CORALINE: I'm pretty good at keeping up with people on Twitter, but I talk to two of my college friends maybe once every six months and I barely talk to old coworkers, so I don't have that power.
JAMEY: If you're not on Twitter, you don't exist to me.
CORALINE: Yeah, exactly.
SAM: Wait. What are these college friends of which you speak?
CORALINE: I have vague memories of people from college but I don't think any of them went on to get PhDs.
SAM: Speaking of advanced degrees, I noticed that you have a Master's in Computer Science and Engineering. It's rare enough in this field to find people with a bachelor's degree, and a master's seems even more rare among working engineers. What's that like? Is that a path you would recommend to others?
EMILY: A master's degree, especially from someplace like Michigan, is mostly a signal that you have the commitment to follow through with a major project from beginning to end and that you have the capacity to read technical research papers. Those are really the two things that getting a master's gives you. It's a funny little story. When I was getting my bachelor's in computer engineering from Michigan, I went out to Intel, and this is a very long time ago, and I had a two-day-long interview process with them, and the only job they would offer me was helpdesk tech.
Four years of a hardcore engineering education didn't seem to measure up to being a helpdesk tech in Portland, Oregon, so it was a little bit of an insult. They sent me an offer letter and I put it through a shredder in the EECS Building at the University of Michigan. It was still in its brown envelope. They did call me back exactly 16 times (I took real notes) before they stopped calling me for a callback and I actually —
CORALINE: Maybe you have some more to add to their professional network on LinkedIn.
EMILY: Perhaps. I turned Intel down hardcore and I applied to the Michigan Computer Science and Engineering program at Rackham Graduate School pretty much that day. I got in, I did two additional years, and I specialized in distributed operating systems, which is funny, because back then it was called distributed operating systems and today it's called the cloud. Apparently, I was just about 15 years ahead of my time. I thought that giant computers were really cool and I wanted to build things that were so big they spanned the planet. Getting a master's degree gave me some of the deeper algorithmic knowledge to be able to do it, but I didn't get to use some of that knowledge until almost 10 years after I got my degree. It was just years ahead of its time. I ended up spending a lot of time in security instead, because I was doing operating systems and a lot of security is actually at the operating system level. I like fiddly algorithms and crypto was just a lot of fiddly algorithms. That's a badge for a thing that I did and had to pay for, which I did, in cash.
SAM: And time and opportunity cost.
EMILY: And opportunity cost, yes.
CORALINE: I don't know how people can do that, honestly. In my teens and 20s, I was a big fucking mess and I ended up running out of money and dropping out of college, so I have respect for people who can finish that at such a young age and know themselves enough to do the work, because I definitely was not capable of that.
SAM: Same here. I went to college twice and dropped out twice before the third time stuck. The third time was when I went back at 27, and I was highly motivated, after having a bunch of crap jobs, to actually do the work.
CORALINE: The second part of Sam's question, Emily, was: is that something that you recommend for other people?
EMILY: It depends. I think if you are a woman in engineering, I do highly recommend getting the master's degree. It helps when you're just starting out to pull your resume out from the rest of the pack. It's just a little bit of a nudge that says, "I have this much more training. I think about problems in a slightly different and deeper way." If you're a woman or a person of color, then I actually do recommend that you get the master's degree. I don't recommend you get the PhD, unless you want to go into data science; then I actually do strongly recommend a PhD. But if you just want to go into industry engineering or engineering management, I do recommend a master's degree for anyone who feels like they need to get their resume onto the top of that stack, to get a second look. It served me really well for… I don't know, almost a decade of being able to get my name in front of a hiring manager just that little bit more easily than if all I had was my bachelor's. I was not offered any more helpdesk jobs from Intel after that, let's put it that way.
SAM: It's almost like you're saying that women and other underrepresented minorities have to work harder to be seen as just as good.
CORALINE: What?
JAMEY: In tech?
CORALINE: But it's the meritocracy, Sam. Come on.
SAM: Right.
EMILY: Yes, if you're a woman in engineering, it is not a meritocracy out there, unfortunately, and these little things help.
JAMEY: I've heard some people tell stories about being told that they were overqualified because they're too advanced of an engineer. Is that not something that you've personally experienced?
EMILY: I've never been told that I'm too qualified because I had a master's degree, but I've always aimed for the top. I'm never interested in those intro positions. I'm always interested in the senior positions, so it was never a topic that came up. There were other topics that came up, but "you're overqualified" was never one of them.
SAM: Was everything easy for you after you got that master's degree? I'm sure the answer is no, so what are some of the struggles you faced, even being at the top of that pile of resumes?
EMILY: I never really had a problem with the interviewing part of getting through the door. It was always after getting through the door that the cultural problems really started to sink in. I can stand up at a board and write code off the top of my head in a number of different languages. My jam was always straight-up C and I can still write straight-up C on a board, even today. I remembered all the algorithm questions and all the algorithm patterns and all that, so getting through the interview is not really that hard. It's after being in the job that the challenges start to emerge. For much of my career, I was the only woman, not just on the team but in the department. In fact, that was the way of it, believe it or not, until I was in the driver's seat to make hiring decisions and could actually start hiring women myself. There was one job where there was a woman on the team, but I'm just ticking off all of my positions in my head, going all the way back to the beginning of time. In my first job I ever had, I had a woman manager, but it was hospital IT. That's a different world, because it just is when you're living in a world of nurses and doctors, and they have a different take on things, which is something I didn't realize. But my second, third, and fourth jobs didn't have women on the teams, then my fifth job, and then my sixth job, where I was finally a hiring manager. Out of all of those, there was only one where there was another woman on the team, and that made two of us. Throughout my entire career, it wasn't so much the interview process or the callback process or the getting-in-front-of-hiring-managers process or even the getting-in-the-door process. On the team, there were never any peers. It was interesting. There were never even peers on the team or in team leadership or in product. Usually, I had to go out to marketing before I would find any other women who happened to be there. Then once I was in the driver's seat, where I could start making some hiring decisions and resumes got to me, I started hiring women, which is something I carry along with me through my career now.
CORALINE: Did you have any role models when you were coming up, if there weren't very many women in the field? Who did you look up to?
EMILY: My mom. My mom has a PhD in molecular pathology. She is a director in pathology and an associate professor at the University of Maryland in Baltimore. She actually didn't get her PhD until she was in her late 30s, while I was around.
But the story is, she went to Michigan Tech and she actually wanted to be a plant biologist. She got into the PhD program at Michigan Tech for plant biology, and just before she started her program, they yanked her position and all of her funding out from underneath her, because she was informed that a friend of the dean for the program had a son and they needed to find him a place, and because she was a woman, they were going to prefer him over her. If she wanted to do plant biology, she had to leave. She worked as a medical tech, and then she went back and got her PhD at Wayne State, and then she was the head of one of the major labs at the American Red Cross, and then she moved to the University of Maryland in Baltimore. My mom's always been a symbol of perseverance against pretty much everything. She's still there today, so that's who I look up to.
JAMEY: That story is unbelievable. It's really not unbelievable. It's just really sad.
EMILY: It was 1971. I got a reflection of it in 1991, and I have a 12-year-old daughter who wants to go into applied mathematics, and we will see it again in six years.
SAM: It sounds like you've experienced a lot of isolation, mostly being the only woman on your team. I've seen a lot of chatter on Twitter this week. Marco Rogers posted some stuff about the double bind that he experiences as a person of color in tech, and then I think Sarah Mei was tweeting about some of this as well. I've seen a bunch of other people talking about the double bind where, as a woman in tech, you are either told that you're too aggressive or that you're too nurturing, or that you're both at the same time, which is a special trick to pull off. Have you experienced any of that?
EMILY: I had a manager for five years who, in every single one of his yearly reviews, informed me that I was too aggressive and I would never move forward in my career unless I was more female. I ignored his advice because it was bad.
SAM: Literally, he said "more female"?
EMILY: Yeah. I needed to be more of a woman and more caring and think more about the feelings of the men around me. I would make him more comfortable if I changed in that way. That hasn't happened.
SAM: Glad to hear that.
EMILY: I'll let you know if it ever does.
CORALINE: I've definitely been on the receiving end of being called aggressive or abrasive, and there's some truth to it, because when I know I'm right, I do not yield. I don't always have that sort of confidence, that I'm that right, all the time. But when it does come up, I am relentless in my pursuit of doing the right thing, and that has definitely caused me a lot of trouble in the past. How do you think your career would have been different if you had taken that advice?
EMILY: I don't think that I could have physically taken that advice. When I went to go build the game, I don't think I would have survived that environment. It's an interesting world, video games, and it's funny because I have done video games twice. Media Station is actually a game studio. I don't learn from my failures, apparently. But I think if I had taken that advice and hadn't learned how to stand my ground, I don't think I would have been as successful at ZeniMax Online Studios as I was. This double standard, where you have to be warmer and more nurturing, yet you're too aggressive, but also "I need you to be more aggressive and defend your decisions," is just bad advice, and you really have to find your own way in the situation you're in at the time, if that makes sense.
CORALINE: Definitely. I'm interested in hearing about how working at a gaming company was different from working in a more traditional tech environment. I have a friend who specializes in programming AI for bosses in video games, and she's told me some real horror stories about what game studios are like and the grind and the long hours. I'm curious how your experience was in that industry and maybe why you left it.
EMILY: There's an old David Foster Wallace essay called 'A Supposedly Fun Thing I'll Never Do Again.' In that case, it's about him being on a cruise ship, which is also a supposedly fun thing that I don't really want to do again. The video game industry is both amazing and terrifying, and a call out to all my friends who are still in it: "Hi. Are you still there? I will parachute in hot dogs and other edible food that we can shove under the doors so you can eat." Shout out to my friends at Blizzard and Ubisoft and still at ZeniMax. It is the group with the most passionate people that I have ever worked with in my life. These are people who love what they're doing. They live for it, they breathe it, they play their own games, they eat their own dog food. They play other games for fun, and for work they do nothing but [inaudible] all day, and they know every single thing that's coming out and on the game schedule. They know every conference, every sub-conference, every blog, every major person on Twitter. These people live and die and breathe video games and they are terrific to work with, and if anybody ever has a chance to do it, I highly recommend the experience. But it was also an experience in getting things done extremely fast, understanding what is important and what is not, and learning how to weed out the egos from the job that has to be done, because code has to be delivered. It is an educational experience in how to handle the pressure from social media. It's one thing to see one piece of feedback or one tweet about your product, and it's another thing to see it on Twitterfall when it's coming in at 10,000 tweets in half an hour. Those are two very different experiences and I have had them both. When it comes to shipping, it is a very high-pressure environment. In a lot of places, the teams do pull together to get the job done. There's a lot of frustration, a lot of people lose their patience and leave, but there's also a feeling of satisfaction when it gets out the door. It is a lot of hours, a lot of hard work, a lot of dedication. You have to want to be there, and I wanted to be there. The reason I left is because there were just some internal factors. You've launched a AAA MMO, the parent studio is not happy with the launch, and they're interfering with everything now. It's sort of the standard ZeniMax story; it's a joke you'd hear from anybody else. Part of it was just some burnout at the end of that, and I just needed to go do something else for a little while. But hilariously, I just wanted to downshift, so I went back to startups because I was looking for something slightly slower. It was a terrific experience, though. A lot of the people I worked with there, I still talk to every day. It's an environment where you form friendships for life.
CORALINE: I think I have a good understanding of what you did at ZeniMax. You painted a picture for us. How does that compare to what you're doing at Upside now?
EMILY: It's funny, because I really started on the ground floor here and I got to think long and hard about our lessons learned, about what we did really well and what we didn't do so well, and I pulled a lot of the management direction that I felt worked really well and brought it here. ZeniMax was, for a very long time, a very move-fast-and-break-things environment. I'm a big fan of move fast and break things, so I brought that ethos here, along with the lessons about the pluses and minuses of agile versus Kanban, versus team size, team communication, overhead, scaling. A lot of those lessons I brought here. But I also tried to leave some of the grind on the floor, simply because I believe this is a marathon and not a sprint. At video game companies, most of the QA is outsourced contract work, so they don't really care if they burn people out on third shift. I have more of an eye towards the long term, towards growing people and growing their careers and retaining them and making people better than the way I found them. That's a plus and a minus in the video game industry; except for Ubisoft (I call it Ubi), a lot of places aren't really big on investment over the long term. I think it's very important when building a healthy culture to have investment on a personal level and on an organizational level. It's just a little bit more of the CTO thinking about how we really should be, as people and as a department and as a company, instead of just as a thing that needs to ship a thing out the door to the hordes that are waiting to play their game. That's the difference here from what I experienced before. Also, it's very important for people to be together on a mission, so we try to build mission here, which is also important. People need to come to work every day and feel like they're doing a thing that's important. That's Upside. That's a little bit of what we've done differently here than in previous gigs. Another thing that we've done here is that we've hired a lot more women. We have women engineers who are working on the main product. We have engineers of color who are working on the main product, which is super cool. DC is a really cool tech city because we have both historically black universities here, like Howard University, where there are [inaudible], and the University of Maryland, College Park (UMCP), plus Virginia Tech, and because the government has these requirements for diversity in hiring and diversity in advancement, we end up having a larger influx of diverse candidates who come to us. We're very, very lucky in the DC environment. We get to pick from people who otherwise may not have the same opportunity out in Silicon Valley, so it's really cool that we can start diversity initiatives here and we can bring in more people. We have people who have worked all over, from all different industries and all different backgrounds, and it just makes Upside, especially the tech side of Upside, more vibrant than a lot of places out there.
CORALINE: Emily, you're now a CTO. Was it at Upside that you made the transition from being more of a software engineer to more of a manager?
EMILY: I actually started making the transition from an engineer to a manager all the way back when I worked at Merchant Link, which is a transaction processor in Silver Spring, Maryland. I had my very first report and then he quit on me, and then I had my second report. That was exciting.
Then I moved to ZeniMax, and that's when I was heavily introduced to hiring teams. When I moved to Luminal in 2014, I was introduced to hiring and building departments. Then when I came to Upside, I started building the engineering department. I've been on this track since about 2009.
CORALINE: I feel like I'm on my second time around in software development. The first time, I took a management track and got up to the C-level, but I found that the further away I got from code, the less happy I was. Around 2012 or 2013, I actually went back to being an individual contributor. I found that the things I liked about management, in terms of mentoring and team building and having some decision-making authority, I could get as a principal engineer. Do you find that's your experience too? Is that what you like about management, and do you ever miss coding?
EMILY: Well, I'm going to take that back, the missing coding part, because my daughter came to me wanting to program a simulation of bacterial spread and infection, and I spent the next six hours reading about SIR algorithms and going through all the differential equations, then finding out how to implement them in Python, then reading a bunch of example code about how to implement them for zombie outbreaks, then using a whole bunch of MATLAB stuff to pull in various maps of the United States to simulate zombie outbreaks. Apparently, I do miss coding. I'm not going to say I don't. She asked me to look into a problem and I just went right down the rabbit hole. One of the things I'm especially good at, the real sweet spot for me and one of the things I really enjoyed, was the architect hat, which isn't really so much coding anymore but thinking about projects on a large scale, instead of just thinking about my one piece of code really deeply. If you're writing some kind of file system and you're trying to deal with some of the L1/L2 cache miss algorithms for your file system, it lets you go really deep into some operating system driver. That's the principal engineer right there, all the way down to the metal. My sweet spot, and I've noticed this ever since grad school so this is not some new thing, was the ability to think about all the pieces in the system, hold them in my head, have them all work, and then be able to follow the little traces in my head. It's like having your own little tracing system that runs in your head where nobody can see it, which isn't terribly useful to the team but was incredibly useful to me. To be able to work through some of those systems and then communicate back to product and UX and engineers and whoever else had a stake in it: here's how the system is, here are the implications, here's what we can do, here's what we can design. I still have some pretty decent whiteboard skills to this day. It's interesting, because I've taken that piece of skill set, being able to hold everything in my head at once, and I've transposed it onto people instead of just things. Sometimes I joke and claim that I just see everybody in engineering as a process list. I'm just walking through doing a 'ps -ef' and seeing what everybody's doing and what exactly they're working on, and I can see all the processes running at once and how they're scheduled on the processor that is Upside, which I think drives my direct reports nuts.
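For readers curious about the SIR models Emily mentions above: SIR stands for susceptible-infected-recovered, and the model is just three coupled differential equations. A minimal sketch of integrating them in Python, assuming scipy is available, might look like the following; the infection rate, recovery rate, and time span are invented illustration values, not anything discussed in the episode.

# Minimal SIR (susceptible-infected-recovered) sketch; parameters are illustrative only.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    # Classic SIR equations over population fractions.
    s, i, r = y
    ds = -beta * s * i             # susceptible people become infected
    di = beta * s * i - gamma * i  # infected people recover at rate gamma
    dr = gamma * i
    return ds, di, dr

beta, gamma = 0.3, 0.1        # made-up infection and recovery rates
y0 = (0.99, 0.01, 0.0)        # initial fractions: susceptible, infected, recovered
t = np.linspace(0, 160, 161)  # days

s, i, r = odeint(sir, y0, t, args=(beta, gamma)).T
print(f"peak infected fraction: {i.max():.2f} on day {t[i.argmax()]:.0f}")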
But I've largely taken the optimization program that I was running on systems and transposed it to people: how we deal with various things that are not working, how we rearrange systems, and how we grow systems in a way that lets us scale out the entire department and the entire business, and how it interfaces with things like customer service or marketing or our growth team, and how they all have their little inputs and outputs and how everything should work together. I think I do drive people a little bit nutty, because I will talk about it in a very dry engineering way: here's how this should work, here's how this process should communicate, here's my deck, and here's how the steps go. But it's worked really well for getting Upside off the mark and getting us from 10 people to 100. It's been a useful skill. I don't think I would go back to building code. I don't think I would go back to writing things other than fun evening projects, trying to get [inaudible] flow running or whatever I'm playing with today. I actually don't miss being an individual contributor. That was kind of long-winded, but that was my answer.
JAMEY: That was really interesting. I have not done the management track. I'm not sure that I ever want to. The thought of trying to move in that direction is scary to me, but it was really a fresh perspective to see how you thought about it, so thank you.
EMILY: You are welcome.
SAM: It is interesting to hear about the idea of treating systems of people, at least in part, the way that you might treat a system of code.
EMILY: I will tell you that people are way more difficult. Computers do exactly what you tell them to do, and people never do what you tell them to do. Really, never. You just want to walk around and hook a debugger up to them and try to figure out where the code is wrong. Leadership is super interesting. It's mostly putting out your vision and trying to build this North Star and saying, "We need to get to the North Star as a team, but I really just want the MVP of the thing, so please, just give me a little skateboard. We'll get to the North Star. Be patient, but just give me a little bit of this. Awesome. Now, ship it. Now!" It's fun in a different way.
CORALINE: Let's take a second to give a shout out to our newest $25-level sponsor, Risse Rigdon. I hope I'm not mispronouncing your name. Risse is on Twitter as @risuikari, and I want to remind people that as a Patreon supporter, you get access to our exclusive Slack community. I think we just passed 100 people, is that right, Sam? And we talk —
SAM: Two hundred, in fact.
CORALINE: Two hundred. Wow. We talk about things that come up after listening to the show, you have the ability to suggest guests, to suggest questions for our guests, and to just be around like-minded people who understand that software is about people and that we are all greater than code. If you'd like to join that community, please go to Patreon.com/GreaterThanCode. Pledge at any level and we'll get you in.
JAMEY: You help make the show happen, so thanks to all 200 of you.
CORALINE: Emily, I'm curious: at the beginning, you talked about the importance of a PhD if you want to go into something like data science. We really haven't touched on data science very much. Have you intersected with data science at any point in your career? In my personal experience, I see a lot more women in data science than in software engineering. I'm curious if that's been your experience as well.
EMILY: Yeah. This is interesting because it's sort of a two-pronged question.
My first interaction with data science was actually at ZeniMax. We stood up a giant data science system basically from scratch, hired a bunch of data scientists, and discovered really horrifying facts about our game. We built fun simulations like the death map, where you could bring up the entire map of the MMO and just see where everybody was dying. Those were really cool. That was one of the best applications. But you could also watch heat maps of what quests people were playing and what content was doing really well and what content wasn't doing so well. That was my introduction there. The entire premise of Upside is actually built on data science, and we recently really put down the hammer on building a data science department. I have noticed there's a much larger proportion of women in data science than there is in straight-up software engineering. I suspect it's because we're looking for people from more of the hard sciences, where there's been a shift that's been moving since the late 80s. I know that in economics, it's really bad. I read that flame war on Twitter. There's another iteration of it going on in economics Twitter right now about how there are no women in economics. I've seen that one. But I do know that in biology and chemistry and applied mathematics, and especially astronomy, which was a big one for a long time because it was almost entirely male-dominated, there's been a large shift from something like 90/10 to something closer to maybe a 60/40 representation. When we put out a call for data scientists, we get a really good gender mix between the men and women who come in and apply. We need to go upstairs and count all of the data scientists who just appeared one day here. It was remarkable, but we're running about 50/50 in our gender diversity here at Upside, which is super cool. The majority of them do have PhDs. They don't always have PhDs in applied mathematics. They come from a diverse background. I believe it's because data science largely came up through the bio track, through genomics and bioinformatics, which goes all the way back to my work in hospital IT, where everything was a little bit closer to gender parity. There's been a lot of movement toward gender parity in biology and the medical field for a really long time. It was very, very bad in the 80s, but it has moved in a much more positive direction there. In the genomics field, it's pretty much a 50/50 split, and you can reach in and get people who get data science there, and a lot of data scientists say that they don't really care about the nature of the problem. They just really care about the data and doing actual science on the data they collect. It's a little bit different way of thinking about things. It's a huge, growing field and it's getting people from all over. It's not just this one skinny pipeline of software engineers going through college and coming out taking software [inaudible]. That's my theory. I would need to look at the data to understand if that's really correct or not, but the theory seems to be bearing itself out pretty well.
CORALINE: One of the things I worry about with any job that has pretty high college qualifications is that you're self-selecting for people who have a lot of privilege. You're self-selecting for people who have the ability to afford to go to a master's program or a PhD program. I worry that that means we're selecting for upper-middle-class white women, as opposed to focusing on other dimensions of diversity. Has that happened in your experience too?
EMILY: I think that is probably true.
I went to Michigan, and one thing that Michigan Engineering (which doesn't overlap with data science, but that's my background) was very proud of was its scholarship program for underprivileged kids coming out of Detroit. I can't speak for any other university or any other university's program, and props to my homies at Michigan, but there was a lot of outreach and a lot of work to bring in students who came from Detroit in particular, but also from Flint and the sticks, bringing them in and getting them into programs where they would shine. I know that Harvard is hideously unbalanced. Thirty percent of the freshman class is legacy admissions and other bits of nonsense. My husband went to Maryland, College Park. He was actually in chemistry, not in computers, and he was there for a bunch of years. They were a very diverse department too and were very big about reaching into DC and serving underprivileged students and getting them into graduate programs. My long-winded answer, I guess, is that I know the public universities are trying to do something about that problem, and I can't speak to the private universities. I can't speak to the Harvards or the Stanfords of the world.
SAM: I do find it interesting that you've mentioned that you're drawing a lot of talent from the other STEM fields, especially bio. I have the impression that there are not as many jobs in those other fields and they don't pay as well, so I wonder if tech is managing to attract a bunch of those people just because of all the money that's currently here.
EMILY: I have a different theory, which is that they all went to fintech in New York, and fintech isn't all that hot anymore, so now they're all moving into the next field.
SAM: Financial tech, you mean?
EMILY: Yeah. For a while, especially before the Great Recession, fintech was draining data scientists out of the universe.
SAM: Really?
EMILY: Yes, there was a huge move. PhD physicists especially were the biggest group, but physicists and mathematicians were being pulled very heavily out of academia and out of bio and moved into fintech, where there was a lot of money to be made very fast. The problems were interesting, because if you're hiding vast amounts of fraud in the financial system, I guess it's true that the problems are interesting. Then came the crash and regulation. I saw a little bit in The Economist recently talking about the return of some of the synthetic CDOs in a beta phase. I'm sure they'll all be pulled back to New York sooner or later, but the crash came and then they all went to a whole bunch of different industries, mostly to tech.
SAM: So there is a diaspora, but not from where I thought it was. That's really cool. Thank you.
EMILY: Yeah.
JAMEY: When you were talking about the demographics of the data scientists, you mentioned that you would have to get the data on that, which in addition to being deliciously meta to me, made me start to think about the ways that data describes our lives. A lot of cool stuff has come out of that, but I think people have also been starting to realize that it's scary in some ways, that data contains so much information about us. I know this is kind of a big question, but I wonder what your opinion is about where that line is.
EMILY: There's a very useful book out called 'Weapons of Math Destruction' by Cathy O'Neil, who is also a data scientist who originally was teaching, then went to fintech, and then left.
At the end of the day, we're just talking about algorithms that take training data and do computation on it, largely some fancy calculus, some matrix transformations, and some curve fitting, and then spit out weights. That's largely what data science does. It's an optimization problem. The problem comes from the models. We build models and we identify features on them that we actually want to test against and optimize for (that's some of the data science language). We try to build a model and run a problem through it. An interesting problem that you could try to solve is: I would like to predict what fashions are going to come out on the New York runway based on the Paris fashion show. There's an interesting blog article where somebody did exactly that, took all of the pictures and fed them into a model, and it started generating possible fashions and did a bunch of categorization and clustering. It is interesting because you can see which designers have what kinds of styles, what kinds of styles are popular, and how things are moving in the style world. That is a very neutral model. It doesn't really help people or hurt people, unless you're really into the fashion world and spending lots of money on it; then it might suck some money out of your wallet. But models can be used in a variety of ways, and it's all based, more or less, on the training data they're given. If the training data a model starts with is impure or biased or doesn't have the right information framing it, and it's just pushed through to the other side, it can definitely be a source of harm. In Cathy O'Neil's book, her opening chapter talks about the DC school system and how they were trying to filter out the good teachers from the bad teachers. The training data going into the model, which was built by some company in Boston that nobody had any say over, was basically the change in students' test scores year after year. Some teachers figured this out and would cheat, or game the system in some way, so it would show students having a massive gain in their test scores during their year with that teacher. Then they would hand them off to a teacher the next year, and that teacher, thinking she was getting students who had performed at a certain level, would find they came into her classroom two or three levels below that. Then she would get hit by the model saying, "Your scores have really dropped," because the teacher the year before had cheated, and some teachers who were excellent teachers lost their jobs because they were given some random score that came out of this model, which clearly had some curve overfitting problems associated with it. Those teachers had recommendations and they found other jobs, but it definitely damaged their reputations in the process, and not just them but also the school system. We see that a lot. One of the big places these models are used is in advertising dollars. That's probably the biggest place we know of, and we're seeing it right now with all the brouhaha over Facebook and the Russians spending money on it and highly optimizing it.
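To make Emily's description a bit more concrete: "curve fitting that spits out weights" can be as simple as a polynomial fit, and the "curve overfitting problems" she attributes to the teacher-scoring model show up even in a toy example. The sketch below is purely illustrative, with synthetic data that has nothing to do with any real scoring system; an overly flexible model matches its small, noisy training set closely but gives unreliable answers just outside of it.

# Toy curve-fitting sketch: the "training data" is a synthetic noisy line.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 12)
y = 2.0 * x + 1.0 + rng.normal(scale=0.2, size=x.size)

# Fitting a model means computing weights (coefficients) from the training data.
line_weights = np.polyfit(x, y, deg=1)    # a reasonable model: a straight line
wiggle_weights = np.polyfit(x, y, deg=9)  # an overly flexible model on 12 points

x_new = 1.2  # a point just outside the range the model was trained on
print("linear model predicts:  ", np.polyval(line_weights, x_new))
print("degree-9 model predicts:", np.polyval(wiggle_weights, x_new))
# The flexible fit tracks the noise in the training data, so its answer out
# here can be wildly off, which is the overfitting problem in miniature.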
With one of those ad platforms, there's a neural net model sitting in the back, and you say, "I need you to optimize for people who live in western Michigan who have labeled themselves as Republican and Christian," because it picks up these profiles of people and the algorithm knows how to get ads directly to their very targeted eyeballs. It's cheap, because you're maybe only putting out a thousand ads, instead of a television ad, where you pay for spots that reach all of those eyeballs at once, whatever that costs. There are some strong negatives about data science but, on the other hand, if you start talking about, say, the genomics field, data science is extremely powerful when trying to do the combinatorial mathematics for finding new drugs. Otherwise, we would have to run simulations on thousands of these drugs, which is very expensive and very slow. We've started dropping the cost of drugs over time because the cost of research is going down, because we can have AIs largely get rid of all the negatives before we even take a look at them. Data science is a big plus and minus. We're building a giant data science department here at Upside because we have an interesting problem, which is travel data. Travel data turns out to be huge. We were getting three terabytes of data a day and we had to sift through all of that to be able to find flights and hotels for people. The big question is how we start building automated shoppers and sorting around your preferences: how do we push all of the poor flights, the ones with two connections, no Wi-Fi, or an uncomfortable plane, to the bottom, and bring all the flights and hotels that are quality up to the top? Pricing turns out to be a huge data science problem too, where there's a combination of how you set margins versus how you find quality versus how you set preferences versus how you get other stakeholders' needs into the data, to be able to bring up something that's of high quality for the customer. In that sense, we give people quality, reduce the amount of time spent searching, and build trust, so in that way the model is a very positive model, because we're basically an automated shopper for that person's needs.
JAMEY: That sounds like a really interesting problem. When you were describing it, it seemed almost like building a program that can think the way a human might think about what a human's preferences would be.
EMILY: I wouldn't try to humanize it like that. It's a machine that knows, from the data we push through it, what is good and what is not good, based on a whole lot of data that it has seen in the past. Based on knowing what was good in the past, it can make a suggestion about what it thinks will be good in the future. That's the thing to remember about data science and all these models. I know Facebook has them, Twitter has them, Google has them, and everybody's talking about how cool this AI is and whether it's good or evil. The thing to remember is that these AI models are not sentient. They only know what happened in the past, because you fed the past to them, and they can only make suggestions about the future if no conditions ever change. That's the thing to remember about these models. The moment conditions change in some way, the model is invalid and the answers it gives are now wrong.
It's not like a human, who can take in new ideas, sort through them, synthesize them, and then from that have a new answer. It needs to be told something has changed, and if something has changed drastically, it needs to be rewritten from scratch and rebuilt. That's why it's so good for something like spam, because spam is the same kinds of patterns over and over again. With an effectively infinite amount of training data, you just keep pushing it through: "This is spam. This is not spam." It gets better over time because it has a bigger and bigger data set.
CORALINE: That reminds me. I need to spend my $25 Amazon gift certificate, get some new windows and remodel my bathroom.
EMILY: Amazon's got some great models where, if you're buying this, it suggests here are the other five things that you might like. It does a terrific job.
CORALINE: I've seen that fail pretty gloriously. I actually took a screen capture of this because it was so good. They recommended a septum to me because I bought the Zombie Survival Guide.
EMILY: Yes. That's amazing. It thinks she'd love that. Based on your past purchases and past searches, it thinks that's perfect for you, because you bought a Zombie Survival Guide.
JAMEY: I literally just bought Weapons of Math Destruction on Amazon while we were talking, so I just went back to Amazon to see what it's recommending me. It's still just like, "You should probably buy Star Wars Legos." And I'm like, "Yeah, that sounds like me."
EMILY: Yeah, well, why wouldn't you buy Star Wars Legos? Is there a Porg Lego yet? Does anybody know if they put out the Porg yet, as a Lego?
JAMEY: I think they have. The Last Jedi had just come out.
EMILY: I have to get the Porg chewy for my dog. I have to get one.
JAMEY: That's why I was looking at them, because the new round of Lego sets came out already for the new movie and I was like, "Yes!"
EMILY: Yeah. The algorithm absolutely knows what you want.
JAMEY: It does.
EMILY: Maybe it is sentient, but only at Amazon.
SAM: And then, of course, there are the cases where you buy something that you only buy once every five years, like a refrigerator, and then Amazon says, "Here are five other refrigerators you might be interested in."
EMILY: Well, you might want to buy five other refrigerators. Have you considered it?
JAMEY: You're a refrigerator enthusiast now.
SAM: Well, I will now.
EMILY: I'll think about it. I spend a lot of time searching for very strange books on Amazon, so it's totally confused about what I want.
SAM: You've talked about the importance of feeding good data to your model, which is something that we've talked about indirectly a few times on this show. I wonder if that's something where having people with a background in doing actual science is helpful. Does that training include skills on how to analyze the quality of your data and find where the biases in it are?
EMILY: Absolutely. That is a big yes, especially coming from the standpoint of people from economics. I know they try to treat everybody as a rational actor and humans are not rational actors, but it does give them the training to look for biases, to keep from reflecting their own opinions in the data, and to allow the data to talk for itself to get down to real answers, which, from building a startup into a business, I have found to be unbelievably powerful.
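A few turns back, Emily points to spam filtering as the case these models handle well: the same patterns recur, and the filter improves as the labeled data set grows. Here is a minimal sketch of that idea, assuming scikit-learn is installed, with a handful of invented training messages; it is an illustration of the general approach, not any particular production filter.

# Minimal bag-of-words spam filter sketch; the training messages are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_messages = [
    "win a free prize now", "cheap meds click here",          # spam
    "lunch at noon tomorrow?", "here are the meeting notes",  # not spam
]
train_labels = ["spam", "spam", "ham", "ham"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_messages, train_labels)

print(model.predict(["free prize click now", "notes from the meeting"]))
# Refitting as more labeled mail arrives is what makes the filter better over
# time; like any of these models, it only knows patterns it has already seen.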
SAM: I'm curious if that extends further, to identifying bias in the data itself, the canonical example being something like predictive policing, where you're feeding it data that's based on biased arrest reports and so on.
EMILY: There's a chapter in Weapons of Math Destruction where she talks about this exact issue. I highly recommend the book if you're interested in where data science is falling down. There are several companies out there, most of them San Francisco startups, that are trying to take a lot of policing data and push it through models to predict clusters of possible crime, so that police cruisers can be dispatched with higher efficiency. But because the model only knows about the past and doesn't know about the future, and it knows that poor communities have a larger recorded occurrence of low-value crime, somebody did graffiti or somebody was in the back of an alleyway or whatever, it makes it look like there's a larger cluster of actual crime right there. But it's never going to pick out someone committing fraud, because that's a lower-incidence crime in the data. You're not going to dispatch cruisers to go around lower Manhattan looking for instances of fraud, although we know it's all there. It's going to dispatch them to the Bronx, because there's a higher propensity of lower-value crime there and that just happens to be what's in the data. That can be a reflection of a human being's biases, and one thing we must be careful about is that we are not reflecting human biases in the data when we feed it into the model, because the model is just going to amplify those biases. Always keep in mind that the model only knows about the past. We know the model can inadvertently have racist overtones and tendencies. Even though it's just a machine doing mathematical computation, it can look like it has racist tendencies if we happen to put that data through the model, if that's what we're looking for. It's a curve fitting problem. It's a matter of putting in the correct data from the beginning to train the model, so that it does not have biases and what pops out of it is something that approximates the truth. That's where we really get in trouble with data science. Everything in the universe is a logistics problem, and we want to be able to dispatch and deploy the resources we have, because we don't have many of them, on the problems that are the number one highest priority on the table. If we're putting our biases into what we think is the number one highest priority thing, and it's not really, but that's what we put in the model, and it tells us, "You were right. That is really the highest priority. You should send all your police cars down to that neighborhood because that's really where all the crime is," then it becomes a self-perpetuating process, and it's something we need to be very cognizant of as we build all of these systems out.
SAM: What training do we have and what training do we need that would help us identify that?
EMILY: That's a really good question. I think part of it is simply having an understanding of what the problem is from an unbiased point of view, and it's hard to get to that place.
SAM: So you're basically saying we're screwed.
EMILY: I don't think we're screwed. I think that we need to understand that these are just machines and they're not all-knowing, certainly not yet, and we need to be cognizant that we may be pushing our own human foibles into them.
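Emily's warning that the model "is just going to amplify those biases" is easy to demonstrate in miniature. In the deliberately simplified simulation below, with entirely invented numbers, two neighborhoods have the same underlying crime rate, but one starts with more recorded incidents because it was over-policed; dispatching patrols in proportion to the historical record then feeds new observations back in, and the gap between the neighborhoods keeps widening.

# Simplified feedback-loop sketch: identical true crime rates, biased records.
import random

random.seed(1)
true_rate = 0.3                # the same in both neighborhoods
recorded = {"A": 30, "B": 10}  # biased starting data: A was over-policed

for week in range(52):
    total = sum(recorded.values())
    # Dispatch about 10 patrols per week, proportional to prior recorded crime.
    patrols = {n: round(10 * recorded[n] / total) for n in recorded}
    for n, p in patrols.items():
        # More patrols means more of that neighborhood's (equal) crime gets
        # observed and recorded, which biases next week's dispatch even more.
        recorded[n] += sum(random.random() < true_rate for _ in range(p))

print(recorded)  # neighborhood A's recorded count runs away from B's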
SAM: Yeah, my joke was basically saying that if we need to be able to look at something from an unbiased perspective and we're all humans here, that's kind of a bootstrapping problem.
EMILY: Right.
CORALINE: We're at the point where we should start wrapping up. Is there anything else, Emily, you wanted to touch on before we do that?
EMILY: The only thing that I want to bring up really briefly goes back to the very beginning, the women in engineering topic. One thing that has come out of all the women-in-tech conversation right now is the importance of mentorship and internships for women who are very young and early in their careers, so they can get the encouragement they need. Even from my side, I think it's someplace we're falling down across the entire industry right now, starting from high school, maybe even starting in middle school. I see it with my daughter. She's in seventh grade. Her science teacher is trying like crazy to encourage her, and she lives in a tech home so she gets encouragement here too, but I see it all along the pipeline. Just to wrap up, as we're talking about women in engineering and data science and getting people more involved in the industry as a whole, I think it should start at the beginning. Instead of thinking, "They're in college. They've just come out. How can we keep them here?" I think we need to talk about how we keep them in 7th-grade computer classes.
JAMEY: Coraline asked the question about how you think your career would have been different if you had taken that bad advice from somebody, and I wanted to ask the opposite question: how do you think your career would have been different if you had had more women on your teams and women peers from earlier on in your career?
EMILY: How would it have been different if I had more women peers? It's an interesting question. I'm trying to imagine what that would be like. I think I would have gone for my PhD. I think I wouldn't have stopped at the master's. I think I would have gone all the way, and that would have been a significant difference. I probably would have pushed into management a lot earlier as well. There was never any encouragement on my side. It was just me going and getting things done. I think I probably would have moved faster and harder in my career than I did, and I probably would have taken a few more of the risky opportunities that were on the table.
JAMEY: Emily, at the end of our shows, we like to do a little section where everybody gets to reflect on one thing that really struck them and made them think from what we talked about in today's episode.
CORALINE: I have been thinking a lot about bias. We talked about bias in terms of machine learning, but there's another kind of bias that I was thinking about, and that is my experience with being self-taught, with being a college dropout. I think I am biased toward hiring people with a similar background story to mine, people who have taken the initiative on their own to learn what they need to learn to break into our field. I know I definitely have a bias towards people with that story, probably because it resonates with me in terms of my background. I get the sense, and Emily, I hope this isn't unfair, that you have a bias toward people with degrees, because that's how you came up and came into the field.
I think we all think about our biases in terms of their negative impact on things like diversity and getting more people into tech from different ethnicities or different genders or different socioeconomic backgrounds. But I wanted to think more about other hiring biases, like which resumes get to the top of the stack. Are we favoring people with higher degrees, or are we favoring people with no degrees? That's something I want to think about a little more.
SAM: Following on to that: we've talked about bias a fair bit, and I wanted to mention that it's well worth your time to do a little bit of research into the various cognitive biases that are inherent in us as humans, because knowing about a cognitive bias doesn't necessarily mean that you can vanquish it once and for all. It's always going to be there, but not knowing about a bias means you're going to fall into that trap every single time. Even just going and looking at the Wikipedia list of cognitive biases is super educational, and you'll learn something about yourself right away.
JAMEY: For my reflection, I am thinking more about how computers aren't people and what we said about that, which sounds so obvious when I word it like that, but I think that people in general have a real tendency to anthropomorphize things. I know that I certainly do, but I think a lot of people do, and that's how we end up with Siri and Alexa and these computers that we speak to like they're people. I was really interested in what Emily was saying about how computers can't think like people, because they can't take in new information and change their minds the way people can. A lot of the technology that is cutting edge these days is kind of scary to me. I'm a little bit of a skeptic in that way, but that really hit the nail on the head about what's scary about it to me. I was really happy to have heard that perspective, so I can now think about what's really going on and how it makes me feel about tech, so thank you.
EMILY: I really enjoyed doing the podcast. This was a fantastic experience and thank you all so much for inviting me on your show. I really enjoyed it. My reflection is that I really do enjoy teaching. It's interesting to be able to talk and dig into a topic with interesting people like yourselves and reflect on computers and people and society and what it all means.
CORALINE: Great. Thank you very much for being here. We've really enjoyed this conversation and I think our listeners will enjoy it as well, and I'm looking forward to seeing how people react in our Slack community to the podcast and to some of the ideas we brought up today. We've definitely covered a lot of ground. Thank you very much, and to all of our listeners, we love you and we will talk to you very soon.