CHANTÉ: Hey, everyone. Chanté Thurmond here. We're back with another episode of Greater Than Code, episode 203. And I'm joined today by my friend, Jacob Stoebel. Hey, Jacob. JACOB: Hello. I'm pleased to announce this week's guest, Matt Zhou. Matt is a senior data engineer at VillageMD where he focuses on data platform stewardship, fostering better engineering culture, and standardizing machine learning processes. He graduated from Northwestern University with a degree in anthropology and has several years of experience in statistical impact assessment in healthcare spaces. Afterwards, he completed a Master’s in Public Health from Columbia University specializing in healthcare informatics. Matt previously worked at the New York Times as a data engineer on their platform team. He likes to spend his time writing about algorithmic accountability, fostering bunnies, and getting involved in local organizing communities in Chicago. Matt, welcome to the show. MATTHEW: Thank you. Thanks for having me on. JACOB: I'll say, as a side note, I feel like we've had many anthropology majors on this show, and that is a very interesting commonality in the guests, including one of our panelists as well. So we start with our first question, which we ask everybody: what is your superpower, and how did you acquire it? MATTHEW: I think my superpower would probably be letting things go fairly easily. I just recently moved apartments and went through this whole process of figuring out what to keep, what to donate, what to throw away. And there is this whole emotional thing of, do I plan to keep this painting stuff, and am I going to paint watercolors in the future? And I think just being able to let things go really easily helps keep you open to new things, helps you unlearn things also very easily. You're not invested in opinions or stances. And I think it just makes you a much better conversationalist. CHANTÉ: I love that. That's really interesting.
And it's definitely reminding me of kind of the philosophy that I follow of non-attachment although I am attached to lots of things, and outcomes, and people, and material items, which is why I try to actively practice this. Where did you learn that? Where's that coming from for you, Matt? MATTHEW: I'm a first-generation kind of generation 1.5 immigrant. And I think growing up -- I was born in Australia and then we moved countries to the U.S. But there's a lot of jumping around different countries and different identities. So I think being able to kind of shift your idea of where home is, being able to take chances and go into new spaces, I think that was something that my parents were very unafraid to do. So I think I kind of just grew up seeing that, and I was very excited to travel abroad, take new experiences, move to new cities, transition from anthropology to tech, these paradigm shifts. CHANTÉ: Ooh. You just gave me lots of ideas. Although it was a simple answer, it was kind of meaty. And my mind wants to immediately know more about your backstory. Do you mind if we jump right there? And I'd love to hear where your family is originally from. I'm not sure if your parents are from the same place or not, but let's talk about that origin story a little bit. MATTHEW: Oh, yeah. For sure. I love talking about where my parents came from. So this is pretty timely. I've done a lot of audio interviews with my parents over the past summer just learning more about where they grew up and kind of how they came of age and immigrated and took all these risks. So they were both born in China in the 1950s so kind of during the rise of the Communist Party. They grew up in pretty rural areas where their parents didn't have much education. As teenagers, they grew up during the Cultural Revolution. And I think education for them was a very intermittent start and stop sort of thing. 
So, that made them value the importance of continued studies, of polishing your skills, of being able to be self-sufficient. Once they graduated from college, once the schools were back in session after the Cultural Revolution, they had a lot of plans to just get out of the country. I think around that time, it was kind of seen that the Western countries were the places where the opportunities were, a very different world from today. But in those days, whether it was the currency or standard of living or just future career opportunities, it was a very complex path to try and navigate in China. So they really plumbed their family networks and kind of friends of friends for any funds that they could scrounge up to try and get the plane ticket and immigration visa fee to be able to get out to Australia, which was seen as an easier route than going straight through to America. And when you're immigrating, I think you have to be very tactical about how you're crafting your story and the reasons why you're trying to enter another country. And so for them, it was English classes and being a student, and trying to demonstrate this previous history of academic excellence and then be able to leverage that into an Australian university Master's or PhD degree. Long story short, they get the funds. Dad goes first. He steps off the plane. He's got $20 and his suitcase of clothes. And then my mom was telling me the story of how he steps off the plane and then just goes right to Chinatown and then starts washing dishes to get his hotel fee for the night. And that's kind of the tone of the next few decades that they spent in different countries, being able to really get into the hard work, not being afraid to roll up their sleeves and just do whatever it takes to figure out some kind of stability and establish a home. So he was able to make enough money doing odd jobs. And my mom was able to come over with my sister.
And then from there, he enrolled in a PhD program and successfully graduated in electrical engineering, found a job with Motorola, and then moved to the U.S. So a lot of this journey was very much based off of where the pivoting opportunities were, which I thought was interesting just in terms of how people find their ways to this country. CHANTÉ: Wow. I got goosebumps listening to that. So I love talking about these stories, and it's so interesting. First of all, I've made a note that you are doing this great idea of having these audio interviews with your parents. I think that's just brilliant. I wish I had done that with my grandparents who immigrated to this country too, but I didn't get a chance to before they passed away. So I think it's going to be an amazing relic to have, to maybe pass on to your own family if you ever decide to start your own immediate family and have children. But either way, it's a great legacy story. And the interesting thing that you mention here is that they went to Australia. And I just picture myself as a person having to go to another country, let's say I had to go to China or something, learn another language and just be completely new to this way of being. I always just admire people's bravery and their ability to actually follow through on the desire and the dream to leave their homeland and what that represents. And now more than ever, I think it's so cool to just make sure we're examining that. One of the things that I often think about now as an older, kind of wiser woman is just what my grandparents were trying to say, either to themselves, to their immediate family, or to me as a grandchild, when they came to this country, when they left their homeland to come here. What does that represent? So I just love that you are taking time to get to know that story. Thank you for sharing it.
So, Matt, one of the things that I know about you personally, because Jacob and I have the great privilege of working with you at VillageMD and I've gotten to know you in the last few months, is that diversity, equity, and inclusion are super important to you. And based on the conversations we've had and just getting to know you, I think that's informed by your identity and the family that you come from. So I would love to know, just in your own words, how your journey has been shaped by your familial story, from anthropology into tech, and specifically the things you're interested in as it relates to technology and data. MATTHEW: Anthropology, being the first really critical analysis framework that I picked up in undergrad, was pivotal to how I've been solving technology problems and also just trying to think of career direction in a world that's melding technology and social issues very intimately. I think cultural anthropology really emphasizes centering other people inside of their own context: knowing what language they're speaking, what values are foregrounded in their minds, and kind of giving them the benefit of the doubt that their beliefs are serving some sort of function, whether it's social or mythological or whatever. And I was doing this ethnographic work with different international development projects around the world during my undergraduate years. Having that experience led me to think more about how you decide whether to intervene in situations and what the right time to intervene is, and kind of the ethics of humanitarian aid, the ethics of metrics and studies, and how you go about that entire process. I think that's immediately relevant to a lot of my work in technology because there's this belief that technology is this objective, kind of virtuous thing in and of itself, and that most people see inequities in the world and they just really itch: How do we solve this? How do we hack this? What is the disruption here?
And I think that other industries and other frameworks that also deal with kind of humanitarian aid have really taken a step back to recognize: what is the opportunity cost of intervention? And what do we lose by doing something instead of doing nothing, especially when it comes to protecting local cultures, and traditions, and indigenous peoples, or local economies and things like that? So I think that early experience in anthropology really helped me think more about the value of doing something and how to ask questions well. Sadly, the path to careers in anthropology is very ambiguous, and I ended up doing a Master's degree to kind of clarify that. I ended up focusing on healthcare informatics. And I think from there, it was just immediately impactful for me to see how terrible it was to navigate the healthcare system in the U.S., and just what is blocking things from getting better here, and who are the populations that are being under-reached and underserved and left out of the equation. So that's why I really loved delving into healthcare in recent years. JACOB: One of the ongoing themes, I think, in this show is that we've spoken to a lot of people from various social science backgrounds and how they think their work as technologists is informed by their background in social science. And I'm just curious if you see any connection there in the work you're doing right now. MATTHEW: There's the nitty-gritty day-to-day of being a better human and community member when doing technology work: being more empathetic and compassionate when reviewing other people's code, helping people plan out their career maps and trajectories, or just being kind during times where mistakes were made, more of the organizational practice that I think social science helped me understand a little better. But beyond that, I think there are also just larger impacts of the work itself.
We're making decisions and encoding a lot of opinions into the technology products that we're building or the algorithms that we're designing. And all of those end up automating our own biases into what a lot of other people touch or utilize in their day-to-day lives, particularly for things that are as sensitive as healthcare, how people understand their own wellness, or how they feel comfortable seeking out health services and how they can trust medical professionals and things like that. So I definitely think having a social sciences background boosts my understanding of user design, places more importance on feedback, and on reaching out to people who are ultimately impacted by the products that I build. And also being very aware of my own notions of what fairness is or what biases I might be actively trying to address. CHANTÉ: I guess one of the questions I have when we're thinking about technology, and being compassionate, and having ethics is, whose job is it to ask those questions? Is it the technologist's job? Is it folks who are on the product team? Is it folks at the helm of the organization? Is it people outside who interact with and use your products and services? Whose job is it to raise those questions of ethics? MATTHEW: I think the answer to this question is tough just because no one person really has decision-making ability over this entire concept for an organization. Every person is contributing their own little decision-making power, and then that accumulates into this larger scope. So the product designer might be held to a timeline that forces them to leave out auditing stuff. Or the people developing a machine learning model might not have access to the resources of having an ethicist come in and give that kind of opinion.
I think that when you do this stuff in practice in a private industry, there's this intersection of business momentum and then also organizational assets that really determines your ability to do technology ethically, and then also just your internal culture too. So I think it probably starts from having leadership that really believes in the importance of this, whether it's because it's inherently right to be ethical and fair or because it's a compliance thing from some government entity. And then that disseminates top-down: having and investing in an auditing group and having that expertise in-house, or hiring consultants to do that kind of audit on your organization for data and technology. And then, with the people that you're hiring and the people that are building this stuff, trusting that they have good intentions but also putting in the necessary process and guardrails for auditing from the very start of design. And I think it's tough because what we're talking about are systems here rather than individual people making bad decisions. And right now, these systems are being built out in real-time as the products are potentially contributing harm. So it feels like there's this sense of: stuff is bad right now, and we need to fix it right away, so we're just going to try whatever we can. But there's not really a sense of, will this extra intervention make things even worse? CHANTÉ: Yeah. The systems part is the trickiest thing because we're involved in so many systems, you know what I mean? And adding a new person to the system changes the dynamic, or adding this new product feature or a new product suite, for example, just makes the system more complex sometimes. So I love that you kind of make that distinction there because you do need to think about the systems. And I would agree with you that there's not one person that is responsible.
My wish for technology right now in this year of 2020 where we're recognizing we have so many racial inequities, so many issues and instances of real exclusion as a result of somebody not making ethical choices at the beginning, my prayer for tech is that we get more people like you who are cognizant and aware and willing to really step up and use their voice and their reason as they're building new algorithms, as they're informing or shaping up the work that is so important for our product and ultimately the end-user. And I think that sometimes we make these products for that user without thinking about the implications of did I do the right thing for our company? Did I do the right thing in the long run? Am I going to be proud of that? And hopefully in service of the consumer. Now, more than ever, consumers are really stepping up and saying, "I demand that you do this better. I demand that I see an improvement in this feature." MATTHEW: Yeah. I think especially this year and recent history there's just been a growing awareness and disillusionment with technology and this techlash that's been rippling through a lot of different parts of the world. I think that people are more interested than ever in learning more about how their data gets used. And it's astonishing to me how tech literacy has propagated throughout mainstream media news, how people are grappling with algorithms and echo chambers. And they're forming their own opinions about these things but more exposure at least helps them familiarize themselves with the further education that might be helpful here in making those consumer purchasing decisions or collective organization politically for being able to introduce regulation on these topics. And I think within the industry too, a lot of engineers, and designers, and product managers are now thinking, I really hope I haven't worked on something that ultimately impacts groups that I didn't intend to hurt. 
A lot of these issues are unintentional side effects from optimizing for some metric or for performing some intervention that was supposed to help some other groups. As we have more people who are growing aware of these things, I think the next decade is going to be really interesting in terms of what legislation gets pushed through with the European Union being a really great leader in privacy and fairness issues. CHANTÉ: I couldn't agree more. Jacob, what are your thoughts here? I mean, you're on the tech team and you're a technologist. I'd love for you to weigh in and ask a question. JACOB: I'm curious. You mentioned auditing a few times earlier. And I'm curious what we mean by that in this context. MATTHEW: I've been very fascinated with the idea of an algorithmic audit. I think a lot of organizations in the past couple of years have been pushing this forward as a way of building trust and accountability around how private industry is using algorithms and also how government organizations are using algorithms whether it's predictive policing or healthcare allocation, anything. And I think the main difference comes down to whether audits are conducted internally within corporate organizations or externally by third-party regulators or by the government themselves like the FTC or something. And I think that it's more of a question of whose responsibility it is to make decisions and kind of regulate the use of these algorithms whether private organizations who have more access to their internal data, a little bit more tractability and how they're able to structurally change things quickly. Maybe they're the ones who should be coordinating some kind of report being transparent around how they're using data and algorithms. Or from the other side, maybe you think external audits are the way to go. 
And the compliance tool is really necessary in making sure that companies adhere to a standard and have some kind of pressure to be able to reveal things that might not necessarily be in their business interest. So it's kind of where you believe the different levers are effective in making this issue actually improve in the future. JACOB: And how you can identify who the affected parties are in the first place is something I think about. There's an example I think of a lot, which is -- so you know the navigation app Waze. It helps you figure out what's the best route to get somewhere. And it's based on sort of real-time data that other users are sending. So there's one weird example where, for whatever reason, the Waze algorithm was regularly telling drivers in Los Angeles to take a very specific shortcut: get off the freeway and get on this one particular residential street. And suddenly, the residents of that street found that this regular residential street was essentially becoming an interstate basically overnight, bumper-to-bumper traffic during rush hour when normally it would just be a regular street. And I just think a lot about who would have made that connection in the first place? Because would the developers who made that algorithm, working probably in Silicon Valley, know? Probably not. Would the residents? How would they know? Because they just know there's always cars driving there. And I think a lot about how these really complex algorithms can have unintended consequences. How are the impacted communities identified in the first place? And then once they are, how do they have a voice in giving feedback to the developer to say, "Hey, there's this impact here that you probably didn't realize." CHANTÉ: Yeah, I saw that one too. And I loved that. That was a great, brilliant example, Jacob. I think that's right.
And the other thought when I read that article initially, the thought that came to my mind was: okay, the other way to think about this is, if the developers had a bias of, let's avoid a particular neighborhood because it's predominantly Black or Brown or Asian or it's gay, it's whatever, that also comes up and shows up in the code. And I wonder if we've done an audit on that, which neighborhoods you should be avoiding. I mean, I'm sure there's some that are within reason because of safety, but I'm curious to know, for those who are building apps like this for public consumption, if there's bias baked in, and the answer's probably yes. MATTHEW: Yeah. This Waze example is a really great kind of indicator of just how difficult it is to pass direct feedback between the users, or those impacted by a product, and the designers of that product. Because in this case, the community members would need to first figure out that it's Waze that's directing traffic through their neighborhoods. They would have to figure out the right person or the right path in the app to get a hold of someone who could represent them and take feedback. And then that needs to trickle through all the different levels of the organization to then be prioritized for work so that engineers could design a fix. And it's just very opaque. There are different levels to it. I don't think anyone is satisfied with how this process of actually resolving this issue would go because no one really has insight into the overall journey of that feedback. And I think overall, it just makes me think more about how people participate in the design of technology products and machine learning models, if there is a way to kill two birds with one stone where you can be transparent about all the different steps that go into building these things and then also involve people so that they can learn more about how to use it and contribute user design in that regard too.
There's definitely a lot more talk of this idea of participation in machine learning that has been going around conferences this year. And a lot of people have been excited about this idea of a civically engaged constituency that's able to make decisions around how things get trained and then also give direct feedback that then contributes to feedback loops of which biases end up getting encoded. JACOB: And I think this is touching on what you said, the question of: who are our users? And in the case of Waze, they would say, "People who are using our apps when they're driving. Drivers are our users, and everything else, like the streets and the congestion, are resources." So suddenly, that neighborhood has become a resource and not other people. And it's sort of, if you can shift that focus to, the neighborhoods that our drivers are driving through are also our users, then that completely changes how we design things, when we stop thinking about people as resources and start thinking about them as our constituents. CHANTÉ: Yeah. And in this instance, if we were just talking to the end-user, which would be ultimately the person who downloads the Waze app, whether you're a passenger or a driver, that's one thing. But then the stakeholders are the folks in those communities who are impacted by those unintended consequences. And in this instance, it kind of feels like it probably on the surface looks benign, but for me, I wouldn't want traffic coming into my neighborhood, onto my residential street, if it didn't have to, with my children there. And then that becomes a public health safety concern. So it's interesting. And I hope that companies do more. I do think that we're kind of inching our way, probably more than inching. But we're moving towards that model because of what we've experienced this year, that these decisions made by folks who don't know those end-users or stakeholders very well can impact public health.
And one kind of quick example of this is what are we doing with all the data that's being collected through COVID? I have a real concern as just an everyday citizen. I'm really nervous about what that means from a public health standpoint but also the long-term implications on surveillance of the population. MATTHEW: I think that's touching on a really important tension. You want to be inclusive of both users and stakeholders and kind of manage that trade-off to satisfy everyone as best as you can. But I think also bringing more people into your systems means that you are now tracking those people or you have data on those people. And I think that there's a lot of literature out there where there's a wariness around using technology to surveil more people. And the aggregation of data enables the abuse of power, not really intentionally, but just by design, being able to make decisions over larger portions of the population. This podcast is brought to you by An Event Apart. For over 15 years, An Event Apart conferences have been the best way to level up your skills, be inspired by world-class experts, and learn what’s next in web design. An Event Apart is proud to introduce Online Together: Fall Summit, a three-day web design conference coming to a device near you, October 26th through 28th. The Fall Summit features 18 in-depth sessions, each followed by a live, moderated Q&A session with the speaker, plus unique one-on-one conversations with some special guests. You’ll learn about advanced CSS from Miriam Suzanne and Una Kravets, design systems and patterns from Mina Markham and Jason Grigsby, design engineering from Adekunle Oduye, inclusive design from Sara Soueidan and David Dylan Thomas, and much more. Attending An Event Apart boosts your brain, inspires your creativity, and increases your value to your teammates, employers, clients, and most of all yourself. And you can boost it even further.
Purchase a Three-Day Pass and receive six months of on-demand access to their first three Online Together events. That’s a full six days of jam-packed content for the price of three. Greater Than Code listeners can get $100 off any multi-day pass with promo code AEAGTC. Once again, that promo code is AEAGTC. So grab your spot and join An Event Apart’s Online Together: Fall Summit, October 26th through 28th. See the full three-day schedule and register today at AnEventApart.com. MATTHEW: So there's this podcast from Data & Society based off of this book by Ruha Benjamin, an associate professor at Princeton University. And it's mainly talking about how race is a technology that has been used for kind of reflecting sets of biases and kind of systems of control from one generation to another. And in previous generations, you might've had the Jim Crow laws and segregation. In this case, surveillance and technology are kind of the new era of what is used for being able to track and control marginalized communities. So I think what's really interesting and kind of scary to me there is the idea that this is all serving interests that were already there. Nothing about this is new. And when we surveil people and gather more data, that can in itself be somewhat problematic. I think there are examples where people are overrepresented and hyper-visible based off of their data when it comes to over-policing in communities and kind of how technology and algorithms are used to justify interventions in communities of color. And then there's being left out and not being visible enough in data, when you're left out of resource allocation issues whether you should be followed up with for healthcare or whether you should be given a home loan. There's kind of this problem of being both hypervisible and not represented. And I think that's mainly because a lot of times these interventions are kind of just being used to reflect the status quo from before. 
I think about that a lot in terms of what is the balance of gathering enough data versus gathering too much data. CHANTÉ: Yes. That's a great question to always be thinking about - What's the balance? And who does it serve? Is it in service of the private corporation or the publicly traded corporation? Or is it in service of the consumer or some greater good? Perhaps public health and safety and world development and things like that. And I think the question always can be answered from a skewed point of view because we all hold differences of opinion when it comes to this. What I believe might be good for my community you might not. So it's just like an endless debate, but I think having it as part of the process is what's the most important. It's making sure that you're constantly asking it, just as you would with other things in relation to building a new technology or offering a new data service. And I just want to mention here too that I really appreciate you mentioning the Ruha Benjamin example. I've heard of her. I haven't had a chance to listen to that specific podcast. So I definitely will. When you said that, it made me think about Jamie Kalven and The Invisible Institute here on the South Side of Chicago. Are you two familiar with them? JACOB: Yes. MATTHEW: No. Tell me about it. CHANTÉ: Oh my God. I love their work. We should probably get them on the podcast too. But Jamie is a writer. And I don't know if he's an investigative journalist, but he certainly was chasing down how we were over-policing people on the South and the West Side of Chicago. And because of his work, he was able to get access to some data. And I think they had to start a lawsuit basically to get that data. But long story short, they have started The Invisible Institute, and they focus on kind of the impacts of technology on the predominantly Black and Brown community here in Chicago, and I think even outside of the city. Great work.
I'll drop a link to it in the notes so that people can check them out. MATTHEW: That sounds awesome. I definitely want to look more into that. CHANTÉ: Yeah, for sure. They stay on my radar. I look forward to watching. I think that some folks over there even have a podcast. If they don't have their own, the people there have been on podcasts. That's how I initially heard about them and the work that they're doing. MATTHEW: Yeah, I think this is emerging in hyperlocal ways. Like with this Chicago example, there's the gang database that's been tossed around multiple different levels of the state and districts, and it's a real data governance problem. CHANTÉ: Exactly. I think that they actually address that specifically as well because so much can happen from having -- While the databases are helpful for somebody because somebody had the idea to do it, it's extremely problematic. And the question of ethics is: do you even know that you're being surveilled, and by who, and what the reason is for that surveillance, and what the implications for your life could be if you are associated with some gang database as a teenager? And then 20 years later, you want to go get a job. What does that mean? And it's funny, not funny actually. But my sister and I were just having this conversation a week or two ago. I have family on the South and West Side of Chicago. If I go there, does that mean that I'm now affiliated with something, when I don't even know what's happening, just because I'm there visiting my family? And is that affiliation or association bad? And it's not for me to answer that question. It's people who are our policymakers, and lawmakers, and folks who are building technology and services with that data without my knowledge and consent. But nonetheless, I've become a victim even if I go visit my family. And so we see this happening. This is a big issue actually with young Black women, just to kind of give a poignant example.
Here on the South Side of Chicago, I was talking to some young women who were saying like, "I can't date any of the guys in my neighborhood. And if I do, it's a problem. But if I bring somebody from outside of my neighborhood, not only am I putting that young man at risk because there's a lot of heavy gang violence and activity over here, they could be associated with something that they don't want to be associated with just by visiting me." I thought that was really interesting. When somebody shared that with me as direct feedback, it just really hit me what that means. And this is why I was like, we've got to pay attention to what's happening with that data, what that means for people who are completely innocent, who have nothing to do with nothing just living their lives in these neighborhoods. MATTHEW: Definitely. I know that story was a large part of the driving force around the police shooting of Breonna Taylor just like the surveillance that was previously there. I believe her ex-boyfriend was affiliated with a gang and then kind of how it resulted in this unfolding of this larger story where she was then profiled and kind of placed under surveillance as well, and her home was put under police watch. So I just feel like they're all linked. What's happening in Chicago also relates to what's happening in Louisville, Kentucky. And all of these places are kind of being born out of these similar trends. CHANTÉ: Yes. Absolutely. Way to bring it right back to what's happening now. So I guess that would be a good segue. I would love to hear how these things that we're dealing with, for example, COVID and unfortunately, the killing and murder of George Floyd, and Breonna Taylor, and Ahmaud Arbery, how is that impacting you on a personal and professional level? MATTHEW: When it comes to speaking out about racial equity in the workplace, I don't think there's ever been a time as normalized as it is now to do so. I'm engaging with that in my own workplace. 
There are a lot of other people who are also interested in justice and wanting to talk about these matters and not go through this kind of dual world system where you show up to work and you don't bring any of your personal life with you. So I think more than ever, I've seen people who are calling things out inside of company Slack groups or teams' channels, and people who are forming small groups to say, "What can we do? How can we improve?" I'd like to think that this is more of a mobilizing call to action for technologists. And I think that a lot of it mirrors what's happening outside of technology organizations with grassroots activism associations and kind of the work that they're already doing for bringing attention to these issues. I hope to think that a lot of the lessons that are being reflected from more of the on the ground canvassing and kind of justice-oriented initiatives are being taught to people who work at these companies who are then finding courage from the moment to stand up for them. CHANTÉ: Since we all work at VillageMD, and I work on the diversity and inclusion team, can I ask you both, has talking about race or racial inequity -- for example, we had Courageous Conversations. Was that impactful for you? And if so -- and I know it was -- I'd love to hear specifically why and how that's showing up now in your work. I mean, you gave a great example of Slack and calling things out. But are you now more informed? Did it help with your day-to-day in understanding the positions that you play or the technologies that you're building? MATTHEW: At least personally on my side, it helped me realize that there were so many other voices who were people of color inside of what I previously thought was more of a homogenous white organization who also were struggling with the very same issues that are being talked about in the media and around racial inequity and police killings. 
I think what that enabled was knowing that there was kind of some inequity inside of the organization in terms of whose voices and stories were being uplifted. I think that within corporate technology, there is a bias towards kind of white, Asian, American voices. And more on the clinical side and maybe more in contract roles, you have people from Black and Brown communities filling the ranks there. And I think this is kind of part of a larger trend in the technology organization where there's white-collar work. And then you have contract positions or blue-collar work filled by communities of color, and there's this kind of ghost work class. So I think it helped me realize that there are a lot of these problems in our organization too. And that rather than just pretending they don't exist, it's very emboldening to bring it out into the light and be able to try and put those people in leadership positions to be able to set the agenda. CHANTÉ: Yeah, thank you for that. I think you're right especially in organizations like ours where we have a big kind of corporate body that is in one kind of corporate office but then we have folks who are dispersed all throughout the country kind of providing the boots on ground day-to-day, in our instance, clinical services directly working with patients. And as we know, now more than ever, those folks who do that predominantly happen to be women and people of color whether that's at the hospital or in the community clinics or just in integrated networks. And so just with both COVID and the racial inequities we're seeing play out on TV, it's just been an interesting and fascinating year. And I couldn't be more concerned but also more proud of the work I've personally been doing and just knowing for example, Matt, that you've taken a big part at VillageMD whether people know it or not here, I’ll just say it. You've been a great ally and advocate, and we've invited you onto the Diversity, Equity, and Inclusion Council at Village. 
Jacob is also another huge advocate and showing up. And I think it takes folks who are not black or other indigenous people of color to speak up and talk about these things so that we can actually have it work and have it infiltrated and kind of integrated into our everyday ways of working in systems. So I do want to call that out and just say thank you to both of you. And I want to see more of it. I'd love for more folks to really get emboldened this year by what the country is seeing, and not only this country, but other countries too. But because we're Americans and we're living here, I want to see this place be better. And I'm really, really nervous about what's to come in the next 6 to 12 weeks. As we get closer to the election and as we wrap up the year, I'm super nervous. JACOB: You're not the only one. MATTHEW: 100% seconded. CHANTÉ: Do you mind if we go there then? How are you both preparing? I didn't necessarily mean to take you to politics, but it is political. What can we do? As we move into the election season, as we get closer and we're kind of getting towards the end of the year, how is this -- all of these things that have happened in 2020 and knowing what we just talked about in terms of civic involvement and tech consciousness and things like that. Do you have any plans to change or make any difference in terms of your personal and/or professional world? JACOB: I would say that my wife and I are I think trying to get involved more in local politics, which maybe doesn't directly answer your question or maybe it does. We've been getting involved in the races in the city for our small city council. I feel like I live in a small town, and two people can make a meaningful impact in affecting political races in that respect. That's one thing. The other is I think there's been a lot more visibility in modeling of elections recently with 538 being the most prominent example. 
And I think something that was a big misunderstanding in 2016 was that there was this absolute certainty that Clinton was going to win, that all of the models agreed, all of the pollsters agreed. And it was this giant surprise. And I think it led a lot of people to sort of believe that at best, no one knows what to believe until an election actually happens or at worst, that we shouldn't have faith in our elections to begin with. I try to tell people, anyone who will listen to me, and that's not as many people now because of COVID, that we have to be skeptical of polls. We have to be skeptical of models that are going to try to predict what will happen in November. And on the flip side, we also have to be skeptical when the president says that there's going to be a rigged election. I think skepticism cuts both ways. And I think what I'm getting at is that I think for a healthy functioning democracy, we need people who are not going to be so easily swayed by ideas that maybe just confirm what they were hoping to believe and be more interested in what's actually true. CHANTÉ: Yeah, touché. MATTHEW: Parts of that definitely resonated. Among my peers, I've seen so many people getting involved with the political process for the first time, and being interested in phone banking, canvassing, taking an interest in local races. And who's the one who assesses property taxes in Cook County, or who is the water traffic controller or the financial controller for the city? There's just more of a general awakening and less taking for granted that this government apparatus will function in your best interest if you just stay out of the process. 
I think to what Jacob was saying, as people who put a lot of stock in data and making decisions based off of evidence and kind of taking better decisions over time, I think that we should be trying really hard to clear up misinformation and getting as many people as possible involved and voting whether it's absentee ballots or clearing up early voting rules and guidelines or just letting people know what they can bring to register on-site same day. I think that that's kind of the tangible sort of action for being able to contribute to the election since there are so many different ways where someone can dispute the results of the election and so many ways where you discourage people from going to the polls and making their voices heard. Anything that we can do to kind of counteract that, especially in a year of COVID, in a year of extraordinary campaigning, every little bit helps. JACOB: Yeah, well said. CHANTÉ: It makes me think about our friend, Nicole Tay. I don't know if you saw that, but she's been working on some blockchain initiatives as it relates to voting and what that means. And I wish we would have folks who were thinking about this. I mean, there were people thinking about it. But I think to be loud and to kind of socialize this idea of using blockchain as a technology or as a mechanism to make sure we have safe and secure fair voting is really important. And it's just another example of why you have to have consciousness when you're building out technologies and algorithms that impact the masses because it actually becomes a public health and safety concern and wellbeing concern for all of our citizens. So lots of thoughts there and potential for me to go down a rabbit hole, but I won't. When the girls get coding! Join us on your screens, October 13th, for the live@Manning “Women in Tech” conference to celebrate the rising movement of women in technology. We still have a long way to go to achieve diversity, inclusion, and equality in technology. 
Our contribution is the live@Manning “Women in Tech” online conference, October 13th, starring the women rocking the tech boat. Cloud navigators and serverless gurus; algorithm sorceresses and community advocates; we proudly bring you the women creating the tech world we live in. October 13th, live@Manning “Women in Tech” Twitch conference! http://mng.bz/oR5M CHANTÉ: Loving this conversation, Matt. I'd love to know if we can circle back slightly to when Jacob was reading your bio, for example, that you got your Master's degree in public health and you focused on medical informatics and what that actually means. You're not necessarily working as an informaticist, but in a lot of ways, you work at a healthcare company where that is probably highly embedded into your work. So I'd love to hear a little bit more about that if you don't mind. MATTHEW: Yeah, I would love to talk about that. So during the public health degree and the health informatics specialization, the main focus was population health and the systems around how public health contributes to systemic deviances between groups in terms of their health status, whether that's their cardiac health, or whether that's cortisol stress levels, or whether it's prevalence of certain kinds of respiratory illnesses across geographic zones. There's very much a focus on how do you look at the health of groups and how are you differentiating between different groups for your analysis? So coming out of the degree, I was definitely thinking healthcare feels like it's a very individual thing. You get sick, maybe you break a bone or you have hypertension, and your blood pressure is high. But there's this whole ecosystem of changes that informs how you got to that point. And then another ecosystem that determines what you're able to do about it. They're probably the same ecosystem, in fact, because that helps inform why you haven't done something about it to date. 
So I think working in health technology, there's a growing consciousness that it's a very difficult problem because a lot of the issues that concern healthcare right now are chronic diseases that have to do with weight, behavior, diet, activity, exposure to different environments and things like that. And when you get into behavior change through technology, it's both very sensitive to tweak. And then it's also very difficult to tweak without something like Facebook where there's massive consumer adoption of the service that you can use for conducting all these little nudges. So I think people are really hoping for something on the scale of Facebook or Google inside of health tech where they're able to incentivize people to be healthier, to see their primary care doctors more, or to eat less red meat. I think that they're still figuring it out just because there's this whole legacy of older medical systems. And I think there's a lot more inertia in transitioning the healthcare industry from pen and paper to EHRs to then making data-driven decisions and being able to quantify it and evaluate how they're doing year over year. CHANTÉ: I love this conversation and where you just took it. When you were talking, the one thing that kind of stood out was when you said "deviances." Are you talking about positive deviances in regards to healthcare, public health? Because that's kind of the way I understand it, but I'm not into healthcare informatics in terms of I don't know the specifics, so I might be speaking incorrectly. But I'd love for you to, if you know what I'm talking about, to speak to that a little bit. MATTHEW: Yeah, definitely. In this case, deviances implies just the negative differences that communities of color often face compared to people who have more socioeconomic status and funds, who have solid education in schooling, to be able to purchase healthcare, to be able to be health literate inside of their communities, to have access to fresh groceries. 
Where you live has a huge bearing on what you're able to access, and the diet that you're able to consume, and the exercise you get, and your own motivation for being able to do the things that are hard and do them in an efficient way that improves your health. To have the funds, to be able to go see a doctor, to have a doctor's office nearby that's accessible by transportation -- all of these little things build up to the kinds of public health studies that are able to screen by age, gender, race, socioeconomic status and then say, "Why are we getting such high rates of hypertension amongst older black people versus people who are young, working professionals inside of Manhattan or whatever?" CHANTÉ: Got it. Yeah. Thank you for that explanation. That is a question we have to ask, and some companies are better at doing it than others. If we think about specifically the example of where we all work at VillageMD, I would say we do it pretty well being that we are helping to kind of -- we're on the front lines of innovation as it relates to primary care. And I also think that as quickly as we can move, as a person of color, as somebody who's seen my own community and have my father, for example, I can't wait for technologists to get it right. Every day that goes by could be the day that he dies. And so I get super passionate about this and I just hope that again, one of my other prayers is just that we move as quickly as we can but as consciously as we can, putting people and real stories at the center of what we're actually building and making sure that we're doing it in service of them, not of us. So I appreciate that. And that kind of answers my question just learning that part about you and specifically, I think it's helpful. 
Matt, when you were looking at that program, were you specifically thinking I'll go into public health because I know I want to get into this particular part of it or were you kind of generally headed that way and then throughout the program, you decided I want to get into technology because it was my way to use this lever? MATTHEW: I initially started the Master's degree in the sociomedical sciences department. I had more of an anthropological approach, so thinking about health disparities. But I ended up switching over to health informatics kind of in that moment in 2015 where data science and data-driven decision making were having their heyday. And a lot of people were really gung ho about applying it to every different sector. There was a lot of momentum around the idea of efficiently allocating your resources and optimizing. And I think that was something that was really attractive to me because all of a sudden, I was seeing all these problems being mentioned in quantifiable and solvable terms that I never thought of before. So whether that's, how do you move the needle on whether you can reduce the number of flu infections you have per year? Or how do you just get people to exercise more? How do you solve general BMI issues across different parts of the country? Things like that and then quantifying the value of an intervention. I think that there was a lot of appeal to being able to feel confident in making a decision and knowing that you did your best research around it. So that was something that was a really powerful motivator, the idea that we're collecting data that has never been seen before in history. It's personal information about us that as we've discussed, kind of represents potential vulnerability but also it helps point out systemic inequalities in a way that's never been possible before. 
We can now point to actual differences between demographics for the first time and say, "This is literally what being born Black gets you versus being born Asian or being born White." And I think that's a really powerful signal as well. So that's why, long story short, I switched over in that degree, started going more in this direction because I saw that there was a way to kind of meet policy and data and technology and kind of meld all of those together, which was really exciting. CHANTÉ: I wish I had known about it sooner. [Laughs] And also my brain doesn't necessarily allow me to be successful when it comes to informatics from a technical standpoint. [Chuckles] At best, I get to talk to people like you and Jacob who are super brilliant and can actually do it and execute it. And so I understand the value, and I'm really excited. I want more people to pay attention to healthcare and medical informatics because I think it has a huge opportunity for folks in terms of job opportunity. We're going to have healthcare issues forever because the system is so broken. We're going to have public health issues, global health issues. And you're right that data and technology really have an opportunity to allow us to see things we never saw before or understand them in ways that we were sort of maybe thinking about it superficially and understanding, wow, this is very complex. There are systems upon systems upon systems that are determined by your social determinants of health and by your identity. And the particular health system that you sit in or the technology that that health system sits in or that health system uses -- all these little kinds of things inform the way you as a patient or consumer, depending on who you're talking to, use healthcare and whether or not you can be healthy in today's world. So lots to think about but a huge opportunity for folks who are listening, I think. MATTHEW: Definitely. 
JACOB: So now's the time in the show where we like to think about what was something that we might be taking away or a call to action for this show. So yeah, I've been thinking about who the unintended affected communities and people are via technology and not just unintended but unknown. Again, going back to that Waze example of how would I know that I am harming somebody necessarily? What are the systems I can have in place as a technologist that would enable those people affected to A, know who they could give that feedback to, and how it could be given to me and how I could then be able to act on it? So I think that's a really interesting question that probably needs to be considered more. CHANTÉ: Yeah, I would agree. Actually, that's what kind of stood out for me too diving deeper into what Matt mentioned, the algorithmic auditing as it pertains to ethics and what that means for organizations and leaders who hold that power and influence. And just to kind of understand more specifically what it means if you do it internally, because doing it internally could lead to more bias, or allowing a third party to come in and do it from an external standpoint but then losing the control. And it's just something that really stood out for me. And I appreciate that we got to dive into that today. I love the Waze example and you providing us with the Ruha Benjamin example too. That's going to stick with me. So I appreciate that, Matt. MATTHEW: For sure. Always happy to spread around these links. For me, something that came out of this conversation and especially Jacob's mention of unknown communities that are being impacted by a specific intervention or a technology product is just how far-reaching a lot of these technology platforms are within our society. 
And in certain cases, they're either complementing or supplanting things normally managed by the government whether that's being able to regulate the amount of traffic that goes through a certain area or thinking about how healthcare gets delivered, and who gets resources, and who gets loans. In certain cases, some of these things might be considered to be too important to just allow consumer purchasing power in a free market to regulate. So I definitely think with everything we talked about here with accountability, there are certain parts that internal organizations can't do by themselves. You kind of need someone to put their foot down and say, "I don't think that this is right. We need to just nip this in the bud here. And this isn't good for society as a whole, or there needs to be some kind of financial penalty attached to the way that you're doing things right now and some reparations or restitution to the people who are impacted." So I'm definitely just curious in the future to see how those mechanisms work. I think that there's growing momentum around it. In 5 to 10 years, I think that all engineers and all technologists are going to have to become more fluent social scientists in order to accommodate this shift. And I think I'm very excited personally just to be able to see that happen. CHANTÉ: Wow. That was awesome. What a great last thought. Very well-spoken. I couldn't agree with you more. I mean, you really just [Inaudible] honestly, something that people can probably think deeper on, and I'm hoping that that might lead to more invitations for people to come on the show. And Matt, I'm sure Jacob would agree with me that we would love to have you back on if you want to talk more about that. As your role and your understanding of technology evolves, let's continue the conversation. JACOB: Yeah. MATTHEW: I'd be super excited about that. 
And I'm just really grateful for both of you having me on and inviting me to kind of talk about my superpower for a little. CHANTÉ: Yes, it's a privilege. Thank you so much.