Keith McCormick Mixed.mp3 [00:00:00] What most people do is they jump to kind of that insight step, so they explore the data trying to get these interesting insights, and then the project stops. Well, if you think about CRISP-DM, you know, you've got business understanding, then data understanding, data prep, and then modeling. [00:00:40] What's up, everybody? Welcome to the Artists of Data Science podcast, the only self development podcast for data scientists. You're going to learn from and be inspired by the people, ideas and conversations that'll encourage creativity and innovation in yourself so that you can do the same for others. I also host open office hours. You can register to attend by going to bitly.com/adsoh. I look forward to seeing you all there. Let's ride this beat out into another awesome episode. And don't forget to subscribe to the show and leave a five star review. [00:01:31] Our guest today is an independent data miner, trainer, speaker and author. He's got a wealth of consulting experience in statistics, predictive analytics and data mining. [00:01:43] He's a sought after speaker who routinely leads workshops at conferences. He's given keynote presentations at many international events and is an award winning instructor for UC Irvine's predictive analytics certificate program. You may recognize him as the instructor of 13 courses on LinkedIn Learning, where he's taught over two hundred fifty thousand learners through his courses. Or you might recognize him as the author or coauthor of six books, including SPSS Statistics for Dummies and IBM SPSS Modeler Cookbook. Since serving as a VP of Analytics for a small consultancy, his consulting has shifted emphasis towards helping his clients build and manage their analytics teams. So please help me in welcoming our guest today, author of SPSS Statistics for Dummies, Keith McCormick. Keith, thank you for taking time out of your schedule to be here today. I appreciate you coming on the show. [00:02:44] I look forward to it. [00:02:45] So, Keith, talk to us about how you first heard of data science. What drew you to this field? [00:02:51] Well, of course, it wasn't really called data science at the time. That was that famous sexiest job article, and that's when we all started calling it data science. [00:02:57] I started doing this kind of work, depending on how you start counting, in the mid or late 90s. I was doing more traditional statistics in the mid nineties, and then I relocated to North Carolina, where I still live, figuring that I would do PhD work, but, you know, had to pay the bills. So I was looking for part time work, and I knew SPSS Statistics quite well, so I got an opportunity to start teaching their introductory classes. And next thing you know, I was heading up to Arlington, Virginia, frequently to teach those classes. And then a year after that, I was teaching 40 plus weeks a year all over the country. So obviously grad school didn't work out, and I was a software trainer for more than ten years, almost nonstop. [00:03:45] So most of that training and teaching that you were doing, was that for companies, in that kind of corporate setting, or was this in universities as well? [00:03:52] Oh, somewhat all of the above. But I was a contract trainer, so a number of my colleagues were full time, but I was just kind of brought in by the day. But I was working directly for SPSS Inc., teaching what initially were introductory classes and then more advanced ones.
So to answer your question, it was really somewhat varied, because I would be in an actual SPSS office with people from all kinds of different companies. But sometimes I was sent directly to the company, and sometimes they were more academic settings. I remember, for instance, having a group of researchers at the CDC, you know, and naturally I wouldn't have known their specialties nearly as well as they did. But it was so fascinating to have that job, because I would meet these experts and be showing them how to use the software. [00:04:42] So you mentioned you've been in the field since the 90s, and it wasn't until recently that it became known as the, quote unquote, sexiest field. How much more hyped has it become since you broke in? [00:04:54] You know, it's funny, because I spent weeks trying to figure this out a couple of years ago. [00:04:59] That sounds like a silly thing, right? But what was happening is conferences said, Keith, we really enjoyed the workshop that you did on introduction to predictive analytics, but could you rename it introduction to AI? [00:05:14] And I'd be like, really? We're just going to change the title slide when, you know, the meat and potatoes of the course stay the same? So I started to become really fascinated with why everybody wanted to start calling things AI that just a few years earlier we had called machine learning. And naturally, a lot of things really have changed. [00:05:32] But what I concluded was that 2012 was really the big year where the dominoes started to fall, that we're calling everything AI now, because that was the year that Hinton and his team won that ImageNet image recognition competition with the first neural net that had a bunch of layers to it, and deep learning was basically born. And it took a couple of years before that moved from the academics to everybody else. But that's really why we're calling everything AI now. So I think the trick is to remind people that that affects things like autonomous vehicles, medical imaging, Amazon Echo. I have one right here, so I don't want to call it by her other name; she'll start speaking to us all. [00:06:16] That stuff has been massively changed. But the day to day stuff that I've been doing all these years, loan defaults or insurance fraud, hasn't really changed. It hasn't gone through this dramatic [00:06:29] change over the last several years; it's really been those other areas. So I'm wondering, and it might be an alternative question here, but I'm wondering how much of an impact the use of tools has had on the name of the field, because when I was coming up in grad school, it was primarily SAS and highly specialized software for doing the statistics and stuff, and in recent memory it's moved into Python, which is a more general purpose programming language. What are your thoughts on that? Do you think having this technology, I guess, democratized, for lack of a better word, had any impact on it being called AI? [00:07:08] I think you're on to something. I think there's definitely something there. But, you know, when you think about R, how long has R been popular? Maybe 15 years ago was kind of really the start of the rapid rise of R. But, you know, that didn't prompt the change, did it? Right. So I think what was driving R
was open source. But I think there's a connection between the 2012 event that I was just talking about and the Python piece. Python has been around for many years, so it's a great language, but it was always thought of as a general purpose programming language; like, much of the Google search engine is written in Python. [00:07:45] But when I was starting out, certainly, even though Python existed, it wasn't associated with machine learning like it is now. So I think if you went back and looked at Google search data, I bet that TensorFlow, everybody coding in Python, Python taking over R, and everybody calling everything AI is probably around the same time. [00:08:08] Excellent project idea for anybody that's listening. [00:08:10] That would be a fantastic blog post. I would love to read it. [00:08:15] So you've got a great philosophy about the effective use of machine learning and analytics, that in order to make it work, we need to have effective teams. Talk to us about what an effective team looks like and how they operate. [00:08:28] Yeah, I'd be glad to. I am passionate about this one, because I think that most client organizations that I encounter, what they're missing, what prevents them from being effective, is effective analytics middle management. And by that I mean, for one, nobody can figure out who the data scientists should work for. You know, should they report to IT? That's probably true 30, 40 percent of the time. [00:08:53] But that puts a certain spin on things. Do they report to the line of business? Sometimes that happens. Maybe marketing analytics is the first use case, so they end up staying there and they're reporting to the VP of marketing. You'll see that. It's not at all uncommon, about 10 or 15 percent of the time, for data scientists to report to the CFO. And then, of course, you get the whole center of excellence model. So that's one problem that has to be overcome. But very frequently, as a consequence of that, data scientists aren't reporting to other data scientists, and you can imagine what happens then, right? People feel misunderstood. And, you know, that's why I think that even though it's a hot job market, at least that's the reputation, that it's a hot job market, data scientists sometimes don't stay 11 months. You look at a lot of folks on LinkedIn, they're actually quite successful, but they've had five roles in five years. So something's broken, because obviously they're getting that promotion, they're getting that new role, but they're not staying. And if they're not staying, I think there's a reason for that. So I think probably you need to have data scientists reporting to someone that really understands them, that has an appreciation for data science if they're not a data scientist themselves. And then you have to sort out who that team is going to report to. [00:10:11] So you also talk a bit about diversity on teams. Talk to us a little bit more about what you mean by that. [00:10:19] Yeah, I think the problem is that sometimes we get obsessed with technology, like we've already been mentioning Python. I think also we can fall into the trap of thinking Python is one thing. I mean, you know, you could learn TensorFlow, all these different packages; people can have different specialties within that. We also think that open source is open source. [00:10:40] Well, you know, try to tell an R programmer to leave the R that they're comfortable with and now they have to do everything in Python.
They'll be able to make that leap. [00:10:49] But nonetheless, there's a lot of specialties within open source. [00:10:53] So the problem with a lack of diversity on teams is, if you decide up front, you know, everybody's going to use Python, that's the only option, so now you tell HR that, and now you get coding exams and all that kind of stuff involved. So now a fabulous candidate comes around with 10, 15 years of health care analytics experience, and maybe they do OK on the Python exam, but they don't ace it, for exactly the reason that you mentioned: 15 years ago they would have been learning something else. Right. So they demonstrate competence, but they don't ace it, and the next thing you know, you have systematically eliminated anybody with more than five or 10 years worth of experience, which is not what you set out to do. Those young data scientists, are they contributing to the team? Absolutely they are. But in a five or six person team, do you want everybody to be a clone of each other? Go and grab that person who has that amazing Python background, but then also someone else who maybe is a little bit weaker in Python, because they learned in SAS, let's say, but they've got 15 years of experience. So that's really what I mean. It's really about the journey that they've taken to the job, that they have a diverse journey. Also, you prevent groupthink if people have different backgrounds. [00:12:09] That's a very, very important piece. Talk to us about how groupthink can inhibit or limit team effectiveness. [00:12:18] Well, I've got one specific thing that always seems to come up that drives me a little crazy, to be honest. Right. [00:12:24] There's this assumption, and I've already mentioned my background, and you've mentioned that when you were starting out, not that many years ago, it was more common that people would be learning this kind of stuff on predictive analytics workbenches. [00:12:38] But as an undergrad, I was a computer scientist, so I'm not anti code by any means. Right. But there's a bit of a myth around this that drives me a little crazy, which is, wow, if we just force everybody to use Python, the reason that we're doing that is that any code that they write while they're prototyping a model or putting a solution together is inherently deployment friendly, because code, by its very nature, is ready and easy to deploy. And of course, that's nonsense, right? Because you always have to rethink things between a prototype and putting it into production. So the notion that you're going to dismiss anyone that's using any tool other than coding in the same coding language that the team has adopted, you're working off a false premise, in my opinion. [00:13:27] You mentioned a little bit earlier about hiring and retention in analytics, that it is broken. Talk to us a little bit more about how it's broken and what can we do to fix it? [00:13:42] Well, of course, part of the reason that it's broken is that we get these crazy job descriptions. You know what, I know that you've spoken to other guests about that. But then, you know, what's crazy is the behavior that that drives. You know, I think one of the most common pieces of advice that people exchange on LinkedIn, a platform where I'm quite active, is, oh, just ignore the job description, because we all know the job descriptions are crazy.
[00:14:10] Well, you know, it seems to me there's an inherent flaw there. If the entire data science community recognizes that the job descriptions are nuts and therefore everybody should ignore them, you know, then it's broken in that sense. And then another piece that I'm passionate about, and I'm sure they're related, is that too few organizations consider hiring from or promoting from within. As passionate as I am about statistics, and that's an important piece, it's an important but small piece. [00:14:45] So I find very often folks coming from IT or BI become great data scientists. But then they don't have all the checkboxes in these crazy job descriptions, so they're dissuaded from applying, and that's one of the reasons that they leave, because they're doing Coursera or reading data science books on their own, whatever it might be. They know the organization. They know the data. They go and they say, wow, I really want to apply for this data science job. Well, you don't meet all the criteria. Next thing you know, they're gone, getting an entry level data science job at another organization rather than staying. [00:15:23] So we have all this turnover, and I don't think it's necessary. If we were to distill down the checkboxes into a few essential pieces that should be common among all data scientists in the field, what do you think those checkboxes would be? [00:15:39] Well, I think what's happened is we put so much emphasis on the coding that there's this huge gap between running the code that creates something simple like a decision tree and knowing the basic foundation and concepts of how the tree is grown and how to interpret it. [00:16:04] So for me, it's just absolutely bedrock basic stuff. To folks, this kind of thing sounds so simple; it would seem obvious that folks should understand decision trees like CART and linear regression, and they'll say, oh, I know that. But do they really? You know, regression in particular. I remember when I did relocate down here and I thought I was going to do a PhD, the topic that I thought I was going to pursue was psychometrics. So what I was doing is, I obviously had to get stats out of the way, but it was going to be psychology and statistics in a blender, basically. So I had some stats background already, so I audited the introductory stats courses just to get them off my plate before enrolling. And naturally, as I explained, that never happened. But those first two courses were basically a regression course. And this was a graduate level course, and for 10 months we did nothing but regression. So this idea that somebody copied and pasted some code from GitHub and they taught themselves regression in 20 minutes is just nuts. So I really would say those two topics are the most important. It seems like a short list, but if somebody really mastered those two things, I'm already kind of liking them as a candidate. And people would say, well, Keith, you know, you're talking about stuff that's either over 100 years old or, in the case of CART, from the early 80s. Fine. But the same guy who invented CART also came up with random forests, and his work also led indirectly to boosting. Right. So I really want to know that they know those basics. [00:17:41] Yes, it's about the principles, right? Understanding the principles from an intuitive kind of level. Like, you master this seemingly simple concept, or small concept, of a much larger body of work, and it sets the foundation for you to be able to go learn more and more complex stuff on top of that.
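As a minimal illustration of those two fundamentals, here is a sketch using scikit-learn; the tiny loan dataset and its column names are made up for the example, and the point is less running the code than being able to explain the coefficients and splits it produces.

```python
# Sketch: the two "bedrock" techniques discussed above, on made-up loan data.
# Assumes scikit-learn and pandas; all numbers and column names are illustrative.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeClassifier, export_text

df = pd.DataFrame({
    "income":      [42_000, 85_000, 31_000, 120_000, 56_000, 73_000],
    "loan_amount": [10_000, 25_000, 12_000,  40_000, 15_000, 20_000],
    "defaulted":   [1, 0, 1, 0, 0, 0],
})

# Linear regression: how does requested loan amount vary with income?
reg = LinearRegression().fit(df[["income"]], df["loan_amount"])
print("slope:", reg.coef_[0], "intercept:", reg.intercept_)

# A CART-style decision tree: which applicants look like defaulters?
tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(df[["income", "loan_amount"]], df["defaulted"])

# Reading the splits is the part that requires the conceptual foundation.
print(export_text(tree, feature_names=["income", "loan_amount"]))
```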
[00:18:00] Great, agreed. And I would say, from the statistics side of the house, I want to know if they know when to trust and when not to trust the data. I'm really big on that, you know. So this is the kind of thing you would perhaps ask as an interview question to a candidate: tell me about a time that the data seemed to tell you one thing, but the truth ended up being something else. If a serious candidate has never experienced that, I've got real doubts about what's going on, because they're just not bringing the necessary skepticism to the job, if they've never had an experience where the data led them down one path and they had to say, oh, actually, you know, the other thing is what's really going on in the population. [00:18:50] I'm excited to dig into that, so if you're listening, hang around; we'll get into this topic a little bit later. For someone who is the first data scientist in an organization and they're responsible for building up the data science practice, what are some of the challenges you would see them facing, and how could they overcome these challenges? [00:19:11] This is a tricky one because, of course, I'm coming at this as someone that's been an external resource for virtually all of my career. I think that a lot of organizations do this too quickly. In fact, I was asked by, I don't know if you've ever heard of TDWI, they do conferences and so on, the Data Warehouse Institute. But I was asked by them to do a webinar once, and it was called Your First Hire. And it was a whole hour on this topic. [00:19:45] But the basic premise was, what organizations figure is that if they just hire one unicorn and give them a head count, they'll be fine. But it puts that person in an impossible situation, because they're arriving with the budget for a team, but no team. Well, how can that person arrive, get to know the organization, get to know management, get to know the data in the organization and the challenges that are being faced by the organization, while running around trying to deal with HR to hire a team? You know, I think that's a recipe for disaster. So I think what's probably better, perhaps self-serving again as an external resource, right, but I think what is better for most organizations, is to bring in an external resource to help with a specific project and try to find existing talent within the organization to collaborate on that project. [00:20:35] So one thing I really don't like at all is when organizations use an external resource and they throw the data over the fence and then they just get the solution delivered on their desk. That's a nightmare. Right? But if you bring in the external resource and part of their responsibility is to mentor internal talent, get through that first project, it might only take six months, 12 months, whatever it might be, now you're in a better position to figure it out, because now you have some talent that you've identified internally that would report to that new data scientist coming in. So they're not coming into a team of zero. They're coming in with a couple of folks that have already been mentored on a project. I think that's the better bet, because otherwise it's really a very difficult situation to be all alone on a team that exists in theory but does not yet exist in reality.
[00:21:24] That question was really selfishly asked for myself, because I'm in that exact same situation. [00:21:31] But at the risk of putting you on the spot, are you finding it as challenging as I described? [00:21:36] It's definitely challenging, but I'm really liking the support from the organization to make it happen. So everything you've outlined is 100 percent true, and I find myself in this situation, basically listening closely to what you're giving me here. [00:21:52] Yeah, can I ask a follow up to that, if I could: is my observation that there's always good internal talent there also true in that case? It sounds like your organization really is supporting you in that way. Have you found that you're able to grow your team with at least some internal talent and not have to reach outside for everything? [00:22:12] Yeah, so I've been able to tap into some borrowed resources from some other cost centers and rope them into projects and get them to help out. And I'm a big fan of remixing talent, because it just helps people with that. They could be in chapter two of their career, chapter three or four in their career, and they've got all the skills, but they just need to think about how to apply them in this new way, in this new context. So, yeah, I'm all for remixing talent. I think that's awesome. [00:22:39] You've put your finger on one of the best ways, I think, to do that, which is keep the existing internal talent in their current role, but have them have a temporary dotted line to the project during the lifetime of the project. That's absolutely the best way to do it, because I think that before someone contemplates leaving their current department and joining the data science team, they should never do that until they've done a project from start to finish as a borrowed resource. [00:23:12] Then you make the decision, because think about it: you don't want somebody to officially sever their relationship with their previous team, whatever team it is, get through the very first project and then regret it. That's just nuts. Why do it that way? Have them be a borrowed resource, even if it's only half time, and then at the end of that first project, have them sit down and say, where do you want to be in two years? Do you want to be on this team? Do you want to stay where you are? Do you want to occasionally be a borrowed resource? [00:23:38] I love it. That's absolutely the path we've taken at my current company, so I'm glad to get a bit of validation there. Yeah, good. [00:23:46] Good for you. Because I'll tell you, it's not the most common scenario. I probably see it maybe 20, 30 percent of the time, but it's absolutely what I've experienced to be the most successful way to do that. [00:23:57] So, taking the conversation in a little bit of a philosophical direction here, talk to us about what you think the goal of analytics should be. [00:24:06] Whenever I'm asked this, I always give a similar answer, and it may seem technical at first, you know, not to us and our immediate audience. [00:24:17] But, you know, sometimes if management asks me that, they're initially taken aback by the answer. I always suggest that when someone's trying to vet projects, I mean all projects that the data science team might be facing, to take this concept at least out for a spin and see if it fits the problem, because it fits the problem most of the time.
[00:24:39] And that's to ask whether or not what you might be facing is a binary classification problem. You know, are you trying to make a yes or no, are you trying to provide a yes or no answer to some decision? Because so frequently you are, like in predictive maintenance, whether or not you have to take something out of service for unplanned maintenance, or whether or not you are afraid that someone is going to default on a loan. [00:25:03] So there's some intervention strategy, like the offering of a refinance or something like that, or a potential fraudulent claim gets routed to the investigative team. And again, it sounds perhaps overly simplistic, but the reason that this is so powerful is that if you can frame the question in this form, you have massively increased the likelihood that the model will be deployed, because that scenario that I've just described is so deployment friendly. I've never sat down with an executive and talked about whether we can frame it that way for 15 or 20 minutes and not have some positive outcome. [00:25:33] We either conclude that we can, and then we're off to the races, or we conclude that it's more complicated in some way, and maybe it's more complicated in some way that we can address. [00:25:44] Maybe it's so ill defined and complicated that we actually step away from the project and say, well, maybe this isn't the right project to do, because we can't even agree upon what we're trying to accomplish. [00:25:55] So now the philosophical question here about the use of insights: what do we use these insights for? [00:26:01] And why all the rage about them? [00:26:03] Well, I mean, who doesn't like insights, right? I was just chatting with somebody earlier this week, and the way they phrased it, very common phrase, actionable insights. But what do people usually mean by that? [00:26:15] It means that you look at the data, kind of poke around, explore it for an indeterminate amount of time. These days it's usually for two weeks, because that's the length of a sprint. Right. Which is a whole topic; we could do a whole hour just on sprints and agile and to what degree agile aligns with predictive analytics. So that's a huge topic. But, you know, sometimes it's hard not to mention that that's out there. [00:26:42] And then, for a lot of organizations, agile just means that every two weeks we have a meeting, a super casual thing, and colleagues of mine that are really well trained in agile would say, no, no, that's not agile, that's nuts. But nonetheless, let's say you're in an organization that's really all about the sprints. So you give people the first two weeks, and the goal of the first two weeks is just, oh, just see what the data is telling us. You know, we'll call it a POC, we'll give it a fancy description, but really we just kind of poke around the data, we make a discovery or two, throw it in a slide deck, have a meeting with management, and we've supposedly reached some kind of interesting milestone. The problem I have with this is you just kind of go in circles, you know, and when is that thing going to end? [00:27:25] It's almost like you're still playing pinball or something. You know, it's like, if you do get an actual insight in the first two weeks, then you win a bonus round and they give you another sprint. You know, I would much rather have someone try to develop a deployable model and get the actionable insights along the way.
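A minimal sketch of what that binary framing can look like in practice, assuming a hypothetical claims table with a yes or no fraud label; the file, column names, and cutoff are illustrative, and the point is that the output is a per-claim propensity score that can be routed straight to the investigative team.

```python
# Sketch: framing a business question as binary classification.
# The claims.csv file and its columns are hypothetical stand-ins.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

claims = pd.read_csv("claims.csv")          # one row per claim
X = claims[["claim_amount", "days_to_report", "prior_claims"]]
y = claims["is_fraud"]                      # the yes/no decision we care about

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# A propensity score per claim: the deployment-friendly output described above.
scored = X_test.assign(fraud_score=model.predict_proba(X_test)[:, 1])
to_investigate = scored.sort_values("fraud_score", ascending=False).head(50)
print(to_investigate)
```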
Developing a deployable model, I think, is a whole lot more productive, because I also want to have some sense of the scope of my project. So the problem for me is that insight all by itself is too unfocused. I like the insights, but I want my project to be more than just insights, and I'm definitely in the minority on that one. [00:28:06] Talk to us about the goal of achieving a deployable model. Once we've achieved that goal, then what? [00:28:13] Well, I mean, you go live with the model. So let's say it's insurance fraud. Then every month you are generating a list of claims that should be investigated that go to the investigative team. And, you know, something that people forget, it's obvious once you think about it, but people really do forget this, is that existing systems were already in place. It's not like the investigative team was waiting for the model to be done; they were already doing that. So this is another reason for all of the binary classification, because what you're really doing is marrying this new propensity or risk score that you have with existing business rules. And that's why in CRISP-DM, for instance, the evaluation phase, and I know a couple of the authors pretty well, was nearly called the business evaluation phase, because that's really what it is. [00:29:06] You've got this model, you take it out for a spin, and it could take months. I mean, you know, investigations might take a couple of weeks to complete. So for six months you might be in this evaluation phase, checking to see if you can move the needle in the right direction. That's very different than making some discovery, like, wow, we've discovered some new kind of fraud, and you put it on the slide deck, and then there's either some new policy or you socialize that within the organization. How can you measure the impact if the insight was delivered in the form of a PowerPoint slide? Right. But if the insight comes in the form of a risk score, which is married with your business rules, you can measure almost to the penny, in fact, you probably could measure to the penny, exactly what the impact of that deployed model was over the months that you're measuring that impact. So it's a totally different world, no question about that. [00:30:01] Thank you for that. What's up, artists? I would love to hear from you. Feel free to send me an email to theartistsofdatascience@gmail.com. Let me know what you love about the show, let me know what you don't love about the show, and let me know what you would like to see in the future. I absolutely would love to hear from you. I've also got open office hours that I will be hosting, and you can register by going to bitly.com/adsoh. I look forward to hearing from you all and look forward to seeing you in the office hours. Let's get back to the episode. [00:30:53] What are some things that we need to monitor and track once we've got this model deployed and it's out there in the wild? What are some things that the data scientist should care about, should look at? And on the flip side, from the business understanding side, what are some things that they would care about?
[00:31:11] Well, one of the things I want to briefly revisit is, when we were talking about teams and existing internal talent and borrowing talent from other places, one of the real world situations that I encounter quite often is, how can we utilize somebody like an intern that maybe we're going to have for four, six, eight, 10 weeks? If you've ever managed a resource like that, it's really tough. It's like, gosh, how am I going to get them up to speed? And then, next thing you know, they're gone. [00:31:43] Well, monitoring is the perfect way to utilize a temporary resource like an intern, because it's tedious work on one level. [00:31:53] But it's also important, and therefore kind of interesting, on the other, especially if you're only doing it for a couple of months, because what you're basically doing is checking to see what the business rules would have said in the absence of the model. What did we investigate because the model said so, but the business rules didn't say to do so? You're basically doing some simple arithmetic, but that arithmetic is giving you some really important numbers. And tied to this is that people get mixed up about why models degrade. They go, yeah, yeah, we've heard that models degrade, but we're rebuilding our model every night or every month or whatever, so our models are not degrading. No, your model is still degrading even if you're automatically rebuilding it. And the reason is that the model is only one step in a long process. You're also making assumptions about what data is relevant, what variables are used in the model, and all that all-important feature engineering that we're always talking about as data scientists. So those decisions eventually degrade, even in the presence of an automatic rebuild. So somebody has got to keep their eye on the monitoring of those models. [00:33:05] So, years ago, what used to happen is one person would wear all the hats. So you'd have your data scientists; they were usually alone 15, 20 years ago, they were often the only one. [00:33:17] And then, you know, they just often wouldn't monitor as carefully as they should, because they're in the middle of a new project right now. So I think that once you've got four, five, six, eight or more models, it's really great to grab somebody from IT that aspires to go into data science, maybe they're in their first career job, they're just starting out, the kind of person who has the stack of data science books on their nightstand, you know, and say, OK, for about a half day a month, [00:33:48] you're going to check these eight models, you're going to check to see if they're generating the logs they should. Think about it: you could have a model that's pulling from eight or 10 different sources. One of those sources could go down and the model is still kicking out scores. If you're not checking that once a month, that actually could go on for quite a while, depending on how you've set it up to give you warnings. And I've worked with clients, one in particular really striking: I worked with a client that had a model that had been running for about 15 months. I read the documentation that the consultants, it was an external resource, had created; it seemed like a really good model to me. They said, you know, the sales team just doesn't trust the model. And I said, well, did it just drop off a cliff, or was it slowly, slowly getting worse? They said, well, I don't know. No one's checked that since we launched 15 months ago. What could be more fundamental to diagnosing the problem than whether it slowly degraded or whether suddenly it was like the lights going out, you know? And boy, they were so stubborn about it, because they did not want to try it. They just said, like, I don't care what went wrong, just fix it. It never should have happened if they were just checking a few minutes each month.
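A minimal sketch of that monthly monitoring arithmetic, assuming a hypothetical log of scored claims; the file name, the stand-in business rule, and the score threshold are all illustrative.

```python
# Sketch: the monthly monitoring arithmetic described above.
# scored_claims.csv is a hypothetical log of one month of scored claims.
import pandas as pd

log = pd.read_csv("scored_claims.csv")

# What would the old business rules have flagged, and what did the model flag?
rule_flag = log["claim_amount"] > 25_000     # stand-in business rule
model_flag = log["fraud_score"] >= 0.80      # deployed model's threshold

print("flagged by rules only:", (rule_flag & ~model_flag).sum())
print("flagged by model only:", (~rule_flag & model_flag).sum())
print("flagged by both:      ", (rule_flag & model_flag).sum())

# Simple health check: are all upstream sources still feeding the model?
# A source quietly dropping to zero rows is exactly the silent failure to catch.
print(log["source_system"].value_counts())
```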
[00:34:57] Thank you very much. That's some really valuable information for the audience, and I love that take there. So, continuing on this kind of thread here, can we talk about how to define, track, and measure the ROI of an analysis, to make sure that the work we're doing is actually delivering value for the business? [00:35:19] Well, you know, this is a little bit tough to talk about verbally, right? Because in a sense it's so visual. But the way that I would have people picture it is, as data scientists, we spend a lot of time during the modeling phase thinking about confusion matrices: our true positives, true negatives, false positives, false negatives. Right? I think everybody's familiar with that. So what it really comes down to is trying to quantify financially, because, you know, even a nonprofit, I have one colleague in particular that's always reminding me, you know, nonprofits don't have ROI. [00:35:52] Well, you know, but if you're saving labor, for instance, you can turn that into salary. [00:35:59] So it really is money most of the time. But for all of those four possible outcomes, true positives and negatives, false positives and false negatives, you should be able to associate some kind of monetary amount. And, you know, short of walking through a specific example, I think that alone gives people a pretty good idea of what it's like. [00:36:18] Yeah, absolutely. What are some steps that we could take, then, to turn a business problem into a data science research question? [00:36:26] Number one, again, I think, would be posing the question: can this be a binary classification problem? [00:36:36] You know, I'll tell you what frequently happens. This is potentially a big topic, too. But senior management will usually think what they want is time series forecasting. [00:36:49] Because that's what they learn in school, especially if they're MBAs; you know, they're more familiar with time series forecasting, which, you know, everybody can just picture, looking at stock data, IBM's price fluctuating or something like that. That's what they think they want, at a high level. So usually you just have to kind of gently push in this direction, that what is more actionable is what a lot of folks in data science call a micro decision: not trying to forecast how many frauds we'll have next month, but whether this claim ID is associated with fraudulent activity, rather than looking in the aggregate. Because think about it, senior management, they're usually managing in the aggregate. Makes sense; that's what they do. So this gentle introduction to the notion of a micro decision, I think, is really kind of the wake up call for them. [00:37:41] And then, you know, in fact, it leads to a broader problem, doesn't it, which is that we have to find a way to briefly introduce executives to be able to think like a data scientist, so that both senior management and the data scientist can meet halfway. That's where that analytics middle manager really comes in. But if you can get senior management to think in terms of micro decisions, the rest starts to fall into place.
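To make that ROI picture concrete, here is a small worked sketch of the arithmetic: attach a dollar value to each of the four outcomes and roll it up. Every count and dollar figure here is hypothetical.

```python
# Sketch: (entirely hypothetical) dollar values attached to the four outcomes
# of a deployed fraud model, for one month of scored claims.
counts = {"TP": 220, "FP": 300, "TN": 9400, "FN": 80}

value_per_case = {
    "TP":  4_000,   # fraud caught: average loss avoided
    "FP":   -150,   # investigator time spent on a legitimate claim
    "TN":      0,   # correctly left alone
    "FN": -4_000,   # fraud missed: loss incurred
}

monthly_impact = sum(counts[k] * value_per_case[k] for k in counts)
print(f"Estimated monthly impact: ${monthly_impact:,}")
# 220*4,000 - 300*150 - 80*4,000 = 880,000 - 45,000 - 320,000 = 515,000
```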
[00:38:06] So, you've got a tremendous career working in data science consulting. Talk to us about what the difference is between working for a regular organization and working as a consultant. [00:38:20] One of the things that I love about it, just going on a personal note, like what the day to day is like, I love the fact that I've worked in so many different industries. I did a gig with a regional casino once; that was interesting. I've never worked with the big casinos in Vegas, but I have friends that have, and you can guess the kind of work that you do, loyalty programs and so on. [00:38:40] I did a brief gig with the Navy SEALs. That's always so funny because it's so unexpected. It was looking at personnel profiles with text mining. And I actually was in the Army when I was in my twenties, I was in the reserves, so I briefly had a clearance. [00:38:59] But it's not something that I've maintained, so I couldn't look at the personnel records, but I taught them how to use text mining software. And, you know, we went to the whiteboard and we talked in the abstract about these records, even though I couldn't examine them personally. But, you know, that's just tremendous variety. And then there was a project that was about staged accidents that almost certainly had kind of an organized crime aspect to it, because doing a staged accident is a big, complicated thing; you're faking invoices for MRIs and all kinds of stuff. So it almost certainly was organized crime. So on a personal note, I love the variety that's involved. [00:39:38] But switching back to contrasting that with what life would be like as a data scientist on the inside, I think that one of the challenges, I mean, it seems like a good thing at first, but I think it's actually a challenge to be overcome, is that as an external resource, there has to be this process where the client wants the project done, you're trying to determine the scope, and you're working up a contract. And having done dozens of these, I can tell you, writing contracts and going through that sales process is not the most fun part of the work. And it's also something that a lot of young data scientists probably aren't ready for if they were to be a freelancer. But think about that compared to working within an organization: as we were just talking about a few minutes ago, that process never happens. So a lot of times the process of deciding, is this going to take 12 weeks or 20 weeks? Is it going to be three people or five people? This person, the subject matter expert in another department that I really need: has it been reduced to writing that I have access to them one day a week? It's never reduced to writing. It's never thought out. So I always have that contract to go back to as an external resource and say, hey, you know, we kind of agreed upon this a few weeks ago; we either have to stick to the plan or modify the plan. Right. And I've got something to refer to. And I think internally that step never happens, which would seem to give you freedom, but sometimes halfway through the project you might wish that you had agreement on those issues. [00:41:07] I was wondering if you had any tips for people who are trying to maybe break into data science, but do it through freelance work.
[00:41:16] This is tricky, because, of course, this is the way that I've done it all these years and I've really enjoyed it. [00:41:21] But what makes it tricky is that, if you reflect back on my career, I was fortunate in that by the time I started consulting, I had been doing software training for quite a few years, so I could refer to that experience. [00:41:37] And when IBM bought SPSS, it was IBM themselves that sent me on my first few consulting gigs. [00:41:45] So I absolutely encourage people to do it, because I've loved [00:41:49] the freelance aspect of my career; I would never change that decision. But the trick is, in the current environment, with so many certificate programs and boot camps and things like that, I think that probably before someone sticks their toe in the water, they want to be able to establish themselves a little bit in some way. So if they're working with an organization, in other words, if they aspire to be freelance but they currently have a job, then I would want to be the kind of person that we were talking about a few minutes ago, that gets assigned to a project in a visible way. Because, I mean, otherwise, how are you going to do it? If you're going to be freelancing at some point, you have to do your first project, and you probably can't do your very first project as a freelancer. You have to get that credibility somewhere. [00:42:37] Another one that I'm a big fan of, and I wish I had done a little bit earlier in my career, was writing a book. [00:42:44] You know, the first book I wrote was in 2013, so quite a few years ago. But when you think about it, I was 15 years into my career, and I know a lot of data scientists who took that step earlier. You know, it's nights and weekends, you log the hours, and most people aren't going to make very much money from the book, but it's a huge way to establish credibility. And LinkedIn Learning asked me to do a course on a related subject that was actually a lot of fun to do: Side Hustle for Data Scientists. And one of the specific recommendations that I made in the course is trying to be a technical reviewer for a book, because that's something you could do literally a day out of undergrad. Right. Or if you were a really gifted undergrad, you could do it while you were still an undergrad. Because what a technical reviewer does in a technical book, whether it's an R book or a Python book or an SPSS book or whatever, and think about the huge volume of books that O'Reilly and Packt and all these other companies put out every year, the role of a technical reviewer is to basically check the code, check the steps. It's almost like being a beta tester for a book. And even though the pay's not much, you're not going to retire on being a technical reviewer, you can list that you were in that role; you're not on the cover, but you're in the credits for the book. And I think if someone did that just once, they would have just a little bit more leverage when they went to submit a book proposal to write the book themselves. [00:44:22] You know, now you've got a little bit more credibility, right? You go up to that client, because what's going to happen when you try to do your first freelance project? People are going to look you up on LinkedIn, they're going to do whatever search. Winning a couple of Kaggle competitions, you know, not winning literally, but doing well in a Kaggle competition, might be some credibility.
But there's something about being involved with a book that really puts a client at ease that you know what's going on. [00:44:52] That's some awesome advice, thank you for sharing that. I actually didn't have any clue that there's a course on LinkedIn Learning called Side Hustle for Data Scientists, so I will be sure to link to that in the show notes. That is amazing. [00:45:06] My acquisitions editor loved that title. Well, we both did. When I said, wow, you know, is there anything in the library on this subject, because not all data scientists have salaried jobs, he goes, no, there is nothing yet in the library about that; we should do something about that. So I worked up a proposal we both liked, and it was a lot of fun, actually. [00:45:27] That's awesome. So, let's take another philosophical turn here. [00:45:32] Talk to me about what it means for you to be a good leader in data science. And how can someone who is an individual contributor embody the characteristics of a good leader without necessarily having that title? [00:45:47] Well, you know, everybody kind of develops their own leadership style, and although my military career was fairly brief, this was something we would talk about. [00:45:59] I don't know if everybody knows this, but when you're on an ROTC scholarship, you're just a college student like everybody else, but like one night a week you have to take these Army classes, and they're, you know, mostly on leadership and so on. My leadership style has always been very much about protecting my team. That's just always been my emphasis. So when I'm in an analytics middle management role, the number one thing that I'm trying to do is make sure that my team is protected against just kind of random, bandwidth-filling, ad hoc stuff. It might even sound kind of silly, like, well, of course you don't want to do that. But think about how much pressure, how much tremendous pressure, frankly, is on a data science team lead to accept every project that comes down the pike, right? So a project lands on your desk, and one of the team members just finished something, perhaps, so it's obvious that they have some bandwidth, that they have some time. But then it's something like, let's say, a dashboarding project or something. There's tremendous value in that, right? But if that person was hired to do predictive analytics, they don't want to be on a dashboard project. And you have to push back, however diplomatically and gently, to protect your team against that. Otherwise, next thing you know, you can have a data science team that's basically like the help desk that does everything that comes their way, and the team's going to be frustrated and they're not going to be happy. So, you know, if I'm running a predictive analytics team, one of my number one things is making sure that we're spending our time doing predictive analytics, because this other stuff might overlap with our skill sets, but it's not going to show that ROI. And if I'm not demonstrating ROI, I'm not positioning my team the way that they should be for lateral moves or promotions or their next opportunity. [00:47:52] I love that. It's an excellent leadership philosophy, definitely one to adopt. So another philosophical question here: is statistics the same as data science? [00:48:03] Well, part of the problem is none of us know what data science is, right? And by that I mean it's a puzzle that we haven't figured out yet. [00:48:12] The term is being used in so many different ways.
So, something I think that hasn't come up in conversation is that for about the last year I've been involved with an organization called IADSS. [00:48:28] We can put that in the show notes as well. But I encountered them at the KDD conference last year in Anchorage, and some folks may know the name Usama Fayyad, who was one of the original chairs of the first KDD conference and helped coin the phrase. Right. So he's been around for a long time. He also has another interesting distinction: he was the first chief data officer ever. He was the first person to hold that title, when he was at Yahoo! Anyway, he and a colleague, who we also know and work with, were presenting about a survey that they did of data scientists, trying to figure out how mid career data scientists describe themselves, and, you know, it was somewhat all over the place. But they're trying to figure that out. So they are working with employers, applicants, universities, boot camps, to try to define all this stuff. So there's no question that statistics is part of data science. But there's also no question there are some people that self-describe as data scientists who aren't particularly interested in statistics at all. So I wish I could give a better answer than, no one's figured it out. I think that knowing statistics is fundamental, even to machine learning, much less the broader topic of data science. [00:49:46] But there's no question that there are some people that go through some kind of a program that's called a data science training program and then call themselves data scientists and have minimal, if any, stats background. I don't think they need to have a lot, but I don't think you can do data science without some stats. [00:50:04] One hundred percent agree with that, not just because of my background as a statistician, but for the reasons you just said: how can you do machine learning without really understanding statistics? Thank you for sharing some insight on that. So, you've got this awesome course that I spent the early part of this week going through, and I really enjoyed it: the non-technical skills for data scientists. I was wondering if we can walk through some of these non-technical skills that you talk about in the course. We'll pick a few here, but I wanted to start with one that I really enjoyed: embracing ambiguity. Talk to us about what that means. [00:50:36] Yeah. So, well, obviously, all of these are inspired by my consulting career and what I've grown to learn is important, you know, about this work. But this one in particular makes the top 10 or whatever; it's number four or so in the course, I don't remember exactly the names that I chose to represent these traits. But when I'm teaching workshops, I often have folks come to me at the break, and I understand why they might get frustrated, but usually the question will come in this form. It'll be, wow, I'm really enjoying the workshop, Keith, but I've been trying to write down the steps that you used to solve problem X, and I'm kind of having trouble, because I feel like I don't have a recipe here. And next time I face a problem like this, I want to be able to follow the exact steps so that it will be easy. Right. And I say, well, the reason the steps weren't clear is because there is no cookbook here. You know, there is no recipe. It's about figuring things out. I've always liked, you know, it's a really old TV show now, Columbo.
Now, I'm sure they've all heard of it, but if you're not a fan, you might not know that he pretty much knows who committed the murder within the first five minutes. The show isn't about him figuring out who did it. [00:52:01] The show is about him trying to establish the evidence to prove it, and that really feels like what we do most of the time: we get a hunch of what's going on in the data, but we're crossing our t's and dotting our i's, right? And there will be all these twists and turns along the way. So if you're going in, either as a career or within the context of a single project, figuring that you're just going to get a book off the shelf or copy and paste some code from GitHub and everything's going to be cookie cutter, you're in for a surprise. [00:52:35] Kind of get comfortable using a compass, not a map, right? I like that; that works. You talked about this skill of cognitive empathy. Talk to us about this one. [00:52:48] This is one of the fun examples to share in the course, but it's also one of my favorite things about doing this kind of work, which is that we're always trying to solve people's problems with the math that we do in data science. And the reason that that's the case is that, well, the majority of the time there's some kind of customer that's involved. But even when there is no customer involved, like people might think, oh, if we're doing predictive maintenance, then, you know, it's more an engineering challenge, that's not really true. [00:53:19] People are using the machines, and repair people are doing repairs, and mistakes can be made. Someone was just sharing an interesting example with me where one of the things they look for, with this kind of Internet of Things data that they have, is that people sometimes brush up against this thermometer. He was describing it verbally, so I've never seen it, but people will sometimes brush up against the equipment and that will cause a problem. So they're trying to see that in the data. But that's what I mean. You have to think like you're the repair person, you have to think like you're the customer, to understand what's happening in the data. So I'm such a fan of this that I am somewhat notorious among my colleagues. They think I'm somewhat crazy, that I want to spend a half hour on projects listening in on the call center, or, if it's a retail kind of a thing, I might want to go to the store and just walk around for a half hour, simply to understand what are the people things that are generating that data. And it really helps me analyze the results and build a good model. [00:54:21] 100 percent agree with that. I remember, well, I've been at my company for almost a year now, and I started working on a project and at first just kind of dug into the data, started building my models and stuff. And eventually, after my probationary period was over, I was able to fly to our other office in Atlanta, where the people who would be using my model were. So I got a chance to spend like two days sitting with them, watching how they work and how they're going through their decision process, because I'm trying to model their decision process. And I came back, and just with the insights I got from that, I was able to build a model that was so much better than had I not done that. [00:55:00] Yeah, absolutely.
You know, and there's another aspect to it, too, which is you have to have cognitive empathy when you deploy, because you have to reflect on, you know, how is the sales team going to react when the order in which they reach out to customers is determined by the model, or something like that, or the priority that a particular customer gets is determined by the model. How are they going to react to that? So you have to really be able to think not only like your customers, but also like the end users within the organization. [00:55:35] A lot of people break into data science because they hear it's the sexiest field, they see the pay scale, and obviously you get paid a good amount of money, and they get drawn to the field for that reason. Talk to us about why it's important that we have a commitment to our craft. [00:55:51] Well, I just can't imagine that they're going to enjoy it over the long haul unless they love it. It's fun to think about all the other careers that folks do because, you know, their folks want them to be a lawyer or something like that. Think about all the people with law degrees that aren't practicing law, because, you know, their dad was a lawyer or whatever and they felt pressure to do it. So I haven't met too many people that became data scientists because their parents made them become data scientists. But I agree with you that the draw because of the income is there. But I think it's ultimately the same: if you're choosing the career for some external reason, you're not going to stick with it and you're not going to do your best work, especially since it's endless reading. I consider myself lucky. I mean, this has been a stressful year for all of us, let's face it, with COVID. I mean, our schedules have been turned upside down and so on. But I've been able to get books off the shelf and look things up that I've been meaning to for months or years, and finally I'm here, because sometimes I would go on trips for weeks at a time. How many books can you bring with you? You know, so if it wasn't on my laptop, I wasn't going to read it. In a typical year, I always made sure that, like, four to six weeks a year, believe it or not, I would be not just speaking at a conference, but sticking around for a few days and stuff like that. [00:57:14] This year I've been able to do twice that, because I've had so much less time standing in line at security. So that's the other thing: if you're not committed to this as a career, how the heck are you going to keep up with all the learning that's required, even in year twenty-five or whatever year I'm on? [00:57:28] That's the one thing I absolutely love about this field, the lifelong learning. You have to be a lifelong learner to really thrive in this field. Another one of these skills that's really been a core focus of what I've been studying the last few months is persuasion. There's a lot of great courses on LinkedIn Learning for persuasion, and I've been getting into a lot of Robert Cialdini's work, and this is fascinating to me, and I think it's a critical skill for data scientists to have. Talk to us about this, talk to us about persuasion. How can we be more effective with our persuasion skills in data science? [00:58:05] I was reflecting on something similar earlier today. I think we all do this: you know, you drive in and you run an errand, sometimes just alone in the car, and you do some of your best thinking.
And I was thinking, how have I learned some of just the basic business knowledge that I have, and how would I recommend that others do it? I think it's somewhat related to cognitive empathy in this way. Right. You have to be able to think like the VP of marketing or the CEO or the senior VP of maintenance or what have you. Right. But then how the heck are you going to do that if you're, you know, late twenties, early thirties, what have you, starting out in your career? How are you going to do that? If only I had had resources like LinkedIn Learning and stuff like that when I was in my late twenties, because I definitely didn't. I mean, back then, you know, you'd spend five hundred bucks for a couple of videotapes to come to you in the mail if you wanted to do something like that. So I think, as time permits, learn a little bit of basic accounting, spreadsheet-type stuff, not Excel mechanics, but how does senior management think when they're working in a spreadsheet. And with everything online now, you can find courses like this. I think part of it is that you have to be a good presenter. You have to do all those things. But part of it is understanding just a little bit of how all these VPs and C-levels think. And when I was starting out, it really had to be trial and error and the school of hard knocks.

[00:59:41] But I think now you actually could. Just a week ago, I was exploring a particular issue, the natural tension between sales and marketing, where sales has short-term goals that they have to meet and marketing has long-term goals that they have to meet.

[00:59:58] So I was thinking, well, I wonder if there's anything on LinkedIn Learning on that. And sure enough, there was a forty-five-minute course on how to get sales and marketing to work together.

[01:00:07] So if I were a young data scientist all over again, one of the things I think I would be doing is trying to give myself the equivalent of a little bit of MBA-type knowledge here and there, not because I need it directly in the project that often, but because I need to communicate my data science models and their purpose to people whose lens is MBA stuff.

[01:00:32] Learn to build, learn to sell. If you can do both, you'll be unstoppable, right? Yeah. So you've got a course that just came out focused on the most important part of the CRISP-DM lifecycle, data understanding. Talk to us about this.

[01:00:52] Yeah, I'm really proud of this one. One of the things that I'm proud of is that, you know, they let me do it, because think about it as a potential author. LinkedIn Learning has an interesting model because it's heavily curated, so typically they reach out to people that are already established in one way or another. In my case, it was the books that I had written. But this is not the kind of course that would typically be a first course that an author would do, because it's somewhat going out on a limb to do four hours on this topic. But this is why I was so passionate, and I'm thrilled that they let me do it: we were talking earlier about how what most people do is they jump to kind of that insight step. They explore the data trying to get these interesting insights, and then the project stops.

[01:01:38] Well, if you think about CRISP-DM, you know, you've got business understanding, then data understanding, data prep, and then modeling. So in my experience, by the time you get to modeling, you're about eighty-five percent of the way done.
So what I cover in the course isn't just exploring data, it's how to do a proper assessment with a very specific goal in mind, which is: what is going on in the data that I need to correct during data prep in order to build the best possible model? So if it's true, and I certainly believe it is, that data prep is more than half of any of these projects, then data understanding is critical, because it's how you plan the data prep. And data prep is not just cleaning; it involves all kinds of other things like feature engineering and so on. So that's why I'm so passionate about this one, and I'm thrilled that they let me do it. You know, I could have also done four hours on data prep, I'm sure, and that's also a super important phase.

[01:02:32] But this is the ignored phase. No one talks about it. They probably just think it's the same as data visualization, which it's not. It's very specific things to kick the tires, so to speak, so that when you get to the modeling phase, you're ready to go.

[01:02:47] I'm really excited to get through this course. I'm looking forward to it and will probably spend some time early next week getting through it. Talk to us about the role that skepticism plays in this part of the lifecycle.

[01:03:00] Well, boy, that's an important one, too. I think that's where the stats training comes in. You know, I was mentioning earlier that one of my favorite questions that I would ask an applicant if I was interviewing was, you know, tell me about a time that the data seemed to tell you one thing, but it turned out to be something else. That's really what the skepticism skill is. And where it comes from is thinking about what is the journey that the data took to get to me, you know, and is there anything that could have happened to distort it? Right.

[01:03:32] But people assume that if there was no explicit source of bias, if there was no mistake that was made, that therefore we can trust the data completely and we have nothing to worry about. And it's not true, because we still have sampling variation.

[01:03:52] People think that in machine learning we don't have to worry about that because you have, quote, all the data. You never have all the data, because, you know, even if I have all the data in the data warehouse, the data that's in there has gone through a journey. And I certainly don't have the twenty twenty-one data as we speak here in twenty twenty. So I still have to understand basic concepts like confidence intervals and so on.

[01:04:14] So I have to recognize that the data may seem to indicate one thing when it's not true. You know, we're in political season, so the classic example is that, you know, I've always been a fan of FiveThirtyEight, the website. Actually, Nate Silver's book was over here; somebody was just asking me about it. But Nate Silver, who runs FiveThirtyEight, doesn't do polls himself. He consolidates all the polls. Now, the reason I'm bringing this up is that during election season, the newspapers will report whatever poll came out last night, and they'll fluctuate, so you can get dizzy with, wow, it's leaning one way or leaning the other way.

[01:04:53] But when you consolidate all those things and you do meta-analysis and a bootstrap and the fancy stuff that they do on that website, you get a more levelheaded sense of where the polls are going. Each poll is just one data point, one measurement.
That's the kind of thing I mean about skepticism: don't let yourself get whiplash with the little changes from day to day. Your business is not suddenly succeeding or failing every hour. A lot of it is natural variation in the data, and people with stats training understand that.

[01:05:26] Absolutely love that. That is definitely turning into a quote graphic. So, the last formal question before we jump into a quick random round: it's one hundred years in the future. Keith, what do you want to be remembered for?

[01:05:40] Well, you know, even though I'm proud of the work that I did on it, I have a feeling that I'm not going to be remembered in one hundred years for SPSS Statistics for Dummies, Fourth Edition, because the life cycle of a book like that is probably more like three to four years before the next edition, right? So one thing that I've wanted to do more of, and that this crazy year that we're having has really caused me to reflect on, is more the analytics management side and more the philosophical side of the business, which you can probably tell that I enjoy. I was a philosophy minor, actually; I somehow managed to squeeze in a philosophy minor with a computer science degree. And if I write a book in that area, either analytics management or philosophy, I think that might at least give me a chance of being remembered twenty or thirty years from now. Fingers crossed that it could last anything like one hundred. Yeah.

[01:06:39] What's your philosophy of life? Do you have a particular school you subscribe to, or one that resonates with you?

[01:06:47] Well, of course, when folks think of philosophy, they often think of that side of philosophy. I was always more into epistemology, which is the study of knowledge. So I get super fascinated with, like, Turing. There are probably, like, two people that hear this that will also be fans of this guy, Wittgenstein. I don't know if you've ever heard of Wittgenstein; he's probably a bit obscure, but anyway, I'm, like, a huge fan of him. But on the more kind of philosophy-of-life type stuff, I'm a Jung fan. I'm kind of into the whole book thing, and I've got Jung's collected works downstairs. That all started because, well, a lot of people, I'm sure, have heard of the MBTI, and Isabel Myers, who wrote it, had a collaborator late in life, like the last, I guess, twenty years of her life or so. And her collaborator was someone that I had a chance to meet when I was still a freshman in college, and I knew her quite well for the last several years of her life. So for me, it isn't like a paper-and-pencil thing. It's the whole Jungian thing. For about ten years, I did a lot of research on that and on the instrument, but that's a whole other story. For the more philosophy-of-life stuff, though, I'm a big Jung fan.

[01:08:11] Awesome. I think you should connect with Giuseppe Bhandarkar, who's also a data scientist. He's written a few books for Impact, and he's also heavy into the philosophy. I think you two together on a philosophy book would be an awesome remix.

[01:08:25] I'll have to check that out. Yeah, I've enjoyed the sessions, the episodes that I've heard, but I haven't heard that one yet.

[01:08:33] That one's releasing next week, I believe. But I'll introduce you two on LinkedIn as well. So let's jump into a quick random round here. What do you believe that other people think is crazy?
[01:08:45] Well, you know, we've talked about it, but the answer that comes to mind is that, you know, the goal of projects is not actionable insights, because I'll get silence when I say that at, like, a workshop, until I'm able to explain myself. It's like everyone's shocked. If you could have a billboard placed anywhere in the world, what would you put on it? Oh, I was reflecting on this kind of thing. Where would I put it?

[01:09:09] It was a Jung quote, actually. We were just talking about Jung. How did Jung himself phrase it? Something like: the best thing that can happen to you in life is to be yourself. Something very close to that.

[01:09:22] I love that. What are you currently reading?

[01:09:25] Oh, I'm the kind of person who reads like a dozen things all at the same time. But there's two books that I've been somewhat in the grip of this year. One is AI Superpowers by Kai-Fu Lee. It was written a couple of years ago, but I think it's really powerful because he kind of explains the philosophical difference between how Silicon Valley handles A.I. and how the companies he works with in China treat it. I find that to be really powerful. Then the other book is a little bit more technical, but I'm really in the grip of it too: The Book of Why by Judea Pearl, which is about causal inference but written for a general audience, and really, really something. I've heard great things about that book; it's definitely on my list of books to pick up. For the next few questions, [01:10:10] we're going to go to the random question generator. All right. What's one of your favorite smells?

[01:10:19] Oh, I get this from my best friend; he turned me on to this, and it's petrichor, the smell after a rain, a rainstorm. If he hears this, he'll get a kick out of that.

[01:10:33] Oh, OK. I didn't even know that had a name. Awesome. What fictional place would you most like to go to?

[01:10:42] You know, the answer that comes to mind is, I kind of got into Game of Thrones there for a while. And although they filmed it in a real place, the pictures of the capital city are just amazing. So the place I'm hoping to go to, it's in Croatia, isn't it, where they filmed it?

[01:11:00] Yeah, they filmed in a few places in Croatia. Dubrovnik is where King's Landing was, and for the later seasons they also filmed in Split as well. Yeah.

[01:11:12] Yeah. So I'm probably cheating, but my answer is Dubrovnik. And I've been to over 50 countries, but I have not been there.

[01:11:20] Oh, Croatia is absolutely beautiful. We did a road trip along the coast, starting up top in a city called Pula, and we drove from Pula all the way down to Dubrovnik.

[01:11:31] I'm probably cheating a second time here, but I'd also love to see the Hobbit town. Oh, have you seen that? Apparently they preserved it and didn't tear it down. And I have been to New Zealand. That's fabulous.

[01:11:42] When was the last time you changed your opinion about something major?

[01:11:48] When I was college age, I was a Reagan fan, so I was a bit, you know, a bit conservative. Was that because I was military? Who knows? Right. But since then, I am not.

[01:12:03] Last question here from the random question generator. What was your best birthday?

[01:12:08] Oh, I've got kind of a fun answer for this one. When I was approaching fifty and wanted to do something fun,
[01:12:15] I arranged for a rental of, like, four cottages in Italy and rented out all of them so that a bunch of friends could go. But I couldn't get them on my actual birthday, so it turned out to be like a year before. Long story short, for like a year and a half I had a series of birthdays, because I was celebrating with my friends in very different places all over the world, whenever it was convenient to hang out with them. So it never ended up being one day. It ended up being like a year of birthday celebrations, which was a lot of fun and absolutely awesome.

[01:12:49] So how can people connect with you and where can they find you online?

[01:12:52] Well, LinkedIn is absolutely the best way, because, in particular since I've started doing the courses, I'm on there almost every day. So that's a fabulous way. If they'll just simply follow me on LinkedIn, that's an absolutely great way to stay in touch. And if folks choose to check out the courses, and I hope they do, LinkedIn has added this Q&A feature. What's fabulous about that is, if somebody asks a question about one of the courses, everybody can see it and it becomes a conversation. So that's a great way to be in touch as well.

[01:13:25] Thank you so much for taking time out of your schedule to be on the show today. I really appreciate you being here. Thank you. I enjoyed it very much.