HH83-27-05-22.mp3

Harpreet: [00:00:09] My friends, so happy to be here. Shout out to everybody that's joining us. What's going on, Russell, Jay? I actually got a chance to meet Jay earlier this week out here in Denver. Jay, thanks for hanging out. If you're free later, you can be up and we've got toast up. Tom, what's going on? Yo, what's up? Hanging out with Joe. Everyone: Ken, Ken Jee is taking over as the host for me today. So, Ken, thank you so much for doing so. Quick shout out to sponsors: for today's episode we've got the MLOps World conference, machine learning in production, taking care of the show and the community. Thank you guys so much. Quick shout out to y'all. Look, I'm here just for a second. Just want to say hi to everyone.

Ken: [00:00:57] Joe messaged me. Hey, good to see everybody.

Harpreet: [00:01:03] Awesome. Well, Ken, good to see you again. Ken, this one's all you, man. Thank you so much, man. I really appreciate you taking over as host. And I'm out of here. Yo.

Speaker3: [00:01:09] Yes. All right. See you. I'll see you again.

Ken: [00:01:12] All right. Thank you all. Awesome. Well, everyone, welcome to this happy hour. I'm so honored that Harpreet would let me take it over. I'm surprised, honestly; it's a bit of a liability letting me run one of these things. But at the end of the day, I think we'll have a lot of fun here. I'd like to start this off with some light topics. Of course, we're always taking questions on LinkedIn as well as YouTube. Harpreet usually does this part; I will plug my own content first, and then I'd love to dive into a little more in-depth conversation around that. So: my podcast series, Ken's Nearest Neighbors, had its 100th episode this week, which was a pretty wild occurrence. And I took the opportunity [00:02:00] to interview my dad. He's in his seventies. He's an oral surgeon; he knows nothing about what data science really is, or about our careers. And I'm wondering how you explain your career to, or how you interact with, your family members or people important to you when they don't quite understand it. What are their impressions? What do they think you do? Do they appreciate it? Do they not understand it and think you should become a doctor? I might have had some of that in my past. Any thoughts there? I'd love to start with you, Tom.

Tom: [00:02:36] Oh, thanks, Ken, because I had a funny answer. My wife actually helped me with this one. She said: well, you remember how on Friends everybody was wondering what Chandler did? That's what I tell people when you're not around. Oh, you remember Chandler on Friends? Like Tom, it's hard to explain what he does. But I found out later Chandler is actually a data scientist. To put it in the most important terms, though, the way I like to say it is: well, money's an asset, right? Gold is an asset. Real estate, land. Do you think data has become perhaps the most important asset in the modern age? And then there's a little bit of a light there when you put it that way. Now imagine being able to visualize that better, tell stories with it, even make predictions with it. Could you see how huge a value that would add? Yeah, that's what I do.

Ken: [00:03:40] That's amazing. I think that looking at data in more common terms, whether it's a commodity, an asset, or whatever it might be, there's value in the similarities, there's value in the storytelling there.
And of many storytellers, you stand pretty close to the top, Tom. So I would imagine [00:04:00] that messaging is very compelling for everyone around you. What about you, Vin?

Speaker3: [00:04:09] Depends how much I like the person. That's genuinely it. Sometimes I'll say: yeah, I do applied machine learning research, and just stop, expecting them to know what I just said. And it's amazing; you get people that react to that very badly. And for me, I did data science in 2012; I didn't even know what it was called. So my answer to this is kind of interesting, because when I first started out, I didn't have a title. I was just a consultant; literally, that's where I was, and I was classified as a consultant. And when my parents ask me what I do now... throughout my career they've asked: so what are you doing now? What are you doing now? Now they don't. My dad just says: yeah, he has his own business, he's a consultant. And there's no other explanation anymore. But when I actually have to explain it to somebody, I just say: you know, I work in technology, I work with data. If you've heard of A.I., I do the dumb version of that. And that usually gets everybody laughing and kind of gets them to understand what it is that I work on.

Ken: [00:05:19] I think consultant is the lowest hanging fruit. Everyone knows what that is, and nobody wants to ask any more questions about it, so it works pretty well. Gina, you had a kind of funny...

Speaker3: [00:05:31] I'm sorry, I've got to tell you, I've got to throw my dad's zinger in. He got asked once, while I was there, what I did. And he said: you know, I bought him an Atari when he was very, very young, and I think he took it a little too far. So that is my dad's answer to what I do for a living.

Ken: [00:05:52] Well, honestly, that's better than mine. My dad tells everyone I got my master's in IT, not computer science, so all of his friends [00:06:00] call me to try and fix their computers. It's pretty brutal. He knows what I do; he just doesn't understand the difference between them, that small nuance there. But Gina, I'd love to hear your story about your PhD friend here.

Speaker5: [00:06:20] Oh, yeah, just real quick. So I was at a management consulting interview event, and some of us were just talking before our interviews. And this guy, he has a PhD from some prestigious university or other, and we were talking: oh, what do you do? It was probably econometrics or economics, something like that. And he goes: yeah, I just tell my grandmother that I'm a lawyer. I just thought that was so funny. I told my grandmother I'm a lawyer, which is pretty much nothing like what he does. I don't know, he could have said mathematician, but.

Ken: [00:07:03] Sometimes... I've probably explained to my parents what I do 50 times, and one day I decided to just make a video. And every time they asked, or every time they didn't get it, I would just keep sending it to them. It still hasn't worked, but at least it was a more scalable approach. I do want to call out: I see Mikiko is in the building. She just today released her first YouTube video. Someone feel free to link that in the chat. Definitely give it a look. It was extremely high quality, especially for a first video.
Way better than probably even my 50th video. So definitely give that a look. Russell, do you have any thoughts on this question?

Speaker3: [00:07:45] Yeah, I'm sorry it took me a little while to get off mute. So I actually spend probably half of my professional and personal life explaining to people what I do and how I work with data. And to make it easy, [00:08:00] I try to come up with a metaphor or some kind of analogy for it. The one I use most often is: imagine data is types of food that you would make a recipe with. I'm kind of the food source, the supplier, the stockist, the chef, and the waiter, all at once; I pull all of these things together and push it out to other people. And that tends to be appreciated fairly well by everyone from family members to work colleagues that aren't immediately familiar with data.

Ken: [00:08:40] I personally love that metaphor. I actually made a video, which I've linked in the chat, data science explained with cooking. It didn't really get a lot of traction, but I love the metaphor; it's just such a perfect fit, with data preprocessing and everything along that pipeline. Anyone else have anything they want to add to this thought? Or we can move on to a question from the group. Let it rip.

Ken: [00:09:06] Yeah, for me it's really easy. I just tell people: hey, I teach robots and machines to see, right? And most people seem to get that without needing to ask about the details. Plus, it's kind of cool to say. So let's go with that.

Ken: [00:09:20] Very cool.

Ken: [00:09:21] I think Gina had a question as well, though.

Ken: [00:09:24] We'll do yours and then Gina's. I did not see; okay, I believe you asked first. Why don't we go with yours, and then Gina will be next.

Ken: [00:09:32] Cool, cool. Okay, so question of the day: data. How do you guys do testing in the context of data? I'm talking more: hey, I've set up data pipelines to get a certain amount of data in from regular sources. But how do you analyze that data along the way? How do you make sure the quality continues to remain the [00:10:00] same? How do you ensure that the processes you're running on it, and your initial transformation and loading steps, don't change in a way that's going to break previous functionality? When you have to test with data in context, I'm just curious how people are going about it. We're doing a few things on our end at work, but it's always good to get other people's picture on that.

Ken: [00:10:31] Amazing. Eric, do you want to take a stab at this one?

Speaker6: [00:10:37] I was only half listening because I'm looking up doge memes, so I'm going to have to pass on that.

Ken: [00:10:43] All right. Well, in that case, let's go to Vin. You're my go-to for everything along the pipeline.

Speaker3: [00:10:50] You know, it's funny: data pipeline stuff is all automated for me, so I have no idea. And I'm not giving a sarcastic answer; I've had really smart people just automate everything. The last time I had to do anything manual with a pipeline, the actual architecture and the stuff that's happening under the covers, was a really long time ago. I mean, the only thing I do anymore on the data engineering side is building out ontologies and doing the curation on that end of it.
So my answer really is: automate everything, and if you can, buy something. There's a ton of tools out there right now. You end up building a stack, but it's so much easier and cleaner. Just find things that meet your needs and that require the least amount of effort. Because for everyone I talk to now who does this, this is all they do. It's craziness.

Ken: [00:11:46] You know, unfortunately, our main data engineering folks aren't necessarily here today. We miss Joe Reis. At the beginning, we should have just dragged him in here.

Ken: [00:11:59] I was hoping for [00:12:00] the same thing earlier on today. I mean, you're right, Vin: I do a lot of the automation of the pipeline and stuff now, because we don't really have anyone to do a lot of that. So we're doing a lot of automation up front so that we can get to being able to experiment quickly and things like that. We're figuring out how to automate and test that at the same time; that's kind of where the question comes from.

Speaker3: [00:12:22] Well, yeah, on tools, that's what I would ask you: do you have budget to buy some tools?

Ken: [00:12:30] We've got reasonable budget to buy tools. I mean, we tend to prefer off-the-shelf tooling where possible. We prefer buy over build in most cases, as long as it's not exorbitantly expensive.

Speaker3: [00:12:44] Then what I would do is bring somebody in who's an expert on that end of it. You can go to Amazon, and they'll basically give you a consultant, especially if you're a small business or a startup, if you commit to using their stack. And I think GCP might be starting a program like that, but they're kind of not as interested in their customers as Microsoft and Amazon are. If you go with one of those two, they'll typically end up handing you a consultant at really low cost, and they'll help you. Basically, they're going to make you use whatever stack they have, but they'll build out a lot of the underlying pieces, they'll give you the high-level architecture, and they'll make it as easy as humanly possible. So that's really what I'd do, because the amount of knowledge required to build one of these end to end is substantial. And like I said, I'm kind of privileged, because I have really, really smarter-than-me people who help me out with that. But if I was building a startup from ground zero, I'd probably just head up to AWS, knock on Greg's door for a few minutes, and see if he'll kick you down some smart people. I mean, he's got that product manager cred; you might as well use it for something.

Ken: [00:13:59] Amazing. [00:14:00] Mikiko, do you have any thoughts on that one? I feel like you're probably the closest to the pipelining that we have, in lieu of a Joe Reis or some of our more traditional data engineers.

Speaker5: [00:14:16] Yeah. I mean, I think it kind of depends, right? So for enterprise companies, if they're using a popular cloud provider, there are kind of two approaches.
They can make things easy on themselves and go for an all-you-can-eat approach, which makes sense, because if you're a big enough company, where you're supporting such a large team anyway, then instead of trying to get all of them to optimize their pipelines or models or queries, which no one really loves doing, you basically say: hey, why don't we just do a blanket, all-you-can-eat deal, and then you don't have to worry about that kind of stuff. But that still doesn't cover the actual building and the maintenance phase, because a pipeline that's broken is a pipeline that's not useful at the end of the day, regardless of how heavy or lightweight it is. If you're a smaller company, it gets a little bit tricky. I'm also a fan of leaning on the cloud vendors, or even on the companies behind the tooling. I think the popular direction right now, for the open source libraries, especially around data processing and pipelining, is that most companies are going: let's create an open source version, and then let's try to monetize via a managed version. So that can also be an option. Even if the company is not GCP or AWS, they might still be willing to work with certain companies or teams, because they're thinking: okay, this could lead to a really popular case study that we can then showcase on our website.

Speaker5: [00:15:55] So I do feel like it's about understanding what the levers of negotiation are to get [00:16:00] that kind of support. Because at the end of the day, startups and startup products want people using their stuff, and they also understand that if they just put it out in the open, people are going to have questions, like: if you're an open source library, are you still going to be around in two or three years? So leaning on that, and getting that support from them, is really important as well. And ultimately, if people are really serious about investing in that data pipeline, they need to hire the people and they need to make it important. Having data scientists do 50% pipelining and data engineering work and 50% data science work ultimately leads to frankly ineffectual data engineers, or ineffectual data scientists, or people that are just really stressed out, and then they become ineffectual. Either way, you get ineffectual value for the money if you're putting all that stress on them. But yeah, as much as possible, leaning on the resources that are around, and finding sometimes very unique ways to negotiate that kind of relationship, is also a huge, huge advantage.
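[Editor's note: a minimal sketch of the kind of automated quality gate the group is describing, assuming each batch lands in a pandas DataFrame; the column names, dtypes, and thresholds are hypothetical placeholders.]

```python
# Hypothetical post-load quality gate for a pipeline batch (pandas).
import pandas as pd

EXPECTED_DTYPES = {"user_id": "int64", "amount": "float64"}  # assumed schema

def validate_batch(df: pd.DataFrame, min_rows: int = 1000) -> list:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if len(df) < min_rows:  # a volume drop often signals a broken upstream source
        failures.append(f"row count {len(df)} below minimum {min_rows}")
    for col, dtype in EXPECTED_DTYPES.items():  # schema-drift check
        if col not in df.columns:
            failures.append(f"missing column: {col}")
        elif str(df[col].dtype) != dtype:
            failures.append(f"{col} is {df[col].dtype}, expected {dtype}")
    null_rate = df.isna().mean()  # creeping null rates break things silently
    failures += [f"{c} is over 5% null" for c in null_rate[null_rate > 0.05].index]
    return failures

# In an orchestrator, fail loudly instead of passing bad data downstream:
# failures = validate_batch(load_batch())
# if failures:
#     raise ValueError("; ".join(failures))
```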
Ken: [00:17:14] Amazing. Russell, did you have any thoughts here? Also, I saw a LinkedIn post; you were on some form of team. Do you want to give some backstory, and maybe provide a name here?

Speaker6: [00:17:24] You're not going to fake your way out of this one. Get online. We want to know.

Speaker3: [00:17:27] Yeah, okay. I'm cool, I'm caught up. Can you hear me?

Ken: [00:17:30] Yes, we can hear you.

Speaker3: [00:17:31] Yeah, I think, you know, when the pressure gets too hard for me, my laptop senses it and just slows down to give me a little more time to prepare. So I was at a two-day hackathon in the U.K. here in London, a project-delivery-centric [00:18:00] data hackathon type of event, where we used machine learning and data analytics to approach problems that are either common or unique within the project delivery industry. So that's where I was. I put a post up, fairly irreverently, saying: we came up with a silly name for our team this time, can you guess what it is? And it got a little more traction than I expected. And I think I've really tortured Tom, certainly, and Eric, and perhaps a couple of others here. If you twist my arm gently, I will reveal it. I was trying to let him know that he's kind of got the name, all but putting the words in the right order. So give me a nod now and tell me if you want me to say it right here, and I will. But spoiler: it's probably going to be a real anticlimax, because it's not the best of names. It was just odd, and it had nothing to do with data.

Ken: [00:18:53] I'm asking you to reveal it more for Eric's sake than mine, who's struggling.

Speaker3: [00:18:58] Even more than me. Sorry for the vacuum. I understand, and I appreciate you looking out for your fellow peers there, Tom. Okay. So there were four of us. Three of us had facial hair; one of us didn't. And the one without facial hair was named Johnny. So I think I'm really telegraphing this already, but the official name was Johnny and the Three Beards. As I say, it wasn't that special. I just thought it was a little bit of irreverent fun to have on LinkedIn, and it really got a lot more traction than I expected.

Ken: [00:19:40] I thought you were going to go with Dummy Variables. I was kidding.

Speaker3: [00:19:45] There were some really good alternatives, actually, that came from people. I want to stick those in a little black book and take them with me to the next events, and I'm just going to choose from them. And I particularly liked the one that you mentioned, Tom: More Better. So More Better [00:20:00] Data. I think that's the next one for me.

Ken: [00:20:05] I love it. Well, let's move on to another question. I think, Gina, you had two of them, correct? You want to hit us with some knowledge here?

Speaker5: [00:20:15] Well, hitting you with questions and seeking knowledge. So the first has to do with... well, I have a technical question and more of a career question, but let me start with the career-related one. So I'm career pivoting, for those who don't know, into data science. And I recently posted my resume to a tech jobs board site, so not the biggest of job boards, but a tech job board. And I've been getting inundated by recruiters contacting me. I put it up there deciding, you know, let's just see what happens. And I'm scratching my head a little bit. I did do a little Google search, and I can't find anything that really explains what's going on here with these tech recruiters. Yes, I'm sure some of them are scams, and they're just trying to get your personal information and so on. But are any of them legit? Are any of these jobs legit? Or should I just take my resume down right now and forget about that stuff? So I guess the larger question is: how do tech recruiters operate? Obviously, there are some very high-level ones that are sourcing the most experienced or most knowledgeable talent.
And then there are probably some that are just scrambling to try to find people who will hopefully look good in front of their hiring manager. And then there are these other ones where, I guess, they're seeking contract positions. So it's really unclear to me, and some of them just [00:22:00] seem really unprofessional, or the jobs don't really match anything. I mean, they're not close enough to what I have on my LinkedIn and on my resume, so it kind of doesn't make sense. So that would be my first question, and I figured this group would have some really good knowledge about what's happening with this.

Ken: [00:22:22] Go ahead, Mikiko.

Speaker5: [00:22:24] Yeah. So it might be helpful to understand, what's that phrase, recruiters are not a monolith. It's probably helpful to understand the different personas. And this is similar to, for example, real estate agents. And I know this because I work with a real estate company, not because I actually own a home in the Bay Area; that's just a pipe dream. So in real estate, you have a buy-side and a sell-side agent, and you also have agents who work both sides of the deal. A sell-side agent is responsible for listing the house, or, in this case, the job. They make the job look as good as possible, they do the writing of the description and all that; their goal is to source as many leads as possible. Buy-side agents typically represent the candidates. So the sell side represents the company, or the house; the buy side typically represents the job candidate. And you can have people who are, quote unquote, in-house, freelance, or part of an agency; those are the three ways recruiters can operate. And within recruitment, if you think of it as a pipeline, kind of like the marketing or sales pipeline, you have some recruiters, and a lot of times these are either younger, more junior people, or people who are just really good at, quote unquote, sourcing, whose job is to [00:24:00] hit up as many people as possible that could fit the parameters.

Speaker5: [00:24:03] And for them, it really is a numbers game. Eventually, if they do move on past that, they'll then be responsible for setting up the interview. And the payment is also different. If someone is a recruiter that's in-house for a company, so, for example, an Intuit recruiter, for my company, right, they're in house, they get a base salary, and sometimes they might get a placement rate, but for them it's a job, and they're more focused on getting the right candidate into the team for the role. Maybe they have a quota, but they're very much interested in getting a candidate who meets what the team is looking for. And they're usually directly associated with the team. Then there are recruiters who are what they call full cycle, or, there's another term for it, who are freelance, for example. A lot of times these are folks that are headhunting for very senior roles, like executive roles. They have very specialized experience in an area.
So if they're hiring for tech CEOs or VPs, basically they've already gone through the gauntlet of working as a recruiter for a company, but they're so good at their job that they're hired for those high-value, expensive roles.

Speaker5: [00:25:30] A lot of times they are very targeted; they're not doing the spray-and-pray keyword approach. What they're doing is looking at the candidate's resume, at their LinkedIn, really thinking it through, and then crafting a message that directly targets your LinkedIn profile. And then you have people who work for a recruitment agency and are incentivized on numbers. So [00:26:00] basically, if they're only sourcing, it's how many resumes they can get people to say yes to; if they're being paid on placement, it's literally how many people they can fit into a role. So you have all these different types of recruiters operating out there on LinkedIn, and it can be very, very confusing. The recruiters that are targeted, they're looking for higher-value candidates, or maybe they're just better at their jobs, will a lot of times come off as more professional, because they are more professional. But sometimes you also get junior recruiters whose only real incentive is to meet their numbers. So part of it lines up with how they're being incentivized, and you'll get kind of a mixed bag. Even nowadays, you know, it's been like five years since I've done Salesforce admin stuff, and I still get recruiters going: oh, do you want to be a Salesforce admin for an email marketing team? And I'm like: yeah, no, I'm good.

Speaker5: [00:27:04] If they just took a look at my LinkedIn profile, they would know that that was the Mikiko alpha release, not even beta. We're past that; we're on, like, Mikiko 2.0 or whatever. But they're just doing spray and pray; that's their thing, because that's how they get incentivized. Other recruiters will be a little bit more targeted, probably because they're not being paid on the number of resumes sourced; they're being paid on who they can place onto a team. So that's how it lines up, but you'll definitely get a mixed bag, for sure. Yeah, I get a lot on Salesforce admin, email marketing, growth hacking, you name it, or finance. And I'm like: if you just took one little look at my LinkedIn, where I say I am specifically open to these roles, you [00:28:00] would probably see that. But I would say: keep your resume out there, because you just never know. Two or three of my roles came from recruiters who found my resume; it just happened to land at the right point in time, where my resume matched what I was looking for. So never cut off options for yourself in terms of how recruiters find you or how you get roles, you know?

Ken: [00:28:28] Thank you, Mikiko. I think there are some heuristics you can take out of that. These are ones that I used, and I got most of my positions through recruiters. The first thing I look at, as Mikiko just mentioned, is whether that recruiter is directly from the company or from an agency.
I usually think the ones from the companies are more serious. If they are from an agency, you can look at the agency's website and see whether they're legitimate or not. I find that if you actually work directly with an agency and they know what you're looking for, they will source you the specific opportunities that you're interested in, because a lot of them, as Mikiko said, get paid based on whether you actually land the job or not. So you can sort of outsource a lot of the searching to the agencies, and it's a win-win situation there. You just have to find a couple of agencies that you genuinely like, and that's pretty important. Go ahead.

Speaker5: [00:29:29] Yeah. So the website that I posted on is Dice, which I'm sure many of you guys know.

Ken: [00:29:36] Desktop video from them. Yeah.

Speaker5: [00:29:38] Oh, really? Oh.

Ken: [00:29:39] Vin's out on Dice.

Speaker5: [00:29:40] Yeah, yeah. So it's been a while since I looked at their website, and it just seems like a lot. I haven't looked at most of these; I haven't even responded. And there were a few... I initially responded to one that seemed good, but then they said: send me your resume. My question was: is there any possibility of it being remote? They said: send your resume. I'm like: no, something about this seems fishy. And then another, I think that same person, might have asked me for all kinds of info: we have a bunch of jobs, we're filling this and this and this, send us your info and references. It's like: I'm not sending you my references. I'm not sending references until we're talking about a job offer. Why would I even contemplate you contacting my references randomly out of the blue? Hell no. So yeah, that's the thing. I see Vin reacted to that, and I would really appreciate hearing from him. But thank you, Ken and Mikiko, because it seems like maybe there are agencies, and I don't know what they are, that might be able to hook you up with reasonable jobs, especially if you're kind of just starting out. My situation is a little more complicated, because I do bring a lot of experience in project management and program management that I think could be really useful. But being a data science manager or a senior data scientist might, depending on the company, just be a bridge too far given my current experience level. So yeah, Vin, I'd be really curious to hear, since you were very emphatic in your response.

Ken: [00:31:36] The floor's yours.

Speaker3: [00:31:38] Yeah. Dice used to be great, you know, back when I was Eric's age, but that's been a while, and it's not great anymore. So what I would say is: avoid the bigger job boards, like Monster, Dice, ZipRecruiter; you know, I can keep going. They're not bad, Mikiko. [00:32:00] So nice. The money's in the mail, thank you. So, yeah, I would avoid the major job boards, because they're not great for our types of roles. And once you get past about five or ten years of experience, most of the job boards are just not for your type of position. So that's going to be one thing. What Mikiko went through was 100% everything you need to know about recruiters.
And the best ones... really, the best time to work with recruiters is very early on in your career, or when you're working with what Mikiko was calling the headhunters: very targeted recruiters who know exactly what they're doing, usually executive-level recruiters. And you're close to that stage. I mean, you might want to reach out to one or two of them and see if there's somebody within your niche. The best way to find them: they are all over LinkedIn. If you look for posts on recruiting around project manager positions and data science positions, you'll find the best ones. LinkedIn is recruiter paradise.

Speaker3: [00:33:11] And so if you're looking for a good one to work with, that would be a good way to network in and say: hey, this is my background, I'm looking. Because they're going to be the ones who, from there, will take the relationship forward and help you really find the type of role that you're the best fit for. And like I said, you have a level of seniority where you might end up really benefiting from that. It's kind of 50/50, but I'm pretty sure you'll get some value from working with them. But job boards, I would avoid. And in your position, I mean, I know your background, I would start networking with the C-suite at SMEs. Just start on LinkedIn, send maybe four or five invites a week with [00:34:00] targeted messages, and just say: hey, I want to follow you, I want to understand your needs around data or product or project management, and start conversations with people. And what I tell a lot of people is: have a number of conversations you want to have every week. Say you want to have two conversations a week, or five conversations a week. It's almost like a sales cycle at this point, where you want to sit down and really niche into exactly who you want to talk to: think about company size, think about seniority. You need to be talking to C-suiters if you're targeting small and midsize businesses; if you're going for a huge business, VP level is just fine, but small and midsize is going to be your sweet spot. Talk to the C-suite, put together a plan: this is how many meetings I'm going to have this week. Then figure out how many times you have to reach out to people in order to get the connection responses, in order to get the meetings. And what you're going to find, if you set it up to have four meetings a week, or six, or two, whatever you have time for, is that you get into a sales cycle. They'll say: hey, I've got this role for you. And you'll learn the cadence: this many connections equals this many meetings equals this many jobs that I can come in and apply for. You'll have the cycle figured out in less than a month. It really is a sales funnel. And from there, you'll end up employed, probably in about two to three months, in a really good, high-quality job.
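[Editor's note: the cadence Vin describes is literally funnel arithmetic: work backwards from the conversations you want to the invites you send. A toy version, with placeholder conversion rates you would replace with your own observed numbers:]

```python
# Toy outreach funnel for the networking cadence described above.
# Both rates are assumptions; the point is to measure your own within a month.
target_meetings_per_week = 2.0
accept_rate = 0.60   # targeted invites that become connections (placeholder)
meeting_rate = 0.70  # connections that agree to a conversation (placeholder)

connections_needed = target_meetings_per_week / meeting_rate
invites_needed = connections_needed / accept_rate
print(f"Send ~{invites_needed:.0f} targeted invites/week "
      f"for {target_meetings_per_week:.0f} conversations/week")  # ~5 here
```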
Speaker5: [00:35:35] Oh, bless your heart. Thank you so much. That was extremely helpful, because I think one of the biggest issues in job searching generally is prioritizing your time and figuring out who you should be talking to and how much effort you should be putting in. I mean, I think most of us by now know that applying for stuff, even through LinkedIn or even on company sites, is [00:36:00] a real crapshoot. So networking is the name of the game, but even there it's hard to know exactly who to target. So I really appreciate that so much.

Tom: [00:36:14] I just want to give a broader thought.

Ken: [00:36:17] Okay. Tom, just real quick.

Tom: [00:36:17] I think Vin's advice is spot on. I made the mistake of waiting for good recruiters to discover me and come talk to me. And after seeing how good a good recruiter can be, it dawned on me: oh, I could have asked around about really good recruiting groups and given my stuff to a lot of them. You still get ghosted, even by good recruiters, but it's just a better crapshoot, I think.

Ken: [00:36:56] Thank you, Tom. I'm interested, from the group's perspective: how do you feel about inbound recruiters on LinkedIn? Do you find that more of them are legitimate, or is it still sort of a crapshoot? My thought is, if you're in Gina's position, you're talking to a lot of people. If I'm getting roles and I'm not looking right now, I'm more than happy to forward them on to whoever's interested. And if I know my friends are interested, I basically say: hey, look, talk to this person. Maybe they'll respond, maybe they won't, but you're making someone else's day better. And if I do a recruiter a lot of favors, maybe they'll help me out in the longer term. So the broader question: do we think that most of the stuff coming in on LinkedIn is more legitimate than on pretty much every other platform? Eric, do you have any thoughts on that?

Speaker6: [00:37:49] I would say, most recently, recruiter messages on LinkedIn have been okay. But [00:38:00] more than that, LinkedIn has been better for me for getting messages from somebody who's actually hiring, like a hiring manager, rather than just a recruiter who found my resume or whatever. Actually, probably the best recruiter message I got recently came to my email; they probably just got it off my LinkedIn profile, because it's there. And it was good: they clearly looked at my profile, and they basically told me they wanted to talk with me about a job that's pretty much the exact same job as I have right now. So I was like: no, I don't really want to redo the whole learning curve. But the point was that they had put the work in, as opposed to the other emails or messages I get that are like a murder mystery: I have a great position, hiring for a super important VP of data science, or whatever. I'm just like: spare me the spam, spare me the emotional roller coaster of it. Let's just talk about it, because we're humans and professionals. So, yeah, I would say it's okay, it's fine. I mean, I'll take more great recruiters than bad recruiters if they just want to give me a good sample. Why not?

Ken: [00:39:18] I have literally never responded to one of the ones that's like: anonymous company X, Y, Z. The only reason they do that is because otherwise you'll Google the company, or Glassdoor the company, and see how bad the reviews are, and they don't want you to do that first. So to me, that is a massive, massive red flag. Gina, did you have a second question?

Speaker5: [00:39:42] Yeah. So my second one is quite technical.
I'm working on a project right now. Is it okay to transition to this question, do you think?

Ken: [00:39:52] Does anyone have anything to add to the previous one? Go ahead, Eric.

Speaker6: [00:39:55] I have something real quick. Since we're talking about recruiting: we are hiring at [00:40:00] LendingTree, hiring a paid social analyst. So I want to at least drop that in there. It's not a social analyst as in we're looking for somebody who's an Instagram influencer; it's a paid social analyst. Yeah, you can have a creative bent, but it's a data position. So you need to know SQL and Tableau, and be able to work in the Facebook ads platform, and things like that. Sorry, Vin. So anyway, I wanted to throw that out there for anyone who might be interested. And I know the guy on the team; he's not the hiring manager, but he's part of the team and good to work with.

Ken: [00:40:43] Eric and the company... Eric wouldn't say this, so I just have to remind everybody that's listening: the main perk is getting to work with Eric Sims.

Speaker6: [00:40:54] Oh, 100% right.

Ken: [00:40:55] Darn right. I might even apply, just for that. And any recruiters watching, take note: this is how you get qualified candidates. So I don't want an influx of... yeah, exactly. Exactly. Yeah, I think we're good to transition here to something more technical.

Speaker5: [00:41:17] Great, thanks. Oh, just a public service announcement for the folks who are listening, perhaps now or later: do watch out for scams. There are full-on scams out there, and I just think it's worth pointing out, because I've heard from fellow boot camp folks, and in the chat: yes, it is true, there are scammers out there, even on LinkedIn; I don't know how they're dealing with it. Especially boot camp grads will be approached by companies. Little red flag: if they offer to hire you before they've even interviewed you, look out. I mean, some of this stuff should [00:42:00] be pretty obvious, but sometimes there are more sophisticated scams out there. And red flag: anytime somebody's asking you for money up front, for anything, run away. That would be my advice. So I just wanted to put that out there. Go ahead, Ken.

Ken: [00:42:21] No, no, I think that's a really important one. I absolutely agree: watch out for all of those.

Speaker5: [00:42:29] Most definitely. So, yeah, my technical question is this. I'm working on a project with an organization, a data-science-for-good type of organization, and I have done a bunch of data wrangling and built a TensorFlow model. The images are a lot of different kinds of textures, and we'll just leave it at that; the group we're working with is a startup, and so it's proprietary stuff. So, at the feature extraction point: they're looking for essentially a recommender system. Given that I put in a certain image, can you tell me the five or eight most similar images, at least for now, in the database that we have? And in the training dataset alone, we've got something like 18,000 images. I found ways to do the feature extraction in TensorFlow before the final layer; I just set up my own little model, and now I get all these features, and I'm struggling to figure out the best way to use them.
Literally, what I want to do is find, for any given image, the eight other most similar images. And I'm assuming the best way to do it would be to [00:44:00] find the eight images that are closest in terms of distance, whichever algorithm one uses to compute the closest, the smallest differential between them, if that makes sense. I mean, you could do K-means, but that's clustering, and I don't think that's going to work really well. And I haven't done K-nearest neighbors... not K-nearest: Ken's nearest neighbors. I like that name. And Ken, you even said on your podcast that you didn't come up with it, someone else suggested it, but it's brilliant. Anyway, I would love any suggestions on this, because it just seems like a really challenging computational problem. For any given image, how do you go through 18,000-plus images and compute which eight or so are the closest?

Ken: [00:45:12] Mikiko, do you have some thoughts on that? It seemed like you're familiar with using the embeddings. I would assume the first approach would be using K-nearest neighbors on the embeddings, but is there something more efficient? Because that could take a long time; you have to calculate all the distances.

Speaker5: [00:45:37] Yeah, I would check out Spotify's Annoy package. I think it's similar to nearest neighbors. That's the pattern I've seen with a ton of image-based recommendation projects; they literally do that. They extract embeddings, they use Spotify's Annoy to do the actual embedding comparison, [00:46:00] to get the similarity, and then they rank the results in order from most similar to farthest and just pick off the top five or something. I'll post it in the channel.

Speaker5: Oh, cool, thank you, I appreciate that. I was also looking at a Udemy course on recommender systems; they're using Python and the Surprise library. That's for recommender systems generally, and they use the MovieLens dataset for their examples, so it's not entirely analogous. But anyway, thank you for that.

Ken: [00:46:49] Awesome. Tom, did you have any thoughts on this one?

Tom: [00:46:53] Yeah, I put something in the chat. So, in my PyTorch review journey, which I got away from because there was a reason I needed to learn Docker... when I go back to PyTorch, the great thing is there are some really good pre-trained deep learning models out there that you can apply. They just spew out: oh, I see these objects in that image. Now you can tokenize those into an occurrence matrix and then use simple cosine similarity to say: oh, these images group together. When you use cosine similarity and do it at scale, it's very much like applying Ken's nearest neighbors, so I will be forever corrupted into saying it that way now. And if you need help with a premade linear algebra machine, contact me; I've got a git repo that will get you a fast start on that.
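[Editor's note: a minimal sketch of the embeddings-plus-approximate-nearest-neighbors pattern Mikiko describes, using Keras and Spotify's Annoy library. The choice of network, layer, and the stand-in data are assumptions; at ~18,000 images, an exact search such as scikit-learn's cosine_similarity, in the spirit of Tom's suggestion, is also feasible.]

```python
# Sketch: embed images with a pretrained CNN, index with Annoy, query top 8.
import numpy as np
import tensorflow as tf
from annoy import AnnoyIndex

# Penultimate-layer embedder from an off-the-shelf network (assumed choice).
base = tf.keras.applications.VGG16(weights="imagenet", include_top=True)
embedder = tf.keras.Model(base.input, base.get_layer("fc2").output)  # 4096-d

# Stand-in batch; swap in the real 18,000-image dataset.
images = np.random.rand(100, 224, 224, 3).astype("float32") * 255.0
vectors = embedder.predict(
    tf.keras.applications.vgg16.preprocess_input(images), batch_size=32
)

index = AnnoyIndex(vectors.shape[1], "angular")  # angular ~ cosine distance
for i, vec in enumerate(vectors):
    index.add_item(i, vec)
index.build(10)  # 10 trees; more trees = better recall, bigger index

# Eight most similar images to image 0 (the first hit is the image itself).
most_similar = index.get_nns_by_item(0, 9)[1:]
print(most_similar)
```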
Speaker5: [00:47:53] Great, thank you. That's cool. Yeah, it's interesting: when I was messing around building these models, other people [00:48:00] in the group had used ResNet and VGG19. But, at least when I was building the more complicated models, they didn't seem to do as well on these types of images, because they're just textures. I don't know why, but I'm assuming this might be why: even when I added more filters, or did a bunch of stuff like that and added more conv layers, it made things worse, really worse. And when I added dropout, that just destroyed the whole thing. So there's something interesting that I think is going on if you're just using textures, as opposed to any other kind of normal photographic image. I don't know if anyone has any thoughts on that.

Tom: [00:48:49] I just want to emphasize: ResNet is kind of old now, and there are some later models that will detect multiple objects and just let you know what they are in the image. So it could be that you get hold of an existing model and simplify it, but I would avoid trying to modify it, because you're going to get into the weeds really quickly with that. At least, I'm not saying you don't have the abilities, Gina, I just mean for what I understand you're trying to do, I would try that first if it were me. But then I would listen to Mikiko before me, too.

Ken: [00:49:27] Amazing. Anyone else have anything to add on that one?

Ken: [00:49:33] I must have missed this, but what kind of textures are you trying to match?

Speaker5: [00:49:38] Yeah, so I was going to say: it's textiles. The challenge is public on the website, and yeah, it's textiles. So it's finding similar textiles, and it is labeled data. I also did something that maybe isn't [00:50:00] the best: there are three different labels, and I combined them into something like 105 labels. But I was just trying stuff. I've been spending some hours on it, and this is just how my brain operates when I'm able to devote some time: trying different things and seeing how it goes. And that's how I found out that the bigger models didn't seem to work as well. But I didn't actually use VGG19, so.

Ken: [00:50:34] So here's the catch with some of those bigger models, right? A lot of them are designed around specific challenges, particularly object detection or classification. And when you're looking at that, you're looking at models that are trying to understand, at a semantic level, what the context of a particular shape or object is. What you're trying to do is slightly different: you're trying to look at the textures and the patterns appearing within an area, more than anything. If you look at any of the larger deep learning models, the first few layers are essentially doing texture extraction. The further down the model you go, the less it's about textures and the more the embedding becomes relevant to semantic classification. So really, adding more layers isn't necessarily going to help, because you're just getting further and further away from the textural embedding. That's one side of why it's probably not performing much better. So maybe if you take the embedding at an earlier layer out of VGG19 or one of those models, that'll perform better as an embedding process.
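[Editor's note: a sketch of the take-the-embedding-at-an-earlier-layer idea, since texture detail lives in the first convolutional blocks. The specific layer and the pooling step are assumptions to experiment with, not a prescription.]

```python
# Texture-oriented embedder: cut VGG19 at an early block and pool it.
import tensorflow as tf

vgg = tf.keras.applications.VGG19(weights="imagenet", include_top=False)
early = vgg.get_layer("block3_conv4").output              # early = texture, deep = semantics
pooled = tf.keras.layers.GlobalAveragePooling2D()(early)  # (h, w, 256) -> 256-d vector
texture_embedder = tf.keras.Model(vgg.input, pooled)
# These vectors can feed the same Annoy / cosine-similarity search as above.
```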
Ken: [00:51:53] The other side that might be interesting in this kind of challenge, because it's so specific, is: can [00:52:00] you manipulate your feature extraction a little bit? For example, out of the image, can you spot other patterns? Can you process the image in other interesting ways? This may mean looking at simple things like Fourier transforms of the signal patterns within the image, to make edges stand out, to make a similar pattern stand out. There are all sorts of seemingly random things you can do to an image to make different patterns stand out and essentially transform it into a different domain. Textures stand out completely differently in different domains if you transform them. Now, I don't know exactly what the patterns are that you're looking for, but essentially you can merge a bit of the old-school pattern recognition techniques that people used to use with some basic neural networks, or some basic convolutional networks, up front. And it needn't be very deep.

Speaker5: [00:53:04] Yeah, one of our guys is working on fast Fourier transforms of the images, and some folks did some Canny edges and HOG and some other things. I probably don't want to say too much, but anyway: this is really, really good stuff, you guys.
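[Editor's note: a sketch of the domain-transform idea, the 2-D FFT magnitude spectrum, in which periodic weave or repeat patterns show up as distinct peaks. NumPy only; how the grayscale image is loaded is left open.]

```python
# Log-magnitude spectrum of a grayscale image: periodic texture becomes peaks.
import numpy as np

def fft_magnitude(gray: np.ndarray) -> np.ndarray:
    """2-D FFT with the zero frequency shifted to the center, log-compressed."""
    spectrum = np.fft.fftshift(np.fft.fft2(gray))
    return np.log1p(np.abs(spectrum))

# The (flattened or pooled) spectrum can be compared with the same
# nearest-neighbor machinery used for the CNN embeddings.
```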
I mean, this is something selfishly I've been thinking about maybe putting one on towards the end of the year, and I'd really like to hear what makes one go off well, what type of data is relevant and those types of things versus what [00:56:00] might make one suffer a little bit. Russell, you want to start that one off since you have the experience? The freshest the freshest memory of one? Speaker3: [00:56:10] Sure. So I think the best thing is choose challenges to solve and data that are relevant to something that's active in the real world right now. Things that will bring value to a person or an organization or anything immediately. Because then if something does bear fruit fairly immediately, even though it's it's very immature, it can then get investment from from outside to then really enable it to burgeon beyond and outside the constraints of the hackathon, no matter if it be a single day, a two day event or a five day event. And no matter how intensively you approach one of those and how fantastic you are, you won't get a fully polished solution at the end of that. But you will get a great a great what I would call a scaffold or great plan to build from or a great demonstration article to build something that could be a polished article after it. So choose carefully the the data that you have available so that it's high quality data so that the teams that do the hackathon don't have to do an awful lot of pre-processing or pre structuring or pre cleaning, etc. That saves a lot of time. And in fields that are open for improvement and almost crying out for improvement. So hence the one I was at was a project that every delivery has been around for, well, hundreds of years. But in the, in the modern shall we call it the, the, the 3.0 age and bordering into the 4.0 age that we understand more than ever the restrictions of it. So that's a great area to look at. But outside of that, you know, there are far more [00:58:00] nuanced things. So looking at the new tech age and some of the great things that we mentioned in the in the call this evening to some of those as a seed for to. Ken: [00:58:11] To set. Speaker3: [00:58:12] Teams. And to to come up with something unorthodox. And that's the biggest value of the hackathons, is to aim for something unorthodox, aim for something that's outside of the box. Don't just build something that, you know someone else has done. Come at it from a completely different perspective. And if it fails, completely. Okay. You've lost two days. If it hits the mark, though, those two days bring a real huge benefit. Ken: [00:58:38] Do you think it's better to have multiple data sets for one hackathon or everyone focusing on the same data set? I mean, is there a chance that it gets a little bit boring if you have one, or what is the general philosophy on that? Speaker3: [00:58:52] So I think there's a threshold. The one I was at, there was multiple data sets and multiple challenges and different teams approach different challenges. So I think there was there was a maximum of 15 different challenges with their own unique data sets, not all of which were picked up. But I think we went forward with maybe about ten of those. And for some challenges there was more than one team approached the same challenge. So there was a little bit of hackathon competition there. So definitely better to have more than one, I think. But there's going to be an upper threshold depending on how many people you have coming to the hackathon. 
If you have 200 challenges in 20 people turning up, there's going to be a lot of a lot. Ken: [00:59:33] Of. Speaker3: [00:59:34] Missed data, missed opportunities there, you know? Ken: [00:59:38] Amazing. Then you have any thoughts on the hackathons? You're also welcome to tell us what Web 5.0 is as well. I know Joe Rice said he had a couple of companies reaching out about Web 4.0 and 5.0 companies, apparently. And we we got a very good chuckle out of that when I was visiting him. Speaker3: [00:59:57] Yeah. It's craziness. Everybody [01:00:00] that had A.I. in their resume or their like bio header. In the early 20 tens, then went to Web three recently and now they're all trying to, like, jump on quantum, which is halfway nutty. Yeah, it's it's kind of gotten weird on the hackathon thing. I love hate hackathons. I think as a hiring tool it can be amazing. But when it comes to like actually making forward progress on an issue, I don't love it so much because a lot of what comes out of it isn't you get it into like the point where you would put it towards production and you realize it isn't feasible. I mean, it was a great idea at the time and it's probably going to lead somewhere eventually. But I have a hard time with a lot of what comes out of hackathons. Just because there's not the the kind of work that gets done at hackathons just isn't so applied in most cases. And you'll have incremental experience like incremental improvements and small pieces that could spawn off into something. But a lot of times what comes out of a hackathon is really just good teams where three or four people self assemble, they get some really impressive work done. And if you're looking to hire, that's an amazing use case for a hackathon because you find these small teams where you can just hire the entire team and they're already proven to have worked effectively together on one project. You've watched them function effectively, kind of in a high stress time, compressed environment. Speaker3: [01:01:36] And for larger companies, it's great for hiring for things where you're trying to make forward progress on an actual problem. It's not so much a hackathon as I would do a larger challenge. And I've seen this done with startup challenges before, where you your hackathon is more over six weeks or eight weeks and you offer check points, you offer expertize. [01:02:00] You have different companies that support with infrastructure and just a little bit of help and some guardrails and people treat it like they're starting a company like it's a startup. You get an MVP or something like that out the door at the very end of it. And then you get funded, you get 100,000, 250,000. And those are the ones that actually lead to some pretty interesting advances because you throw a whole bunch of crowdfunding at it. More times than not, you get one or two experienced people come combined with a couple of college kids and some people that are just entering the field from software development or someplace else, you get some really interesting teams and you get some good solutions. And that's really where I like the concept of hackathon, but it's just more of an extended like an extended process rather than a three day one week. And Long live web 5.0. If somebody needs Web 5.0 consulting, I'm more than happy to explain exactly how to. I mean, the best way to make $1,000,000 in Web five is to start out with ten. And I'm I'm your guy to help with that. Ken: [01:03:07] Their consulting fee is also $5 million. Ken: [01:03:09] Correct. 
Speaker3: [01:03:10] Can I just check: is Web 5.0 only around Hawaii? You know, for me in the UK, watching TV from back in the seventies, and more recently with the Hawaii Five-O program, I can't help but make that association. Ken: [01:03:27] Yeah. We're having the new Hawaii Web 5.0 Hawaii Five-O meetup next month, so you guys are all welcome. Tickets will be available for $1,000,000 apiece. Eric, did you have any thoughts on this one? I saw you raise your hand there. Speaker6: [01:03:44] Yeah, so real quick, this is an article from May 2013, so I don't know how well it's aged, but: Web 5.0 is the sensory and emotive web, designed to develop computers that interact with human beings. [01:04:00] Relationships have become a daily habit for many people. I'm not really sure what that means, but there you go. I Googled it: the symbiotic web. Hmm. Anyway, yeah. So I've had a couple of different thoughts as Russell and Vin have been sharing, but my experience has been... so I actually recently did a hackathon at work. And something that made it a good experience was that the organizers of the hackathon had taken the time to talk to leaders around the company ahead of time to ask, what are some projects you'd like the hackathon teams to work on? And then we just picked something rather than having it be too open-ended; we kind of knew a little bit ahead of time, could self-assemble, and then could work on something and try to come up with an MVP. And, you know, I don't know what's going to happen to all of the projects, but I know that some will be prioritized. You know, ours will probably be done at some point, but it didn't end up being super high priority. Okay. Um, another thing, though, that I think about: if I was going to do a hackathon, what's your purpose? Is your purpose to have people get together and build something that's going to become a startup? Is your purpose to do a civic hackathon, where you have local organizations that need resources, and it doesn't matter... like maybe the problems aren't really complicated and people can pick them up fairly quickly? Speaker6: [01:05:32] Or is it like Vin was describing, which, I mean, you could call it a hackathon, I've heard it called a hackathon, but I wonder if you'd get a different pool of applicants if you called it something like a fellowship or whatever, where it's like: this is a thing that I'm setting myself up for, for three months or six weeks or something like that. And so people maybe see it as less hacky and more of a thing that maybe I should be a little bit more committed to. Yes, the Fellowship of Web 5.0. [01:06:00] So I don't know, I was trying to think through that purpose, because again, that will determine whether everybody shares a data set, whether we're trying to win, whether we're trying to optimize and come up with the best model, or what. And then I'll just drop this in the chat, but a little while back, when I was researching hackathons a bit myself, I found a couple of good posts. One is from Hackathon Guide; that's the website, just a GitHub page. And the other is "So you think you want to run a hackathon? Think again," and it's a civic hackathon article.
And so it talks about whether you want to approach a hackathon with a capital H, or whether you just want to engage people in civic tech or whatever. So it may not quite be what you're looking for, but I thought it was a really good read; it kind of opened my mind to something other than the usual hackathon stuff that I'm used to hearing about. Ken: [01:06:58] That's amazing. I will definitely check those out. Tom, did you have any thoughts on this one? Ken: [01:07:04] Only things that are totally unedifying and not serious at the moment, because of what's going on in the chat. Ken: [01:07:13] By the end of this we'll be on to Web 6.0, which could be the end of us all, from what I understand. Ken: [01:07:21] I think Vin did inspire me to start marketing myself much better. I want to be a quantum Web 7.0 thought leader. I don't know what I was thinking before; I'm always underselling myself. Ken: [01:07:34] Yeah, you have to add evangelist to your title as well. I feel like that's the missing piece. Before we get off the hackathon concept, I think it's really interesting to just evaluate what's in the market. I think Kaggle is probably the front runner there. I still consider those hackathons, but to me they're missing a very aggressive social [01:08:00] piece. Obviously you can have teams, but there's something really cool about coming together for a short period of time and actually producing something. I think that skill of taking a couple of days and having an output or a product is something that so many people along the journey just forget. I don't know about you guys, but I start so many projects without actually finishing them. And it's nice to have a time-boxed, dedicated window to start and get something to MVP, which is like "finished" at version 0.0, right? So I think there's really good value there. You're so right on the purpose and the execution and the timing; there's a lot to work out. But I think there is absolutely benefit longer term for people who participate. For people who host them, it's a significantly dicier proposition. Does anyone have any additional questions? I figure we might as well wrap it up with the quantum Web 5.0 evangelist talk on the way out. Ken: [01:09:13] I've got a touchy one, if it's okay, and it's okay if everyone says no; we can just strike it. So, Navy, I don't know if you want to come on camera for this, but I think the brave-post-of-the-day award goes to Navy, because she wasn't trying to answer the question of whether we should have gun control or not. She was trying to say: can we bring data to bear on this question? And I've been on shows where, and help me out here, I've seen a group of data scientists spitballing opinions, and I'm like, wait a minute, I [01:10:00] thought we were all data scientists. Why are you trying to answer with opinions instead of saying, let's go collect the data? So that's why I really liked Navy's post today: she took a very emotionally charged topic in our country and begged... well, not begged, but she admonished us: hey, can we bring data to bear on at least some of the questions around this? And I thought it might be interesting for this group to explore: how do you get people to stay on track
when it's such an emotionally charged cultural issue, to keep focusing on the data? Because I've even been on judging panels for data science competitions and had to very politely confront domain experts for not judging according to data science principles, just because they liked the way a certain team was talking rather than looking at the way they were going after the data. I hope I made all that clear and didn't make it too complicated. Sorry. But, Navy, good job today on that post. Speaker5: [01:11:21] Yeah, things happen when you can't sleep at night, and that was one of those posts. I think it's just been difficult for a lot of us to hear the news and not do anything about it. You know, I'm a member of a DEI committee for my school district, and we're talking about a lot of things around the qualitative aspect of DEI, and I'm like, wait a minute, why can't we put data into this? That's the conversation I had with my school last week. [01:12:00] And then this happened. And you know, again, I don't want to make this about pro-gun and the Second Amendment, but I'm like, what can we do as data scientists, when we lead these companies to do better? How about we start within our communities? Schools have a lot of data, but it sits in these data lakes. They do make a lot of informed decisions: you know, this is their GPA, this is their code, this is the student, these are the number of IEPs they have. So they do have data. Speaker5: [01:12:44] It just doesn't sit in one piece of software; it's in Excel sheets and whatever. So I think, A, we should talk to our schools and ask them what they have in place. You know, start with those data lakes that are already there, which they don't see as data lakes, but we know they are data lakes, and try to make sense of this before we jump the gun and say, let's do this or let's do that. Right? So to me, as a person in this community, I feel like we can start somewhere where, yes, I'm involved, because my kids go to this district, you know? So that was kind of the idea behind the post: can we do this? I wrote to my board of directors a few days back, and he really wants to talk about this, but we have a long weekend now. But anyway, to me it is really about, you know, can we be smarter about this, and not judge this whole conversation from a particular point of view, which hasn't really accomplished anything. Ken: [01:14:02] Thank you. Can you share your post in the chat as well? Mickey, do you want to go ahead? Speaker5: [01:14:10] So I think in general this has been a rough couple of years. If you are, number one, a Black or Hispanic person, if you're female, if you're LGBTQ, it's been a rough couple of years. And on my team and in my family, including myself, you know, we have people that check all the boxes. So it's been a rough couple of years. And if we just switch to a different example, let's look at the Zillow example.
I mean, people have feelings about that, but it's kind of tame compared to some of these things, and people here have talked about the Zillow example, right? What could have gone wrong there, and why that company made the decision it did, was potentially that there was a lack of trust and conversation between the people who were analyzing the data and the people who had the power to make recommendations. So that's a big thing. And that's for a relatively trivial example, where people did lose their jobs afterwards, but for the most part, right, no one lost their lives, and they certainly didn't lose the lives of the people around them. So that lack of trust and of the ability to have conversations, I think, is something we see all the time in the data field. And I think it's one of the things where I kind of wonder... Speaker5: [01:15:39] I think there are three sort of things I want to think about, right? One is: are the incentives fully transparent in those conversations? And this is something that I feel very strongly about, which is that even right now, I think most people don't feel like they have the [01:16:00] representation, right, in our government or in the media, or even the ability to have those changes get propagated, even if there is data to back it. That's certainly something that I very much feel. Right. I think, secondly, a lot of these conversations... it goes back to incentives, but it also goes back to, and they've done research on this, right: once people have made up their minds, it is very, very hard for them to actually change what they've already decided. And they've done research on this in other political contexts, for example when an election happens. Once again, a fairly trivial... well, I shouldn't say trivial; I don't want to trivialize civics and politics and government here. But once again, those are areas where people are asked, well, how do you feel about this person's views and all that? The minute they've already made up their minds, any news reports or articles that you publish will typically not change those minds. It'll drive them further into their belief system. Speaker5: [01:17:05] And we saw this with vaccines. It's an example where, prior to the 2010 elections and beyond that, right, there wasn't this sort of polarization around something as simple as a vaccine. And I know people who have very strong feelings on both sides, including myself; I'm very, very pro-vaccine. So I think it's challenging. Right? Because for the average Joe, Jane, Mickey, Emily, Mary, Fred, all of us: I struggle with the fact that, one, I don't feel like, frankly, at this point, I can enact any change, I can influence any change. Number two, it feels like a lot of the politics and the conversations around me are being driven by incentives that I'm not aware of. [01:18:00] And the other aspect of it is that I feel like I don't have transparency into those conversations. So maybe data can help on that third front, right? Giving people transparency into what's actually going on. I don't know how it will solve the first two problems, the lack of transparency and the incentives that are driving people, and also the lack of...
I don't feel like, at this point, even with the data that is out there, even if it points to a very positive, fully confident outcome... I don't personally believe that people will change. So, you know. Ken: [01:18:44] But yeah. Ken: [01:18:45] I think that's a pretty terrifying overarching thought. I recently read a book, Never Split the Difference, and they talk about how logic in any negotiation almost never works; the most important thing is building trust with the other person, or the community, or whoever it is, in order to change their mind. And I think you see logic not working with all the statistics around many of these issues. I think the numbers are pretty clear if you look at, for example, gun ownership in the US versus other countries and some of these other things. It's that there isn't trust from either side of the aisle. There's no trust in the communities, or with politicians, or any of these types of things. And I also feel like that's an even harder challenge to tackle, even with the data, where data and logic should be how we build trust. And it's very hard for me, especially working in this profession, to see that people aren't trusting data, the thing that should be as close to ground truth as possible. But I digress. Go ahead, Costa. Ken: [01:19:58] I think the [01:20:00] interesting thing with these conversations, and it's been pretty harrowing listening to it yet again from so far away, right, we're sitting in Australia... I mean, in Australia the story has been different over the last 20, 25 years in terms of how we address gun control, and it's the same thing with our neighbors in New Zealand, and the countries aren't so different. So the difficult thing to wrap our heads around is that it's not a lack of data existing; it's more about how we respond to data. I don't think people are as data-driven as data scientists would like to believe, right? All of us in the data world, we like to think that we can make data-driven decisions. And you see this at a business level as well, right? So many times you jump into a large corporate company and they're like, oh, we want to make data-driven decisions. But let's be honest: they don't have the data. Or if they do have the data, they're still leaning on conventional wisdom and conventional biases that exist within the system of that company. Now, expand that out to a country. Ken: [01:21:10] Right? Expand that out to a country of 300 million people with existing biases, existing needs, existing cultural context. That often ends up being a much more powerful story than anything the data alone can drive. And I think that's the story of what we've seen over the last 20 years with regards to gun control and issues like that. So how do we add value as data scientists in a story like that? We've got to leverage more than our data skills: our communication skills, right? We're able to communicate the impact of data at a business level; now we need to do that at a social level, and that's a much more difficult challenge. It's not impossible to do, right? [01:22:00] And it's definitely worth doing. The question is, how do we do it right? It's challenging. And it's difficult to believe that data alone will get people across the line on things like that. And it is harrowing to watch and understand.
And that's the story, right: how harrowing it is for everybody to understand what's going on. Yeah. I don't see a data-only way around something like this. Ken: [01:22:28] Awesome. Thank you. Navy, go ahead. Speaker5: [01:22:32] I mean, I think those are valid points. I don't think the intent of the post, at least, was to change anybody's mind. You know, you can't change my mind, and I'm not going to change your mind. But think about this exercise as, say, a risk model. Right? The rate of default, of somebody not paying their debt: a credit card risk model, or any insurance model, or mortgage model, or whatever. You go in with the premise that people are going to pay back, and then there is data we have about these people that guides us on their propensity to pay back. Right? I'm saying that, sadly, we have a lot of data about the number of shootings that have happened. What we don't have are the variables for the shootings, or the shooters, that led to those things. So what I'm asking us to do is collect those data points that led us there. And it's a very tiny percentage of people in a country of 300-plus million people, but there must be patterns of certain [01:24:00] behavior that have come out of this that we should think about. Without saying... I'm not saying take your guns; I'm not even getting there. I'm saying, can we start looking at that? Right? So we already know the whole funding situation, and they're not going to do what they're not going to do, because they don't want to do it. But can we start somewhere gathering that data? Because, you know, the end point is that credit card transaction for the purchase of that gun or whatever, but there were signs much before that, right? There were bullying incidents, there were police reports, there were counseling sessions; something must have happened that led to that final thing. So how do we use those skills that we have, within the parameters we know we can work in, to get the data together? I don't think I'm talking about changing anybody's mind here. I'm just talking about using it the way we use it for businesses and defaulters and whatever. Ken: [01:25:13] So I agree with you 100%. I think we should have significantly better documentation on who owns guns: when they purchased them, even why they purchased them, and things along those lines. But from the people I've talked to, and, you know, I lived in the very American Deep South for a long period of time, that is a massive point of contention. It's controversial that you have their information; you're infringing on their privacy to own a weapon, or whatever it might be. Do I think that's a good thing? Absolutely not. But I still think it's a massive barrier to any type of progress. You know, with the questions you wrote in your post, I think there are a lot of parents, particularly in, say, Texas, that would be vehemently against giving up any of that information, would be radically against it, [01:26:00] even though these are the communities that were just hit by this tragedy. And, you know, it's fascinating to me how deep the protection of privacy is in those spaces, or privacy around these issues. I cannot personally understand it, and I don't think anyone here can necessarily understand it, but that is absolutely something that people are specifically against giving up. Vin, did you have a thought there? Speaker3: [01:26:30] Yeah.
I think from a data perspective, we have a ton of data. We really do. And I understand that it's not readily available. The reason most people don't know we have all of this data is that there have been restrictions put on it. A lot of this data was gathered at the state level and at the federal level, and there were regulations passed to make sure that data was never applied in the way it was intended to be. Several organizations, like the CDC and a couple of others at the federal level, have studied this, and they released reports that were basically shelved, because that data reaching the rest of the world was just not something that people wanted to happen. But it's all out there. The data has been out there. There have been analyses done on this by really credible academic institutions. Some of the smartest people from think tanks have walked through this. Federal agencies have walked through this. So it's not a data problem. We know exactly what's going on as far as the cause behind each one of these mass shooting events; I mean, there's an investigation behind each one of them. So there is a wealth of data and documentation on exactly what's going on. And that's the piece that I want to add: we have plenty of data, and we've been down [01:28:00] the road repeatedly of figuring out what's wrong. And at this point, every time, and I'm not attacking you for this, so don't take this the wrong way at all... Speaker3: [01:28:12] Every time we say we need to study the issue, we pretend it wasn't already done. We pretend that if we study it, we'll learn something new. And we have, you know, two sides of this issue that both have the same objective, which is not to do anything, not to change anything. One side is talking about the Second Amendment: you have to preserve the Second Amendment rights of people to bear arms. The other side is saying we have to study this issue, we have to figure out what's really going on, we have to do this holistic solution, but we can't do that solution until we do these studies, until we really take on this issue and get the data. And on both sides, there's a level of: we just don't want to have to do anything about this issue. And if you look at the numbers, as far as opinion polls and what people genuinely want to have happen, there's support for a number of different measures around gun control. There's support for a number of solutions around mental health. There's support for a number of solutions around health care, and around education reform, and around every issue you could possibly think of as being tangential to this. It's been researched. It's been documented, and not by fly-by-night organizations. These are things that have been documented for 30 years now. Going back into the late nineties, even the early nineties, we've done the research on gun violence. And what we have is this rich body of research and knowledge. Speaker3: [01:29:58] We have a massive amount of data. [01:30:00] We have a public that's saying, look, this is what we want to have happen, but it's not happening. And so every time we hear someone saying there's something aside from the obvious conclusion that everyone who has studied this has come to, anytime someone says we need stronger doors, or we need more guns in the schools, or we need, you know... none of those solutions have any merit.
In fact, everyone making an argument on that side should know it. And anyone who's saying we need to study this more: there have already been tons of studies done on this. There have already been solid conclusions on this. Anyone saying that, especially at the policy level, should really know better. And again, Navy, that wasn't aimed at you. I'm talking about people who have been in a political position so long, they've funded several versions of this research. And for them to now be saying we need to do more research, when they funded it 20 years ago, and then ten years ago, and then five years ago... I mean, you're coming at it from a genuine perspective of learning and education and data gathering, whereas there is an entire side of the argument coming at it from "we need to study it so we don't do anything." And that is our biggest problem: we have a disconnect between what the public wants, and we have two sides making arguments that don't make any sort of sense, but they sound great, and so people are sort of falling for it. Speaker3: [01:31:33] And so, if you want to talk about this and hit it from a behavioral standpoint, the problem is that we have made it clear what we want as a nation, made it very clear, and by majority made it clear, even in those deeply Southern states, even in states like mine. I mean, Nevada, we are nuttier than anything you can imagine for our Second Amendment rights. I love guns. Oh, yes. [01:32:00] But no, some of these? Those are child slaughter tools. That's what they are. That's the reality. There are certain weapons that have proven their lethality in the same sort of way over and over and over and over and over and over again. I mean, the data is there. And so, as somebody who loves my guns and loves me some Second Amendment, you have to at some point say there is a limit to everyone's rights, and you can't come at that from multiple directions to derail the argument and get people talking about something aside from what everyone has already said. And like I said, no matter where in the country you go, you're going to find a majority of people who say yes to background checks; I think that's one of the ones that polls at something like 80%. So it's almost impossible for us to look at this and say... you know, I just don't get it. I don't understand how we continue to keep going down the same path when we have plenty of data, we have plenty of research, we have plenty of public opinion. Ken: [01:33:26] Real quick, Vin, I made you host. Unfortunately, I have to go. I do not want to disrupt this very important conversation. So, everyone, thank you so much for coming in. Again, it was an honor to be with you all, and I'm excited to rewatch what Mickey has to say here when I get back from my next meeting. I will talk to you all very soon. Speaker5: [01:33:52] Yeah, I just wanted to add: there's a saying, "it's not a race war, it's a class war." And while I find that saying [01:34:00] a little bit problematic on some fronts, I do think it's kind of true as well. For me personally, I feel like as long as we have the same structures in place that continue to support the historic decisions, things aren't going to change. That's not saying we should overthrow the government. But what I do think is that, to a certain degree, we need fresh new voters. We need people engaged in the political and civic system here in the US.
Because that's another thing that is kind of amazing: the fact that all the people who could vote don't vote, whereas in other countries the turnout is much higher. But I also think this kind of showcases, in some ways, the limits of data scientists, or data people, and this is where we need to start leaning on our domain partners. Because, once again, even within my own family, I can present data, but if they don't understand the analysis, or anything beyond simple ratios, I'm going to lose them pretty quickly. Speaker5: [01:35:05] And there are a lot of constraints: we can't do a pure experiment, right, to say whether or not gun policies are in fact what needs to change. We can only work after things happen; we can do the historic analysis, but there's no humane way we can actually run experiments, right, to actually tease out the causality of certain things. But I do think this is where we need to be partnering with our data storytellers, with our policymakers. And I do think it needs to be grassroots, because, once again, the powers in play don't want things to change. But people, on the other hand... I still believe that if people turn out in large numbers, they can vote out a lot of these policymakers; I do think there is power there. So I don't think it's a question of "is the data [01:36:00] there." I think it's: what is the right messaging, and what's the right conversation? How do you get as many people engaged as possible, so that they go to the polls and make informed decisions? Because even in San Francisco, there are tons of times where bills are passed, or we have propositions that go up for a vote, Speaker5: [01:36:17] and I don't understand half of it like I should. A lot of times I have to lean on analyses by the S.F. Examiner or Chronicle; I have to look at nonprofit or government watchdog groups to figure out, am I voting for what is meaningful to me? But at the same time, that's a privilege in and of itself, because a lot of poor working-class families do not have that. They don't have the time and energy to devote to, okay, how do I read and analyze the different points of this proposition? So I think whatever we can do to get that information into people's hands, and to combine it with an understanding of the conversations people are having and the level they're having them at... for example, Gen Z probably has very, very different concerns from my parents' generation. That's the stuff that, if we can somehow make it work, then I do think there will be change. But as long as we have the same people in power, it's just not going to happen. Speaker3: [01:37:20] Let me go to Tom next and then Costa. And are you still waiting to talk, or are you...? Speaker5: [01:37:28] No. You know, and I don't mean this for just this topic, but in general, I think bias, and where bias takes us, is an important thing to recognize as data scientists. If my bias is that guns are bad, then I can't do this analysis. And if my bias is that guns are good, I can't do it either. The conversation [01:38:00] is again going in the direction of, hey, guns are bad. No, I'm not saying that. I'm not saying any of that. I'm saying that as data scientists, our job is not to decide that. Our job is to find the patterns
that can help us identify these people before it happens. And I think that's important. I also have very strong opinions about this, but I don't know if I'd be a good data scientist on something like this, right? So: keep our biases aside and still be a learner, and try to find, A, what is my success metric for this project, and B, how unbiased can I be, or involve people who are as unbiased as possible to help us get there. I think that's what I'm trying to get at. Again, I think those are all valid points, but we already know what that is. We already know that this is not a good thing to have, because we're the only country with this problem. Speaker3: [01:39:20] So, yeah, you know, I think you're right. I understand where you're coming from. And like I said, I get wanting to have more facts, and I get wanting to do more research, because it makes us feel like we can do something. You know, as data scientists, it's hard to sit on the sideline, isn't it? All right, Tom, go ahead. Ken: [01:39:41] Yeah, I'm actually glad you had me go first. So Navy and I were chatting privately earlier over LinkedIn chat, and I said, what's hard for me, just from my current perspective, and I'm sure I don't have enough data: I have this literal brainiac, [01:40:00] brilliant friend who is probably the world's best expert on the Second Amendment, historically speaking. I mean, even from pre-Declaration of Independence times: the rules, why they set up the laws they did. Take a minute with that. And so I will reveal: he's very pro gun rights. And just from my own viewpoint, I tend to think, hey, I want good people to have guns, but I really am okay with: here's my background, yeah, check me out. Please keep the crazies from having guns, please. Now, what I was telling Navy: I really want to be open to being wrong on my current stance, and I hope I'm not making enemies by revealing this. My current logic is: hey, if you outlaw guns, outlaws will still get guns. Then what do you do when you're facing an outlaw with a gun? And yeah, no, I don't want to shoot him, but what if I'm protecting my family or innocent people? Let me put it this way. I have, and I'm not bragging, it's just because I have a lot of kids, $1,000,000 in personal life insurance. Ken: [01:41:35] Well, when I got that, I had nine young kids, and I didn't want my wife to be strapped with all that responsibility. We've always hoped... I'm pretty sure Sue didn't ever want to use that insurance money. Please laugh anyway. But if I am concealed-carrying around my local area when I go [01:42:00] out, I don't want to use that gun on anyone. It's insurance. I go to a Catholic parish where, because of church shootings, there are guys walking around with their pieces, ready to defend the priests and the congregation. Do I like that? Well, I like it better than my congregation getting shot up or my priests getting shot up. So it's an evil world we live in, unfortunately. However, if there was a better way to control it, I would be open. I don't want to be guilty of not being able to admit, oh, my current thinking is wrong, because actually having that spirit has been one of the better blessings in my life, being willing to make major worldview changes. But I like some of the comments people were making in the chat and in this discussion. Like Eric said, you guys are so calm.
A lot of people just rant about this and won't say: I heard your viewpoint; here's my viewpoint. Oh, wow, let me really try to take your viewpoint for a minute. Ken: [01:43:19] It's been very rare that I've been able to have a talk like that with opposite sides. All right. This is the thing, right? It is one of those things where you're right, the data does exist. I mean, I'm not trying to get into counter-discussion points, to be honest, at this point. But we've seen examples from around the world where countries with guns have had different ways of controlling how they respect the use of guns, particularly among members of the public, right, outside of the armed forces and law enforcement. Countries without guns have also shown the efficacy of that. You've kind of [01:44:00] got to understand the temperament and degree of responsibility and liberty that your people have, and tailor the laws to suit that. And I think among the countries that have shown some measure of progress in terms of gun legislation, whether it's Australia or Japan or New Zealand or Sweden for that matter, there are different approaches across all of these countries. But essentially it feels like America still hasn't got the right balance of legislation relative to people's actual needs and the temperament of the people of the States, right? It's a very, very difficult conversation to have. But you can't get a full picture of that, and this kind of roots back to what Mickey was saying before. Ken: [01:44:50] You can't get a full measure of a nation's temperament unless you actually hear the voice of the full nation, right? I mean, this somehow really does come down to who we're electing. Right. And I'm trying to be really measured here, because I'm not from the States. I don't fully understand the nature of people from America, is the big disclaimer, guys. I'm Australian; it's a different perspective here. But I guess, from an outsider's view, right: when you have the full body of people voting, you get a much broader sense of what people care about, and then you vote based on that; you're represented based on that, right? So in a sense, yeah, compulsory voting, or at least a larger percentage of voter turnout, is really critical to actually getting the issues that matter legislated for. Right? I mean, people can say all they want, but until they're actually voting in people who will do the things they care about, it's difficult to justify otherwise. I mean, in a democracy, that's how it works. [01:46:00] Right. And the funny thing is, I hear interesting arguments against compulsory voting, like, oh, people will throw donkey votes and fake votes. For those of you who don't know, a donkey vote is basically a vote that's not legitimate, because you've marked both parties as number one or something like that. Ken: [01:46:17] You filled in the ballot paper wrong. People do it on purpose. But here are the funny stats: in Australia, with compulsory voting, it's less than 5%. I think the peak donkey vote across the last 20 years or so was something like 5%. Right. I mean, yeah, fine, we've got a smaller population than the state of California, to be fair. But at the same time, the percentages don't lie. I mean, if people are there, I trust it's incompetence over, what is the right word for this, incompetence over apathy, or negligence, or, you know, just actively being against the idea of voting. Right.
I think people are more likely to make a mistake than to turn up to the polling booth and say, no, I'm purposely going to cast a donkey vote because I don't believe in voting in a democracy, I don't believe in my voice being heard. Right? Because there are people who choose to opt out of voting, even in Australia, where the default is that you should vote. And when 95% of people are put in that position, where they're told to vote, they end up making a legitimate vote, even if it's based on gut instinct. It may be based on policy, it may be gut instinct. A lot of the voters in Australia, from what I hear, had some reservations about Scott Morrison, the previous incumbent PM, who just lost the last election, which was like a week ago, by the way, in Australia. Ken: [01:47:55] But it was interesting how that had an effect. You're [01:48:00] now seeing a third party rise in the Australian political spectrum, because that's what people seem to care about, and it's kind of unprecedented. It's a reflection of the Australian people and how we see the leadership has unfolded over the last 20 years, essentially. And people wanting a bit of change have ended up voting in a diametrically opposite way to how they've usually voted. Entire voting lines have changed; entire regions that have previously been a strong, safe seat for one party have flipped. It's been a very interesting election in Australia, and that's because you're getting the full voice, right? You're never going to have that unless you get everyone to the polling booth, and that includes people who are not fully informed about their voting opinion. At some level, you've got to trust the people's gut instinct, right? So a lot of it comes down to how we operate as a democracy, right, in order to get a fair balance of concerns out there, however opposing they are. Speaker3: [01:49:09] All right. In the spirit of respecting everybody's time, I appreciate all the comments, on behalf of Ken, who had to leave a little early. I appreciate the honesty. Ken: [01:49:19] And let's all agree we at least solved the problem here in this short time. Speaker3: [01:49:26] I mean, I think we're done. I think we're good for dinner right now. Yes, let's break. Ken: [01:49:30] I'm going to go watch a violent American action movie now. Or rather, I'm going to go have breakfast and watch a violent American action movie. Speaker3: [01:49:39] We have several to recommend. Thanks, everybody. Appreciate the time. I appreciate everybody coming, on behalf of Harpreet and Ken.