Harpreet: [00:00:09] What's up, everybody? Welcome, welcome to the Artists of Data Science happy hour. It is Friday, December 10th, 2021. This is data science happy hour number sixty-one. Sixty-one weeks we've been here for you. Sixty-one weeks! It's been awesome hanging out with everybody, hanging out with all these lovely, beautiful people. What's up, everybody? Shout out to Eric Sims, Ken Jee, Rashad in the building, Monica, Jennifer, and Russell, also sitting in the room but not visible at the moment, and Aiden and Serge. Super excited to have all you guys here, man. I hope you guys had a great week. Hopefully you were enjoying yourselves this week doing awesome, fun stuff. If you've got anything awesome or fun you want to share about this week, feel free to share it with us. Everybody joining us on LinkedIn, on YouTube, on Twitter, on Twitch, wherever it is that you're joining us: if you've got questions, feel free to let us know right there in the chat. I'll be happy to take all of your questions. Let's go ahead and get this warmed up. I see we've got a couple of questions in the queue. Monica has a question, so go for it, Monica.

Speaker3: [00:01:17] Awesome. OK, so a lot of you might already know, and if you don't: I'm obsessed with learning, and I've been posting about it the past couple of weeks. Currently I am studying for a certification, so I was just curious, from a soft skills perspective, to get your go-to techniques for learning when you're studying for an exam or certification.

Harpreet: [00:01:42] Yeah, definitely. So let's go to Ken. I see Ken is laughing; I think Ken definitely has some good tips here, for sure.

Speaker4: [00:01:50] Well, I'm actually in the process of making a video on how I would learn to learn if I could start over, like, learn anything.
And so it's currently in scripting; it's like 12 pages so far, so I'm going to have to cut it down to make it into a video. But needless to say, I have a lot of thoughts on this topic. I'm going to try to be kind of brief, because that'll leave room for people to watch the video and whatnot. But Monica, if you want, I can give you the early draft and get your feedback; it might help you as well.

Speaker2: [00:02:25] For sure, yeah.

Speaker4: [00:02:26] So from a learning perspective, I think up front one of the most important things for me is to actually be able to consume information at high volume relatively quickly and to be able to digest it. So outside of a task-specific thing like studying for an exam, I actually think it's really important to first practice reading and consuming information faster, or, if you prefer to listen, figuring out how to listen to and consume a lot of information. And then the next logical step off of that is: how do you organize it really quickly? How do you take really good notes? How do you consolidate those types of things? I think that kind of skill is really overlooked. If I can read twice as fast as someone else with the same comprehension, I get compounding returns over a pretty long period of time, which I think Harpreet will agree with me on. He's read so many books and he's been able to grow exponentially from the information he's consumed, probably because he reads more, but also, I would imagine, because he consumes pretty fast.
So that's kind of the first thing: figure out how to consume information really fast and efficiently. The second is mindset: how do you approach any of these learning objectives, certificates, whatever it might be, with the idea and the philosophy that you can do it? The growth mindset is hugely important here, just figuring out how you talk to yourself to make it a very logical and very feasible thing for you to tackle whatever learning objective it is. There's a whole lot more to the psychology, but I wanted to make sure I touched on the specifics of learning a new skill and a very specific piece of content or information. So obviously, getting your hands dirty is something everyone recommends: doing problems, primarily working on projects, applying it to something that's relevant. But there are also really unique ways that we can learn in a fairly scientific way. I forget what the technique is called, but it's a pattern for flashcards. There's a formula, a spacing algorithm, that you can use that allows for optimal remembering, or retention, of information, and there are some websites online for it. My friend Jeff Lee talks about it a lot. Essentially, there's an algorithm you can use with flashcards, and based on how often you answer them correctly, it'll show them to you at a very close to perfectly optimal cadence. Yeah, there we go.

Speaker2: [00:05:08] I think it's Anki. Oh yeah, there we go.
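The spacing algorithm Ken is describing is most likely SM-2, the classic SuperMemo formula that Anki's scheduler is derived from. A minimal sketch of that update rule, assuming the standard SM-2 constants (the function name and card fields here are illustrative, not from Anki's actual code):

```python
# Minimal sketch of the SM-2 spaced-repetition update (the algorithm Anki's
# scheduler is derived from). Names and structure are illustrative.

def sm2_review(repetitions, interval, easiness, quality):
    """Update one flashcard after a review.

    quality: self-graded recall, 0 (blackout) to 5 (perfect).
    Returns (repetitions, interval_in_days, easiness).
    """
    if quality < 3:
        # Failed recall: the card starts over, keeping its easiness factor.
        return 0, 1, easiness
    if repetitions == 0:
        interval = 1          # first successful review: see it tomorrow
    elif repetitions == 1:
        interval = 6          # second: see it again in roughly a week
    else:
        interval = round(interval * easiness)  # then grow multiplicatively
    # Easiness drifts with how hard the recall felt, floored at 1.3.
    easiness = max(1.3, easiness + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return repetitions + 1, interval, easiness

# Three good reviews (quality 4) push the card out to 1, 6, then 15 days.
reps, interval, ef = 0, 0, 2.5
for _ in range(3):
    reps, interval, ef = sm2_review(reps, interval, ef, quality=4)
    print(interval)
```

This is the "perfectly optimal cadence" idea in miniature: cards you answer well get pushed out on a multiplicatively growing schedule, while a failed card comes back tomorrow.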
Speaker4: [00:05:10] But that to me is completely game changing, especially if you're trying to consume very specific information. I would try to get into those meta skills of consuming information and allocating your time, if you're going to be memorizing flashcards or something, to do it as efficiently as possible. And then the last thing I think is really important: just have fun with it, right? You're taking it because it's interesting to you and you want to do well. But at the end of the day, the certificate, whatever it is, is just a piece of paper. Remembering the process, having fun with it, enjoying learning it, and thinking about how you're going to apply it down the road is what really pays the dividends. So I apologize if that was a little disjointed, but hopefully it was helpful in some way.

Harpreet: [00:05:59] I love that, Ken. Can't wait for the video to come out. Speaking of Jeff Lee, Aiden just said shout out to both Ken Jee and Harpreet Sahota for having Jeff Lee on their respective podcasts. Yeah, that was a great episode; definitely check that out, a lot of learning tips there. Monica, I do want to say that there are a couple of episodes you might want to tune into from my podcast: one with Scott Young, who wrote the book Ultralearning, and another with Barbara Oakley, who wrote a book called A Mind for Numbers and is also the creator of the course Learning How to Learn. We talked about learning how to learn, so definitely look for those. Shout out to A.A., who actually just passed a certificate exam. If you'd like to share some tips with Monica that you found helped you in this process, please do. And also, huge congrats on passing that exam. I think it was the DataCamp certification, man; that's huge. Congratulations. What are some tips you could share with Monica? And if anybody else would like to share some tips with Monica, please let me know; just use the raise-hand reaction and I'll give you a call. And he says he's going to do it offline because he cannot talk at the moment. Monica, I'm curious: for your learning materials, is it mostly textbooks, videos? How are you going about studying this?

Speaker3: [00:07:15] So for my current certification, I went through a formal course that they offered, and they provided study materials. They had also set up a flashcard website, which was pretty cool. So any resources that I find available, I definitely go to. For techniques, aside from this particular certification, just for learning in general, which is my passion: getting your hands dirty is very, very useful. And something else to add to that is failing on purpose, to understand the error codes and how to fix things on the back end and whatnot.

Harpreet: [00:08:01] I see Jennifer Darden has some tips here. Go for it, Jennifer.

Speaker3: [00:08:06] I'm working through my MBA right now, and I'm like you, I love to learn. I just brought my finance grade from a solid C up to a solid A, and the thing that I changed was my study habits: I did more repetition and I introduced color into my reading. I read chapters multiple times, and each time I go through, I'm using a different color pencil for highlighting, things like that. Reading it multiple times, but then also the projects and the practice quizzes. I sent my professor the list of all the practice quizzes that I took, and my quiz scores tracked with my grade change in the class throughout the semester.
So definitely, if you can get more of those practice quizzes, dive in on those. Repetition, in whatever it is you're doing, is going to help your memory so much.

Speaker3: [00:08:58] I love that color technique. That's really awesome for the visual learners.

Harpreet: [00:09:03] Rashad, yeah.

Speaker3: [00:09:07] So I have a story about taking the GMAT in the winter of twenty fourteen, I think. I had the Manhattan Prep books and I only studied for about a week, I think, pretty intensely every day. When I took the first practice test the night before the exam, I got a 630, and I was not satisfied with that. What happened is that it told me exactly which areas I was not performing well in. So that night I basically invented a couple of drills, very simple drills, to work on just those really, really specific areas. I did the drills that night, I woke up, I did the drills the next day, and then I went to the test, took it, and got a 740. So I increased it by a hundred and ten points in a day. I mean, that first score was a practice test, so practice versus the actual, right, there's a difference. But I think one thing that helps me is to hone in on what you're actually missing, rather than thinking of learning as this giant, impossible whale or something. So yeah, I would focus in. Also, if you're talking about learning in general, I think the fastest way for me to learn is to have principles first, and then all of the information I get can hang on those principles. I think of it like a skeleton, a framework. And those are usually universal ideas, almost like features that work across domains, like centralization versus decentralization. You can use that to understand a lot of things, right?
It works in your brain, it works in social organization, it works in history, and it works in data. The same concept can apply to many things. So if you learn more universal principles, I find it easier to hang other information on them or see connections between different fields.

Harpreet: [00:11:03] Thank you so much. Any tips here for Monica?

Speaker5: [00:11:06] Not really. I don't have a fixed process; it really depends on what I'm trying to learn. I try to adapt to the content I'm learning. I don't think it's ever the same thing. But yeah, I wish I had more insight into that. I'm a very visual person, so I like visual learning, and I also like learning by doing; that's probably my main way. So yeah, it's more like a maze I have to run around each time.

Harpreet: [00:11:39] That's probably a good approach for these types of certification exams, for sure. I mean, Monica's was for some software-type certification, right? So yeah, learning by doing. And I would see if I could find forums or communities where people have talked about it, maybe discuss common topics, and just ask yourself questions as you're working through stuff. That's kind of the biggest thing that I've found helps me: making sure I'm going in with questions, or asking questions while I'm working through something. But yeah, everybody's given some great tips there. If anybody else has anything to add, please let me know. Let's see if there are any other questions coming in on LinkedIn or in the chat here. No questions? If you've got a question, please do let me know.

Speaker3: [00:12:23] Well, thank you so much, guys. I do have to hop off because I'm going to go study, but thank you guys so much.
Harpreet: [00:12:29] Happy studying, and enjoy. Shout out to Vin Vashishta in the building; it's been a while. So, I don't see any questions. Let me go with the question that I was about to open happy hour with, to kind of kick things off. I think last week we were talking about some failures or losses that we took this year and how we reacted to them. Let's do the flip this time: what are some unexpected wins that happened? What was something that, at the beginning of 2021, you didn't see coming, like the black swan event of this year for yourself, but a positive one? Something positive and unexpected. Let's start with Vin.

Speaker6: [00:13:08] A positive black swan? I don't know, I didn't have anything that was unexpected. I think this year has been the most planned, disciplined year I've ever had. I definitely had some successes, a lot of incremental stuff this year, but I didn't have one big bang, one big win, or one massive jump forward. I guess I finally took the plunge on Substack. Yeah, maybe that's my big win: I'm finally consistent on Substack. I'll just say that.

Harpreet: [00:13:36] Vin has been so consistent that when I didn't hear from him for a week, I was like, is this guy still alive, and texted him to find out. If you're not already following Vin on Substack, please make sure you do. Jennifer, what about you? And then after Jennifer, let's go to Rashad and then Russell. I want to hear about some unexpected big wins that you guys had this year. And by the way, everybody listening on LinkedIn, I see there's about 10 or 11 of you: be sure to smash that like, smash your reaction, do something.
Let me know you're there. If you've got any questions on anything whatsoever, whether it's data science, machine learning, or any other random question, please let us know in the chat, and I'll also drop a link for you to join us in the room. Jennifer, go for it.

Speaker3: [00:14:16] Many of you guys know I work at Intel. Our big win this year was getting Pat Gelsinger as our CEO. Having a technologist, I mean, he's deeply technical and strategic, and having him back at the helm is a huge win for the company.

Harpreet: [00:14:30] So I wasn't aware of that; I don't keep up on companies like this. Was he a previous Intel CEO and then he came back, or what's the story?

Speaker3: [00:14:39] So he spent 30 years at Intel. He literally came in as an intern, an analyst kind of kid, basically started there and did 30 years of his career at Intel, and then he went over to VMware. He was the CEO there after he was passed over for CEO at Intel. And then we had some rocky years, and you guys may know that we've fallen behind. Now he's come back, and he has injected so much energy into this company. It's absolutely fantastic; it's a really good thing for the company, so it's a delight to have him back. It's getting us back on track.

Harpreet: [00:15:15] Awesome, man. I love to hear that. Yeah, I remember Intel being a huge part of my youth, because I'm from Sacramento, and it seemed like it was the largest employer of people there. There's a huge Intel presence in Sacramento and Folsom and that area. Let's go to Rashad.
Harpreet: [00:15:35] And then after Rashad we can go to Russell, and then I see some questions coming in on LinkedIn, so I'll make sure to cue those up, specifically that question that just came in. But Rashad, what was an unexpected big win for you this year?

Speaker3: [00:15:51] Oh well, I've got two things to mention. One, I became a consistent runner, a consistent exerciser, which dramatically increased the quality and persistence of my thinking. I can think harder, longer, which affects literally everything I do, especially in my job. That started in September and I've continued, and there have been so many knock-on effects from just doing that. And I did it without much of any self-discipline, just by finding the right motivation. The other thing is job related. So last year around this time, my team was cut in half, laid off, right? Data science. And now this year we've had enormous success in delivering a project, with nice results, and now a lot of people know about us internally. And this is in real estate finance, against the backdrop of Zillow's enormous failure. Having those kinds of events also helped me hone my thinking, to some extent, on the systems you need to put in place to make sure something like that doesn't happen in our case. So that's pretty cool. It's kind of a turnaround year, I guess, in my career and in what we're doing, while having more fun in the process and leaning into what I enjoy most about running a small data science practice, as opposed to focusing on the painful obligations and how people won't listen or whatever.

Harpreet: [00:17:24] I actually love that, man. Thank you so much.
A positive black swan for me, I will say, was that I did not expect to be in this weekly group with Rashad and Mark Freeman and Avery Smith. We've got this little weekly group that we meet up with, and we just pour our guts out to each other. It's great; it's like creator therapy. So I didn't expect that, but man, thank you for having me as part of it. Let's go to Russell Willis, and then we'll go to an audience question. There's an interesting question I'm going to send straight to Ken, asking about deliberate practice in data science and how to practice it, because I know Ken has created a few good YouTube videos about that. So we'll chat about that after we hear from Russell.

Speaker7: [00:18:06] So, you know, I'm struggling to think of a real big black swan thing, but one thing that has been a positive for me this year has been really enabling consistency through the struggle and disruption of 2020, post-COVID, et cetera. Everything was kind of up in the air, and one of the most obvious things I've done consistently has been joining the Artists of Data Science happy hours. I've been coming to these for a little over a year now; I think I started maybe November last year. But then over Christmas, with the holiday season going on, I missed a few of them, you know, when I had to do some stuff. And back in January I made a commitment that I wanted to stick with them, because they're great when I join. It's kind of like exercising, I find. I used to train in martial arts when I was young, and I'd get home from work late and have like 25 minutes to turn around, get changed, and get back out the door.
Some days I just really couldn't motivate myself. You know, I'd think, I'll just watch the TV and chill tonight, just for a change. But when I made myself go do it, I always felt better doing it and felt better after I'd been.

Harpreet: [00:19:27] That was very powerful, man. I love that you said you felt better doing it and felt better after you went through it.

Speaker7: [00:19:32] Yeah, yeah. I mean, I love coming to these and listening to you guys; there are always great topics of conversation here. So yeah, it's a shame if I do miss them. I can't promise I'll never miss them again, but I always plan to come.

Harpreet: [00:19:45] Amen. It's my own little thing, and I've missed a few episodes here; I think you've been more consistent than me in my own happy hours this year. So thank you, thank you, thank you for being here, man. I appreciate you being here every weekend and hanging out with us. So let's go to our audience question. I did have a question coming through from LinkedIn, asking for tips on deliberate practice in the field of data science. So we'll go to Ken with that, but before we do, I'll say this: I'll definitely send you a link to an episode I did with Jeff Lee. We talked a lot about deliberate practice in that episode, and I'll try to find the blog post where he specifically talks about applying deliberate practice to data science learning, so you can read that as well. But Ken, go for it.

Speaker4: [00:20:34] Yeah, actually, Jeff is probably my go-to guy to talk about deliberate practice. He and I both started training jujitsu around the same time, and we've been bouncing different ideas back and forth about how to improve at that and be really efficient with our practice.
I mean, I'm not old by any means, but I feel it after a practice, and there's a limited amount that I can do. So being really efficient and intentional about your time and your training, in anything, is important, because there are all these constraints. For data science in particular, something that I've found to be unbelievably valuable, disproportionately valuable, is code review. It's something that I used to really hate to do; it just wasn't a whole lot of fun for me. But as I've gotten older and learned more, I've realized that the more I look at notebooks, particularly Kaggle kernels from the people I really look up to, the more I get to see how they think and how they work, and I get to do a bunch of different case studies. And as you've probably heard me say in the past, like a broken record, I think projects and experimentation are unbelievably important. But the flip side of that coin is the case study approach: following along and understanding someone else's thought process. Because the more you understand someone else's thought process, the better you understand your own, and the easier it is to start applying a lot of the things that you're learning or picking up.
Speaker4: [00:22:16] You know, when I was really trying to learn as much as I could, I was going through three or four Kaggle kernels a day, copying cells and then tinkering around with the code in them to see if I could get it to change different things. I think this is really valuable, especially with data visualization: figuring out the colors and all these different types of things. So that would be a place I would start, and that might not be as traditional as some of the more basic things I would usually say.

Harpreet: [00:22:47] Ken, thank you so much. Let's hear from Rashad.

Speaker3: [00:22:50] So one thing I found when I was learning: in the beginning, I was concerned with being able to list enough things on my resume to have a breadth of skills. But I found it far, far more useful to go really deep into a specific project and then do all the things related to that project. I find I learn a lot more. It also teaches you the things that are harder to teach, like asking tough questions of your data and whether you're doing things right, broadly speaking, instead of trying to say, I do R and Python and C++ or something. That's one thing. Another thing, if you're in an organization, and this is something I've learned there, is rotating paired programming sessions. The whole team rotates, and every week each person gets paired with another person on the team.
And then there are two hours dedicated, and each hour one person leads. So every week you're paired with a different person, and you take turns leading and following in a paired programming session. Over time, what happens is that everyone's knowledge comes out of their heads and into the team; it's sort of a synergistic knowledge effect. Everyone's experience is brought to bear on all the specific challenges. I've found that quite deliberate, because it forces you to understand things that you're not necessarily doing hands-on all the time, so it quickens your ability to understand new stuff. It also makes everyone familiar with what everyone else is doing; you can see, oh, maybe this would be useful for that client over here or over there. Yeah, there are a lot of reasons to do it. But if you can, find a small group of people and help each other on each other's projects. I'd say that's a pretty good way to learn quickly, in a very problem-specific context rather than a more generalized one.

Harpreet: [00:24:34] Thank you so much. Real quick, though, this background you have here is quite interesting. What is this?

Speaker3: [00:24:39] Yes, this is one of my favorite paintings. It's called The Fall of Granada. It's a romantic portrayal, I guess, of fourteen ninety-two, when the Reconquista was complete and the Spanish conquered Granada. I really like it because I saw it in Spain, and also I proposed to my now wife at the Alhambra in Granada, Spain. So it's meaningful, and I kind of like the history stuff. Yeah.

Harpreet: [00:25:07] Isn't that awesome? That's absolutely awesome. Shout out to Shad in the building. I see you there hanging out.
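The rotating pairing schedule Rashad describes, where every week each person is paired with a different teammate until everyone has worked with everyone, can be generated with the classic round-robin "circle" method. A minimal sketch, with placeholder names:

```python
# Sketch of a rotating pair-programming schedule using the round-robin
# "circle" method: fix one person in place and rotate everyone else by one
# seat each week, so every pair meets exactly once. Names are placeholders.

def weekly_pairs(team):
    """Yield one week's pairings at a time until all pairs have met."""
    people = list(team)
    if len(people) % 2:
        people.append(None)  # odd team size: one person sits out each week
    n = len(people)
    for _ in range(n - 1):
        pairs = [(people[i], people[n - 1 - i]) for i in range(n // 2)]
        yield [(a, b) for a, b in pairs if a is not None and b is not None]
        # Rotate everyone except the first person by one seat.
        people = [people[0]] + [people[-1]] + people[1:-1]

for week, pairs in enumerate(weekly_pairs(["Ana", "Ben", "Cho", "Dee"]), 1):
    print(f"Week {week}: {pairs}")
```

For a team of n (even) people this covers all pairs in n - 1 weeks, which is the "everyone's knowledge comes out of their heads and into the team" effect made concrete.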
No need to hide, and no need to unmute or turn the camera on if you're busy; I just wanted to say what's up, man. It's always good to have you around the building. So hopefully that was some great feedback for you. The biggest part of my deliberate practice is repetitions, repetitions over a long period of time, and you have to chunk out several hours per day to make it happen, right, if you want to get to those ten thousand hours. So make sure you're doing a lot of repetition, but make sure the repetition is not in your comfort zone. Make it slightly beyond where you currently are, just slightly tougher than you're comfortable with; that's how you pull yourself forward in the long run, so to speak. Ken, you've got a question. Go for it.

Speaker4: [00:26:03] Yeah. So this morning I saw the machine learning skill assessment on LinkedIn, and I was like, maybe I should take this; let's see what this is about. A couple of my friends have taken it, and fortunately I passed. I was actually a little bit worried about that. I'm wondering what people thought of it. If anyone has taken it, I would love to hear your thoughts; if not, I'm happy to share mine, but I don't want to bias anyone with my beliefs before I open the discussion.

Harpreet: [00:26:37] So you want to hear about taking the LinkedIn skills assessment for machine learning. I don't know if I've taken it or not. I think, yeah, I have taken the skills assessment. I don't remember anything about it.
I took it quite a long time ago, but apparently I was in the top 30 percent of the seven hundred twenty-six thousand people who took it. All right. Yeah, I don't remember anything about this assessment. What was it like?

Speaker4: [00:27:01] Well, I think that's an important point you just raised: you don't remember taking it. It's a 15-question assessment. I know this is being streamed live on LinkedIn, so hopefully we don't get booted if I say something. But it's a 15-question assessment, and they're supposed to appraise some level of machine learning skill from it. To me, that was laughable, right? I just don't understand what the point of this is. A lot of the questions were naming questions; they were memory based, not function based. They weren't showing you a graph or asking you to interpret one; there were only a couple of questions like that. And I'm just interested, at a broader level: what is LinkedIn looking to achieve with this? Does it help anyone, or does it make landing or hiring a data scientist more confusing? Because literally, you know, 700,000 people have taken this. I mean, not quite that many have passed; some two hundred odd thousand people have passed it in the top 30 percent and can show it on their profile.
Speaker4: [00:28:10] You know, my biggest thing is: does this help, or does this make things more confusing for everyone? I'm a little bit in the camp that it makes things more confusing. Harpreet: [00:28:20] Let's go to Ben. Let's hear from Ben real quick, and then I'll give you my two cents on this as well. Speaker6: [00:28:26] So I've done this a lot when I get brought in to a new client: I'll look at the hiring process and go through all of these assessments, the things that you're talking about. And then I'll have the team that does the hiring take their own assessments, and I kid you not, the majority of the time, everyone fails at some point. Not just most people, everyone. And these are people doing the job. They can't pass their own assessments, and I think that's where we're at. Any assessment we think is valid is probably too hard and probably too restrictive. Any assessment that the majority of people can pass very easily probably isn't going to tell you a whole lot. And that's where I find most assessments are. I don't think you can put a quiz up online and assess a complex skill. I'd like to think that what we do for a living takes a level of intelligence, and so I think it takes a level of intelligence to assess it.
Speaker6: [00:29:29] I think we have to come up with something a little bit more comprehensive, leaning a little bit more towards assessing what people know, not trying to figure out what they don't know. We exclude too many people because they don't know everything we could potentially think of asking them. And we ask all of these massive, broad questions because we don't really know what we're going to hire the person to do in the first place. So there are a ton of problems that I see hiding under the covers of your question. Every time somebody asks an assessment [00:30:00] question, it's like the iceberg: you see the tip here, but the real problem is what's underneath. What that assessment is trying to figure out is so much more complicated, and we're trying to simplify it because we have recruiters in the process, we have HR in the process, we have a lot of different groups in the process that don't really understand the assessments enough to do them themselves. So we have to simplify in some ways, but by doing that, we lose a lot. Like I said, this is kind of an iceberg of problems that sits underneath every assessment. And when LinkedIn sends something out... I mean, I like LinkedIn. It's a great company, and they have really good analytics around talent, but I would like them at some point to create a body of evidence to support what it is they're doing. I'd love to see real, hard-core science brought into their assessment methodologies, and if they have it, I'd love to see them publish it. Speaker4: [00:30:54] Well, to me, that's the most interesting thing.
Speaker4: [00:30:58] What statement is LinkedIn trying to make with a rinky-dink assessment that anyone can take and put on their profile? They are one of the companies that essentially coined the term data science, right? They're one of the largest influencers in this space; everyone looks up to them. And then putting out something like this, I think, is very problematic. It would be different if it was like what Google did with the data analytics certificate, where it's like, hey, if you take this, you are qualified for a job with us. That's one thing, right? You're putting your money where your mouth is. To me, I just find this very gimmicky: hey, look, you can put this badge on your profile, like we're the Boy Scouts or the Girl Scouts here. And maybe that's not giving enough credit to the Boy Scouts and Girl Scouts, honestly. But yeah, I don't know. I think it's just a really interesting play from them, knowing how serious the employment problem, and the credibility problem, is within this domain. Harpreet: [00:31:56] I mean, if I click on the assessment, I can't see [00:32:00] what topics were covered or what percentage each topic made up. It would be useful to get some more insight into what was actually on this assessment, because without that, it's like, OK, well, you just asked me questions about basic stuff. What was the distribution of questions? I think that would be a little bit more helpful, that's for sure. Anybody else have anything on this, or...
Speaker5: [00:32:28] Well, I've usually found the opposite: assessments, especially these take-home ones that companies do, tend to be way too much, and they seem to be trying to assess too much at the same time. It's also discriminatory towards people who don't have the time to devote to them, and in a way it's a self-selection process, because they're not going to get people who already have a job and may be interested but really don't have seven or eight hours to spend on an assessment. I haven't seen too much of the opposite, where it's like the LinkedIn assessment, too silly to even have a bar where you're actually filtering something. That's the whole other direction. And yeah, either extreme is not good. Harpreet: [00:33:31] Comment coming in here from A.: I agree with Ken, took some of the LinkedIn assessments and they are a joke compared to actual certification programs like Google or DataCamp. Yeah, I mean, I feel like this is a point we've probably discussed here before, but people talk about standardization of knowledge or standardization of skill sets in data science. Like, does it make sense to do what the actuaries do, have a series of exams, and once you pass the series of exams you can become a data scientist? I don't know. I'm definitely [00:34:00] not for that type of format at all, mostly because of those actuarial exams. You know, I still have nightmares; those are the reason I went gray and have to color my beard. Let's hear from you. And hopefully you guys can hear me OK.
Jonathan just let me know that my audio is coming in a bit more muted than everyone else's, but hopefully it's OK. I know on the podcast it will all be balanced out and you guys will be able to hear me. [00:34:24] Let's hear from you. Speaker5: [00:34:25] Yeah, I think when it comes to standardized testing, or at least interview testing, the one and only time I've ever found it actually useful is when they give you a take-home assessment, you go and run whatever data analytics or data science models you want on it, and then you come back to them, and the following interview is effectively a code review. If they're actually going to give me some feedback on the way I approached the problem, the way I approached the data, and the tools I used to succeed or fail at the task handed to me, that's the only time I see any value in it. And for one pure reason: I like to flip interviews around. Personally, if I'm going to go to that effort and put six, eight hours into a particular problem, I want to know that the team I'm joining is going to value that effort enough to give me actual feedback. If they're going to take time out of their day to go through serious feedback with me as an interviewee, someone who's not paid to do this, then I'm convinced that when it comes to someone they're actually paying for their day job, they'll still have that culture of feedback built into the company.
That's the only real takeaway I can get from doing an actual assessment: trying to figure out, OK, are these guys just going to say, oh yeah, we have a culture of feedback and code review, but not actually do it? Or are they the real deal in terms of providing that feedback? But to be honest, there are a lot of less effort-expensive ways of ascertaining that [00:36:00] about a company. So for the six to eight hours you spend modeling something, what's the return on it for me as someone applying for a job, right? I'm not sure people are really getting the returns on the time they spend attempting those assessments anyway. So far I haven't seen too many convincing cases for why a take-home assessment is necessary. I guess the other reason is, I have been in the situation where I've been given a technical assessment and then realized that the person assessing me just doesn't know how to approach the problem anyway. So am I joining a highly immature team? That's the other thing I can ascertain from it. Those are the only two things I can really see as value for my time doing those assessments in the first place. So, are we just going crazy trying to figure out who's the real deal and who's just promoting themselves the right way? I mean, are we actually facing that many applicants? That's kind of my question. Or do we just like the idea of assessments? Because every way you turn, people are like, oh, we need data scientists, we need machine learning engineers.
Speaker5: [00:37:08] And there's a shortage of the skill set in the market. If there's a shortage of the skill set in the market, what are we trying to separate the wheat from the chaff for? Are we really sifting through thousands of applicants to need that process? Harpreet: [00:37:20] Thank you very much. Comment here from Vigne, who says: I think our biggest unaddressed challenge is how do you assess someone who's very senior? How do you know someone with 20-plus years in a field? Is their track record enough to hire them without an interview? Interesting point, right? Any other thoughts on this LinkedIn skill assessment? Rashad... oh sorry, Ken, I see your hand's up. Go for it, then let me go to Rashad. Speaker4: [00:37:45] I just want to touch on something that was just touched on, right? As you get more senior in your role, you have better opportunities at different places. Like, why would I go to a place [00:38:00] where I felt like they were wasting my time with a skills assessment or a really long take-home test when I have other opportunities lined up? There's an increased opportunity cost associated with time spent applying as you grow in your career. So if someone has 20 years of experience and they're getting hired for some really fancy job title, a lot of the time their time is too valuable for them to be fooling around with an assessment.
And I actually like assessments for junior candidates, especially if you're considerate of their time and, just as you were saying before, not wasting someone's time, doing code review, whatever that might be. I think there's a tremendous amount of value in what people can do rather than what they've done in the past, and an assessment is a way to show that. But there have to be alternatives, and if someone has a long, twenty-year track record, do we really need to evaluate their technical skill as much? I mean, I don't know; if they're still an individual contributor, maybe. But for most people who have been in data science roles that long, if there are any, period, I don't know if there'd be too many questions around that. Harpreet: [00:39:18] Yeah, absolutely. I feel like these kinds of assessments that get put on a social media profile as a badge are probably more for the user putting them on their profile, right? Like, oh yeah, I passed this assessment. If I saw that on a resume, though, if I saw somebody put that down on a resume, I would probably laugh. Nobody cares about this, man. Like, why are you talking about this?
And it's not something I'd bring up with a bunch of data science friends in a bar, like, oh yeah, you passed this LinkedIn assessment? You know what I mean? Yeah, I think it's probably just there for the actual person taking it. But I'd love to hear from Rashad. [00:40:00] Go for it. Speaker3: [00:40:01] Yeah. So I used to have a take-home thing when I was just copying what everyone else does, and of course I've taken a bunch of them myself. But leading a data science team, I started to think, wait a minute, what's a facsimile of the actual job, and what's a facsimile of how I would work with this individual? Would I be breathing down their neck, micromanaging them as I watched them write lines of code, like in live coding? I do think there is a discrimination thing, where the more senior the person, probably the less time they have to take assessments. I personally like to give verbal cases, because not only is it a facsimile of how you actually work with people, I think, but you can also talk through more models without going into intense math that's not needed for the job per se. Say, OK, which parameters might you look at? That's a quicker way to test it than having them write the code, where they only do a single simple model, which is what everyone does in code interviews anyway. And you can cover a broader range of things in a verbal case. So I think: basically, a facsimile of the job. I also think about complementarity. If I were a different kind of leader, I would probably want to pick a different kind of person, and team composition should be complementary; everyone should be better than the leader at something, right?
Speaker3: [00:41:30] I think that's a starting point, and then the things they're better at should be different, generally assuming everyone's working on similar projects. And then, finally, I guess I would say testing the capability to work autonomously. So you're testing: given this problem, what would you do? How would you advance it if you were faced with something like, oh, the performance is bad? OK, well, what are different things you can look at? That's [00:42:00] how I think about it, because the leader's time is valuable to the person managing the data scientist, so you want to make sure they can work autonomously when needed, or as much as possible given their level, of course, and then over time it just grows. I also personally feel that verbal cases can adjust to the seniority of the person relatively simply. But in order to give one, I think the onus is on the leader to define a concrete enough problem. Because if you're doing the "we don't know what we want, so we're going to put every tech-stack thing on the job description" approach, you can't really come up with a cogent business problem that you could talk through in forty-five minutes to an hour. So it really puts more effort on the leader to come up with something that really does get close to what the job is, but can still cover breadth and depth as needed and adapt. That's how I think of it. Harpreet: [00:42:53] I love the way this conversation is unfolding; a lot of great insight here. Russell, you've got some great comments here in the chat as well. Talk to us about this. Talk about unorthodoxy.
Harpreet: [00:43:04] If you can't unmute, Russell, I can read out what you wrote. Speaker7: [00:43:08] Oh, there you go. Sorry, a little window came across my mute button and I couldn't get rid of it. Yeah, so I was just responding to the great comments from Ben and Ken earlier on about assessing senior talent. And, you know, one big wildcard there is unorthodoxy. I value unorthodoxy really highly, so long as it is well conceived and well implemented, you know, not done recklessly or loosely. Unorthodoxy can be really good, and what drives unorthodoxy more than anything else is creativity. Now, creativity is another one of the things that I prize very, very highly, and I think it's the secret weapon of any data scientist: a creative mindset. So [00:44:00] why be stuck in a regimented routine of "problem A or B has solution C, and that's it"? If you can find a solution that takes 20 fewer steps, and you're going through that process a hundred times, you're going to reduce the processing burden of your overall combined elements by a huge amount. So creativity is a huge superpower, and it can lead to unorthodoxy. An approach is only unorthodox because it's never been done before, or because it's been passed over on the assumption it doesn't get the right results. So if you can utilize it to improve on results that have been achieved previously, or come up with a really new way of doing something that achieves better or more efficient results, it's a win-win. But you do have that paradigm of habitual reliance on the orthodox to get over, and that can be a huge hurdle in some cases. Harpreet: [00:45:04] Russell, thank you so much. Great, great insight there. I like what Ken says here: we should do at least one interview completely over Slack or Teams chat; we communicate like that 40 percent of the time anyway. Very, very, very true.
Rashad is saying that creativity equals advancing quickly in your career; it's a separator, and that's hard to teach. Absolutely. Great discussion. [00:45:28] Anybody else have any comments on this? Greg, what about you, man? Any comments on this? It started off as a discussion regarding LinkedIn assessments, but it's now morphed into assessments in general throughout, I guess, the interview process, to assess a candidate's seniority or capability. What are your thoughts on that? Speaker3: [00:45:47] I was connecting with a lot of what others were saying, especially Russell and Rashad: creativity is kind of hard to measure, hard to teach. I feel like [00:46:00] a lot of people just go around checking a checklist to say whether this person is smart or not, whether they'll be able to excel at a given position, and we're not doing enough assessment to understand where creativity comes in. Regardless, there's always going to be a risk that people take when they onboard a new data scientist or whoever. But at the end of the day, I think creativity is weird; it's almost more like an instinct that you can use to surface those who can be creative at their jobs, based on what they tell you they've done in the past. So overall, I'm mostly enjoying what the team before me here was saying. Harpreet: [00:46:52] Great discussion. Thank you so much, Ken, for kicking off that discussion. Any questions coming in from LinkedIn, or any questions right here in the chat?
Please let me know; happy to take those questions. Greg, what was the unexpected big win for you this year? Speaker3: [00:47:05] Unexpected big wins for me? Harpreet: [00:47:08] Unexpected, unexpected, unexpected. Speaker3: [00:47:10] Yeah, yeah. In the sense of what? Just general life or, you know, professional? Harpreet: [00:47:13] Whatever, just in general. Speaker3: [00:47:20] One that was unexpected for me is probably being able to talk to so many startups who are trying to penetrate the market. I've seen a lot of them try to bring different use cases to the table, in terms of what gap they're trying to fill. I've talked to a fair number of founders and co-founders this year, and I was not expecting that, for sure. And in a sense, it got me thinking about what makes a startup fail, especially an AI startup, because I can already tell that in the next five years a lot of them [00:48:00] will not exist anymore, or will be, you know, digested by somebody else. But it was quite interesting to learn about the different tools they build to tackle different use cases. That was quite the learning for me, to the point where I would definitely love to create some sort of series where I can either film it or do a live session, things like that.
Speaker3: [00:48:31] And I know who to go to for advice, whether it's Harpreet or Ken; I know you guys have got my back for starting to do this kind of thing. So that was the biggest, biggest one for me. Harpreet: [00:48:43] Here's a question for you, Greg. Like, how does somebody get involved in angel investing? Is it just: do research on a company, reach out to the founder, and say, hey, do you guys need funding? Or is there some type of proper process you have to go through to be registered as an angel? Like, how does this work? Speaker3: [00:49:04] If you don't want to go through the whole formal angel-investing thing, I think it starts with friendship, really. You know and believe in a friend's idea, and you bring in a couple of bucks, an amount where your feelings won't get hurt if you lose it. It starts like that, right? The unconventional way. And then, more formally, it's taking part in limited partner groups, like ones registered on AngelList, and getting into meetings to understand what kind of startups they're interested in. There are different LP groups, with investors interested in different kinds of startups, whether they're focused on energy startups or finance startups or AI startups, you name it. You just have to know what group you want to be part of. And then you have all the [00:50:00] compliance, I guess, that you have to do with the government, right?
Because you have to have a certain amount of earnings to be an official LP, which a lot of people have complained about before, because they feel like the government shouldn't tell them whether they should take risks with their own money. It's kind of a protection the government puts on people, saying: if you're coming to invest, you have to be comfortable with losing money, and if you don't clear a certain cap on your income, you can't be an investor. Even though it's not really enforced, the cap is something like: you have to have made 200K for each of the past two years to become an investor, right? And you just click that button; nobody will verify it. It's kind of an honor system, but the government has put it there to protect people from losing all their money, especially people who don't have enough available cash. So there are so many ways to go about it. I like the particular way of getting to know people, building friendships, and taking a bet with a thousand bucks here and there, which is the unofficial way for me. But it's kind of like playing the lottery, right?
Speaker3: [00:51:22] So you have to be comfortable with losing that money and with the fact that it may not come back to you someday, because here you are, talking to somebody you just met, who became your friend over the past three months, who gave you an idea that you believe could win but could also fail, because 90 percent of those ideas do fail. So it's always good to do research and believe in the vision that the co-founder or founder brings forward. But I can leave this with anyone: the way I see it, for somebody who can easily write a 10K check, the way they look at it, and even VCs kind of look at it this [00:52:00] way too, is: do they have the dream team, a team that's passionate? Do they have the vision, and can they convince me that the vision is big, that there's a big market for it? And the third one is: do I have confidence that the team can execute on that vision? When they have those three, that's three good signals for them to give you money to go to the next step. Lots to learn there, and I'm hoping next year I can continue to learn more in this space. Harpreet: [00:52:34] Awesome. Thank you so much for sharing that.
Yeah, I know there are these syndicates of senior-level data scientists that come together and invest in companies; I've seen a number of them, and they all kind of invest in niche companies or something like that. But I'd be interested in getting into this game. If anybody wants to start The Artists of Data Science angel fund and start investing in some cool startups, I'd be down for that. Actually, I got approached to be an advisor for a startup in twenty twenty-one, met with one of the founders, and started chatting with them and whatever. And then the guy just never came back to me for advising or anything, and I was like, all right, well, I guess that one kind of fell through. But if you guys know any startup seeking an advisor and you think I could advise, let me know; I'd be happy to do so. Let's see if anybody else has questions or comments on anything at all. If not, I guess we can begin to wind it down. Oh, this week the episode I released is actually one with Dana Mackenzie, who coauthored The Book of Why with Judea Pearl. So that was a great, great episode; hopefully you guys get a chance to check it out. It's about an hour and a half long, but man, great conversation. He's a great writer, he's a great mathematician, so we got into a lot of cool stuff, and that was definitely a great book. I'll probably want to revisit it [00:54:00] again early next year, but go pick up The Book of Why by Judea Pearl and then we can see. So definitely check that out. Ken wants advice on growing a beard. Genetics? What can I say, man?
It's just the Punjabi heritage; it will get you hairy as fuck, is what it is. Speaker3: [00:54:16] Just to check, I wanted to ask: did you guys discuss this already? What are your feelings for next year? Do you see a continuation of the same things, like we're going to keep talking about the same data science topics? Because I see three levels, right? Here's what you need to study to become a data scientist; here's what you need to do to become good at math and at deploying. And then what else? Are we going to wait for companies to come up with a brand-new data set that you can build on top of, or here's this new model that Five-fold has developed to do things better? What are your expectations for next year that would be like: holy crap, what is going on, this is changing the course of everything right now? Harpreet: [00:55:05] That's a great question. I'm excited to hear what people think. I mean, look, if you've been listening to The Artists of Data Science for this long, if you've been tuning in for this many happy hours, and you show up and ask a question about how you can get a job in data science, I'm going to just immediately eject you. You should have that figured out if you've been tuning in to the podcast for this long; somewhere in the bank of 230 hours of content that I've released is the blueprint. You've got to figure it out. Maybe I'll distill it for you, I don't know. But we did this AI roundtable with Comet just yesterday, on the 9th; we had a tech lead from Uber AI 
and folks from WorkFusion and the RealReal, and I asked the same question: going into [00:56:00] 2022, what do you think will be some interesting developments? What are you most excited to learn about? The tech lead on Uber's platform said he's curious about whether feature engineering is dead. I thought, that's interesting: is feature engineering dead? And also, I didn't know there's such a battle going on, this debate between trees versus nets, right? Like, why use deep learning when we can just use XGBoost? So I'm personally curious to see what kind of developments will happen for deep learning on structured data, if any, and what implications that might have for feature engineering. Because if we don't have to engineer features and can start using more and more deep learning on tabular data, that could probably save a lot of time. It'd be interesting to see how that would work; definitely an area that I'll be exploring early next year as well. Let's go to Russell, and after Russell, let's go to Serge, then Vin, then Rashad. Speaker7: [00:56:58] Yeah, I was just wanting to comment on Greg's question about what will be different next year. One thing I've noticed significantly in the last few months is that GPT-3 has been made widely available. It's been available for a while; I've been playing with it for a little over a year now through trial availability and hacks I've been a part of, but it's now gone wide scale. So I'm looking at that in two different ways. One is that it's going to go big. The second is: they've made it publicly available, so does that mean there's a GPT-4 on the horizon? Are they doing something similar to what Apple are rumored to do? 
You know, where they've got three or four different versions behind closed doors that they're still working on. They've done as much as they can with GPT-3, so they'll give it away to everybody now, and within six or twelve months there'll be a new one coming out. Anybody got any thoughts on that? Harpreet: [00:57:59] That's another [00:58:00] interesting aspect: the future of transformers. How are they going to evolve in 2022? What are we going to see happen with that? Definitely interested in learning more about that as well. Let's do this: let's go to Richard and then Ken. Speaker3: [00:58:18] I don't have an immediate answer to the question Russell posed at the end. I simply have two things that I think will become bigger deals next year. One is the AutoML debate. My personal opinion is that with enough computational power you can automate more and more, like the testing of different models, but I feel like you can't really automate experimental design, especially within a company that has very specific use cases and very specific data with different origins. I just feel like there's a lot of thinking there, and AutoML could be the window to doing egregiously wrong and misused things. So I'm wondering how our industry will eventually solve that, to put ML into the hands of people who are not data scientists, in a way that's not just generalized intelligence like self-driving or text extraction or something like that. 
Speaker3: [00:59:16] Yeah, another thing: it makes sense that someone at Google would say feature engineering is obsolete, but in most companies there's just not enough data, I think, to make it obsolete. I find it a funny statement; I don't think it'll be dead for a long time, although maybe I'm missing something big in the industry that will transform it. I just feel like big deep learning requires too many examples, and a lot of times a company's data position just isn't built to take advantage of that at all. It's still about connecting the data, figuring out its lineage and legacy, and making sure it's [01:00:00] good enough to even try to answer the questions that you have. So I feel like some parts will be automated; we're abstracting more and more of our jobs with Metaflow and the like, right? But I feel that will put more weight on the things you can't automate, mostly experimental design: what questions can you ask and answer of that data, and what can you not? That's the most important skill, so maybe people will focus more on that going forward in data science. Harpreet: [01:00:28] Solid prediction for next year; absolutely love that. I'll actually be releasing a piece of content, a little experiment and write-up, sometime early in Q1, comparing trees versus nets on a number of different data sets. It'll be a really fun project and a really fun write-up, so keep an eye out for that. Let's go to Vin and then Ken. Speaker6: [01:00:51] To Russell's point, just really quick: Microsoft has done something really interesting with GPT-3. 
And what they've done is monetize the model, and they did it in three different ways. They're using it in their products, so it's supporting features. They're providing access to the model, allowing customers to use it, build features with it, and actually incorporate it into their own products. And they're straight-out selling the model. They've monetized it three different ways, and I think that's the interesting thing about GPT-3 and everything that's going to come after it: companies are now proving a monetization model for all of this. So when it comes to how effective the model is, what the flaws in the model are, I think those are all pretty well known and pretty well documented, so it's not like that's the end game. What's interesting about GPT-3 is the fact that they're selling it and they're making money. They're actually succeeding; they've done it. GPT-3 is making cash. Speaker3: [01:01:59] My [01:02:00] God, we did it. Speaker6: [01:02:01] And I think next year, what's really interesting, and this is the one piece of data science that I find interesting because it's a collision with business, is that machine learning models and this "intelligent automation" buzzword are making time to particular business outcomes so short that people can't be in the loop in every process anymore. Time to business outcome is now so compressed that the speed of decision making, the speed of change, the speed of business, you could say, is faster than human-scale decision making, human time scales. 
And I think next year is the year we're going to see that become obvious, where businesses that use this machine-learning-based automation can make things like pricing decisions faster, resolve customer issues faster, and do things so quickly that other companies simply can't keep up. Customers are going to basically say: look, you're a dinosaur if you can't handle my problem in real time, not "you'll get back to me." If you can't handle my problem in real time, I don't want to deal with you anymore. And I think that's next year's machine learning trend: we are going to be accountable for actually driving results; we're going to have to produce things for the business. And one of the biggest business outcomes is that the speed of business is going to become so quick, and our models are going to become such a central part of the way businesses function, that near the end of next year people will start talking about this concept of business time scales being faster than human decision-making time scales. Next year is the year we start losing control of our businesses. Harpreet: [01:03:55] That's a nice, scary prediction. Speaker3: [01:03:59] I do [01:04:00] have a follow-up to that, especially with GPT-3, but I'd like to hear other people's opinions first, so I'll hold off for now. Harpreet: [01:04:08] Yeah, definitely. We'll go to Ken, then Serge, then bring it back to Greg. I think I want to see the intersection between ML and blockchain. 
I was reading this white paper earlier today, a white paper from the International Model Exchange Consortium members from 2019, and it described a repository of models and data on blockchain. I was like, oh, that might be an interesting application of ML plus blockchain, which would probably make sense for really highly regulated environments, and even just for versioning and controlling your models and stuff like that. But let's go to Ken, then Serge, and then we'll circle back to Greg for his question about GPT-3. Speaker4: [01:04:53] Yeah, so I agree with Vin; I think we're on the same wavelength in terms of what one of the largest problems in the next couple of years is going to be, and it's that we're going to have essentially runaway models that impact our behavior, that impact a lot of these things, beyond what we can understand. I think we see this a lot with social media now. You're on TikTok, you're on Instagram, you just keep scrolling, right? We're literally being trained by the machine learning models; it's the other way around. And I look at it less as learning what these models are and more as how we use them, how we use them effectively, and how we can use them in as non-threatening a way as possible. Honestly, that's something that really scares me. I have my tinfoil hat right nearby, because I think there are a lot of implications in how the U.S. uses AI versus how other countries, namely China, use AI. 
And that's terrifying to me, because we [01:06:00] have certain cultural norms where we protect personal information and personal data in certain ways, and other countries don't believe in that. I don't think one is right or wrong, but one of those is very advantageous for a future where AI is directly integrated with our lives, directly integrated with financial markets, directly integrated with even something as crazy as war or conflict. And I think we're really going to have to think, because we're moving so much slower than the machines; for us to be able to think and solve these problems in real time is very dicey. But I think we need to, in the next couple of years, start having more of these conversations and figuring out, OK, how do we either speed ourselves up or slow the technology down a little bit? So that I don't have kids in five years who never look up from their phones, right? Or who only know things because of the advertisements they're being fed through whatever social media it is. That's a scary and sad future, but I also think it's one of the most fascinating problems we could possibly solve: we've created this crazy burst of technology and these incredible capabilities, so how do we wrangle it, how do we make it our own again, before it slips through our fingers? To end on something a little ominous. Harpreet: [01:07:22] Oh no, no, no, these are important things to consider. Go for it, Serge. Speaker5: [01:07:26] I love that, I love that. 
It's actually a good way to transition to what I'm going to say. I think the time of reckoning has come for the way we're approaching AI. It's very brute force; GPT is just another example. Trillion-parameter and billion-parameter models aren't sustainable, and they have too many gaps, too many things that can go wrong, depending on your use case. I'm [01:08:00] just hoping we hit that wall, and we will: Moore's law is no longer going to hold as of next year, and we have all kinds of resource issues, like the kind of energy consumed in training these models, and even on inference it all adds up. So I think we have to go back to the way things were at another time, when people had to come up with creative solutions. Back in the day when I was growing up, and I'm sure some of you were too, you could only have file names with a certain number of characters, and databases were the same: no field could be over a certain length. We have to get back to that. We're too spoiled, and that keeps us from actually fulfilling creativity on that level, thinking outside the box, trying to solve NLP problems in a way that isn't brute force. What would that look like? I honestly think that if there is a pathway to AGI, it's through those means: creative solutions that use the least amount of resources possible, and that think of data in a more Bayesian, causal way rather than purely through correlation, through tensor mathematics. That's not going to get us to that level, I think. 
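Serge's distinction between purely correlational modeling and causal, Bayesian-style reasoning can be made concrete with a minimal simulation. This sketch is not from the conversation; the variables, coefficients, and sample size are all invented for illustration. A hidden confounder makes two causally unrelated variables look strongly correlated, and a simple adjustment recovers the truth:

```python
# Minimal sketch (illustrative assumptions throughout): a confounder Z
# drives both X and Y, so X and Y correlate strongly even though X has
# no direct effect on Y. Regressing Y on both X and Z, a basic causal
# adjustment, shows X's contribution is essentially zero.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
z = rng.normal(size=n)            # hidden confounder
x = 2.0 * z + rng.normal(size=n)  # X is caused by Z; X does not cause Y
y = 3.0 * z + rng.normal(size=n)  # Y is caused by Z only

# Naive correlational view: X and Y look strongly related
naive_corr = np.corrcoef(x, y)[0, 1]

# Adjusted view: least-squares fit of Y on [X, Z, 1]
design = np.column_stack([x, z, np.ones(n)])
coef_x, coef_z, intercept = np.linalg.lstsq(design, y, rcond=None)[0]

print(f"naive corr(X, Y)     = {naive_corr:.2f}")  # strong correlation
print(f"coef of X given Z    = {coef_x:.2f}")      # near zero
print(f"coef of Z            = {coef_z:.2f}")      # near the true 3.0
```

The naive read and the adjusted read tell opposite stories about X, which is the gap Serge is pointing at: pattern-matching on correlations, however large the model, is not the same as reasoning about what actually drives what.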
And also, I think in the meanwhile, something that can help pave the way is greater use of XAI and the tools it offers. And yeah, if I may make a commercial for my book, I discuss that in the book: [01:10:00] the need for these tools is coming. They already exist, they aren't perfect, and I think they can only get better, especially if people are paying attention to them. Harpreet: [01:10:14] What's your book called, real quick, Serge, and where can people find it? Speaker5: [01:10:19] It's called Interpretable Machine Learning with Python. Harpreet: [01:10:22] OK. Is that the only book you've got out, or is there another one? Speaker5: [01:10:26] Well, the second edition is coming out. Harpreet: [01:10:29] It's coming out? Nice. Definitely check that out. Yeah, definitely, man. Send a copy my way and let's get you on the podcast to talk about this; I'd love to learn more and have a proper episode. A lot of interesting topics going on here. I don't see any more questions or comments coming in, so I guess we can begin to wrap it up, guys. Next week is the last happy hour of the year, because the week after that is Christmas Eve and the week after that is New Year's Eve, but we'll be back after that. So next week, December 17th, [01:11:08] we'll have the Artists of Data Science happy hour. I've got to keep it short, though; we'll keep it on LinkedIn, since I've got to get to a hockey game after that. Then we'll be back on January 7th, the first Friday of 2022. Looking forward to it. We can call next week the holiday party, man. Hopefully you guys can make it. All of you listening, please come and hang out. 
All the folks who haven't been in the happy hours for a long time, I do miss you guys; please come and hang out. It's been a while since I've seen a lot of you. I'm going live twice next Tuesday. First, I'm talking to Dr. Lara Pence, who coauthored the book 10 Rules for Resilience with Joe De Sena, so we'll be talking about her role in that book, and she's [01:12:00] also got something called Life Box that we'll be talking about. That same day I'm talking to the folks from Wonsulting; it was not this week, it's next week, so for those of you waiting for that, sorry if I was misleading you. We'll talk to Jonathan Javier and Jerry Lee of Wonsulting next week. And then on the 18th of December, I'm going live at 10 a.m. Central with Jeremy Adamson; we're going to be talking about his book Minding the Machines: Building and Leading Data Science and Analytics Teams, a Wiley publication. It's going to be a great discussion, and I'm looking forward to chatting with him. You guys, thank you so much for joining me. Be sure to tune in to the podcast released today with Dana Mackenzie, who coauthored The Book of Why with Judea Pearl; it'll be a good time. Ken says he's going to be in L.A. next week; if anybody wants to hang out with him, send him a message. I wish I was there. I was supposed to go to California for the holidays, but I didn't get my son's passport in time; I'll be back in Sacramento in January, though, and then I'll make a trip out to Reno, man. Yeah, we'll definitely come to Reno and hang out. You guys take care. 
Have a good rest of your afternoon or evening, wherever you are. Remember, you've got one life on this planet, so why not try to do something big? Cheers, everyone.