comet-ml-march21.mp3 [00:00:09] All right, what's up, everybody? Welcome, welcome to the Comet ML office hours powered by the Artists of Data Science. Super happy to have all of you here. Thank you for joining us. I know it's a bit of an unusual thing that happens every year with daylight saving time, and some people around the world are probably wondering why the office hours are happening an hour earlier than what they're used to. But happy to have you here. We've got Aishe here so far, and attendance will slowly start to pick up. But before we start getting to some questions, let's talk about being a generalist versus a specialist. What's your view on this? [00:00:52] Yeah, personally, I think there is a lot of power in being a generalist, but I think it depends on the kind of industry you're in and the work environment. That's how I've thought about this split between having a specialization, super deep specific knowledge in something, and having a more general base of knowledge. I have been somewhat more of a generalist in my career, but at the same time, I still see value in spending extended chunks of time working just on NLP or just on computer vision. So it depends on where you're coming from, job-history wise, and where you want to go, I'd say. [00:01:34] Yeah, we were talking about this during our office hours on Friday as well, generalizing versus specializing. For someone like me who is very, very intellectually curious, I just want to learn as much as I can about a number of different topics. So my philosophy and viewpoint is: really develop an understanding of the basics, the fundamentals, right? And that's everything from, you know, coding to stats and math and just a general understanding of the machine learning lifecycle and process. With that solid foundation, learning anything new isn't too much of a stretch. Like, for example, I've never once done an NLP project or deployed an NLP model, because I haven't had an opportunity to do that yet in my career. But I know that when the time comes for me to actually start implementing some of this stuff, it might be three days of research and understanding, then maybe four or five days of playing around, and I'll eventually get it and be able to start delivering value for my company. So that's kind of my point of view. I'm very much for the generalist, just because when you're a generalist, you get exposed to many, many different problem statements. As a result, I think you might be able to better match a strategy to a problem type if you've been exposed to a bunch of different things. [00:02:51] Exactly that. Every time I've moved to something new, like computer vision or NLP, I get exactly what you said: I'm presented with a new problem and, OK, I've got a couple of days of pure research trying to understand the ways and solutions people use to deal with these problems in that space. I think the one thing that's hard as a generalist is keeping your ear to the ground on some of the things that are really deep inside a specialization.
But aside from spending those days on research and then toying around before you start doing serious modeling, I don't think it takes that much for a generalist who's got really solid core skills to apply them to a specific area or a specific data type. [00:03:38] Yeah, and it's also kind of a trade-off, I don't want to say a trade-off of time, but it really is. You know, as a specialist, if you just dedicate yourself to studying one thing and go deep on that, that's great. But then do you kind of pigeonhole yourself into being the person for that one particular type of problem? Yeah, I don't know exactly the point I was trying to make, but I guess the point I'm trying to make is that when... [00:04:04] Hey, Susan [00:04:06] Walsh, you're here. [00:04:08] Looking at me like you caught Harp on his own. [00:04:12] But thank you for joining. Yeah, we're trying to get people into the discussion. [00:04:17] I've got my maté here, so you'll have to excuse me while I'm on tour. [00:04:21] Oh yeah, absolutely, don't worry. We're talking about generalizing versus specializing for your career. [00:04:28] Oh, I've got loads of ideas on that. [00:04:31] Yeah, let us know when you want to chime in. I'm definitely excited to hear what you think about that. So let's do it. What do you think, generalizing or specializing? [00:04:41] Super niching down has paid dividends for me. I started out pretty niche anyway, but trying to be everything to everybody, just from a business point of view. But I think it works the same with careers as well. As soon as I really honed in on a specific skill and started talking about that, people started to take interest. [00:05:04] And what is the specific skill that you decided to double down on? [00:05:09] So I classify spend data. I take financial data from my clients, it normally comes in a spreadsheet, and it can be anything from a couple of thousand rows up to a couple of million rows of data. I have developed my own methodology in a tool called Omniscope. So I normalize it, I clean it, deduplicate if needed, reformat the columns, because it comes from multiple sources, and give them back a wonderfully cleansed and classified data set so that they can, for the first time in their lives, know how much they're spending on office supplies or legal services or whatever. [00:05:53] Yeah, yeah. Because that's niche, you go deep into it, but look at the range you're able to apply that across in terms of industries and companies. [00:06:03] The industries I work across include manufacturing, market research, retail, broadcast companies, pharma, medical, education. So yeah, it's niche, but it's wide. And I think I've been extra lucky in that I started out specifically targeting procurement people, because that's who my clients are. But as a wonderful side effect, I've fallen into the data world, because actually a lot of what I'm talking about is also relevant to the wider data world as well. So when I started, I might have been offering the classification and the cleansing and everything else, whereas now I say I do classification. I also do other bits, but I focus in on the classification. [00:06:54] I like that. So that's an example of a niche that just has a wide kind of applicability. Whereas if you niche into just one specific thing, like just computer vision,
it's like, OK, well, you're kind of limited to one very small area. [00:07:08] You are, but are you increasing your value, because there's only one or two of you? [00:07:12] Yeah, that's an important point. That's the next point I want to make: to really maximize value, there's this concept of combining skills that are different, right? And being the best at the intersection of skills, I think, is extremely valuable. So for example, there's only a handful, obviously, of data scientists who are podcasters and mentors, this little weird intersection that I've created for myself. As a data scientist who does this type of stuff, it just creates opportunities for me that might seem like luck to other people. But really, it's... [00:07:50] Yeah, I'm writing a data book. I mean, even a year ago, if you'd said that, I would have been like, no way. [00:07:57] What do you think? [00:07:58] Yeah, I have a question. Karen, I'm sorry. Susan, sorry, I'm so sorry. So, Susan, when you say you're cleansing data, that's your specialization, right? Can I ask, how are you cleansing your data? Are you using a certain type of tool to cleanse it? Is that your specialty, cleansing the data? Yeah, and I think the difference for me is that I really enjoy it, I enjoy the process of doing it. So I do it in a tool called Omniscope. It's a data modeling and visualization tool. I can have aggregated tables at the top and then the tables of data at the bottom; I can have visualizations and charts at the top, interrogate the data, find something that's wrong, fix it in the table below, which is the raw data, and immediately it's updated. It's the only tool I know that can do that. So it means I've got methods to check and change things really quickly that you just couldn't do in other tools. And interestingly, I'm writing the book at the moment and I'm actually showing everyone how to do it in Excel, which is a really long and lengthy process, to make it quite widely available so that anyone who picks up the book can do it. I've done it in Excel, but I'm going to have a section around the tools that I use, which literally shave days off certain projects, you know. And when is your book releasing, or the target, really? Well, the deadline to get it written is the second of April, so then it should be out within six months. OK, so somewhere around the end of the year, kind of. Yeah. OK, cool. [00:09:56] That brings me to this thing that I'm working on at work, which is developing a data strategy for this massive organization. And this is not a skill I think that many data scientists have. I mean, most data scientists, really, at the end of the day, are end users of data. Would you agree with that? I would say so. [00:10:15] In the ways that we touch it, yes; we're less involved in any of the planning phases or the organizational pieces of an actual data strategy. I don't think I've seen a lot of data scientists on that side. [00:10:32] This is going to be a unique intersection for me after I go through this project: developing a data strategy, understanding the lineage of data from how it is generated in the real world to then being able to model it, that whole spectrum.
The whole lifecycle is something that most data scientists do not get exposure to, and that's a unique intersection: now I become a data strategist who understands data science. Have you ever had to implement a data strategy? I'm wondering if you see any challenges that you foresee me facing in the near future. [00:11:06] You know, truthfully, I haven't. I think you're right in saying that the vast majority of us are end users building new things. But yeah, I haven't really played a role on the initial side, setting that up. [00:11:23] Yeah, I've got books here that I refer to, a modern data strategy book and a data management toolkit. And recently I've been reading Scott Taylor's book, Telling Your Data Story, and I find this book relevant to where I'm at right now, because I've got executive and stakeholder buy-in and I've got plans for how we're going to start identifying our business drivers and how we're going to scope things out. From there, once we have clarified our scope and clarified the business units we want to approach for this particular prototype, it's a matter of me selling this to them. So I have to tell a data story, and Scott's book has been tremendously helpful for that. I highly recommend it if you find yourself in this position, which is an uncomfortable position for me to be in. Like, I'm just a statistician, a machine learning practitioner; this is really out of my element. And I can't say that I enjoy it, but I know that on the other side of this are going to be invaluable skills. [00:12:29] Harp, you said you are uncomfortable with the data storytelling. What does that mean? What does data storytelling mean? Is it just... [00:12:36] So in the way Scott talks about it in his book, there are two different types of data storytelling. There's analytics storytelling, when you're trying to communicate and convey insights, and then there's storytelling in the sense that, OK, we've got to get people to start changing their behaviors and changing their processes, and we tell them a story about how their lives can benefit if they get their data in order and managed properly. And for somebody who doesn't really understand their business processes, it's going to be a lot of digging and interviewing and communicating, and then articulating their pain points as a data story and saying, well, you know, if we had proper data management in practice, this is what life could be like. [00:13:17] Yeah, I guess I'm in that same situation, too, with my current company, trying to get their buy-in and, you know, trying to see if we can look at a strategy. But first I need to prove to them that just taking the database, whatever they have, and showing the benefits that come out of it is worthwhile. First I need a story on the analytics side, I guess is what I'm trying to say. And then perhaps they'll buy into the data in terms of, you know, hey, if you do data properly, this is what will happen to your company. [00:13:46] And here's the thing: not every company necessarily needs super advanced analytics. It could start with something as easy as finding an instance where two people are creating a report on the same information, but the reports differ. Right? That's a good use case. Like, you know, take what you were just talking about:
the number of customers. Finance says we have this many customers and customer service says we have this many customers; why are there discrepancies? Have we not agreed on what a customer means, or are we not counting them properly? Do we have multiple versions of the truth? How do we reconcile all of that and make it consistent throughout the organization? Because without that, your work as a data scientist is complete shit. Like this company that I used to work at, a tech company, a tech startup: zero data management whatsoever. And it just made my life complete hell, because I could not get them answers. They'd come ask me questions, a question as simple as how many people are using our product, and it's like, you guys aren't even collecting your data properly, you're overwriting rows in a table, and oh my God, it was horrible. I couldn't get answers, and being told that nobody else knew the answer either is frustrating. [00:15:00] I think that's definitely what kicks off a lot of organizations recognizing that they need a data strategy: they've hired data scientists because they hear they're supposed to be doing machine learning, and then you go to create a basic report and either your databases aren't set up correctly or they're not a good source of ground truth. And it really leaves data scientists stuck in the mud, stuck in a really hard place. One person who has a lot of other resources around corporate data strategy is Lillian Pierson. I believe she's the author of Data Science for Dummies, but I've seen a lot of her work that is really based on how you start a data strategy wherever you're working. So that might be another good resource. [00:15:50] I'm interviewing Lillian next Saturday, the twenty-seventh, and we'll be talking about that. Data strategy is going to be a very prevalent theme in our conversation. [00:16:03] By the way, Harp, I love your podcast. You have some very interesting people, and I just want to let you know. I listen to it while I'm walking, or when I'm having dinner or something like that, especially while I'm walking. So just keep doing what you're doing. Great people, great podcast. I love it, love it. [00:16:20] Thank you. I'm so glad you enjoy it. It's a lot of work and it pulls me away from a lot of different things, and there are moments when I ask, why am I doing this? What am I hoping to achieve from this? But then I hear people like you make these comments and it makes it all worth it. Like, one of my mentees from Data Science Dream Job, Naresh, who I'm surprised isn't here because he's usually in all these meetings, was saying he liked an episode. He shared that particular episode with a friend, and the friend was just like, wow, man, that was really insightful; it helped get him out of a rut. And I was like, all right, well, there you go. That's having an impact. That's had an effect. [00:16:59] Yeah. Keep doing it. I love it. [00:17:01] Thank you. Yeah, I've got a bunch of cool stuff happening. My one-year episode should be dropping in three or four weeks, on April 9th, and that episode is going to be with Robert Greene, author of The 48 Laws of Power. So that's going to be a big one for me. But yeah, let's open it up for questions.
I know you've been sitting here patiently for quite some time. I'm happy to take your question. [00:17:24] Yeah, I have a question. OK. So, data scientist, right? It's a broad area. I'm just wondering, whether it's data scientists or data analysts, are there different flavors of data scientists or data analysts? I don't see that spoken about very much on LinkedIn or in my reading; nobody really segments them into different groups. Are there different flavors of them? And if so, what are they? [00:17:55] What do you think? I'd love to hear your perspective on this. [00:17:58] Yes, I definitely think there are a lot of different flavors, specifically of data scientists, because in business it's way broader than most analyst roles. As for which flavors there are, I think that's hard, because we could consider a lot of the specializations flavors, like whether you're a generalist or not. I think there are also flavors by industry, as in specific industries that are really hard to break into without prior knowledge or experience. A lot of people doing data science at the big tech companies are doing a lot of product data science; they are looking at churn rate analytics, very basic things that you would be expected to know. I'd say things like health care and finance would probably be their own flavors, and at least as far as I've seen resume-wise, a lot of data scientists in those roles had other job titles in finance or in health care prior. And I think it poses a little bit of a higher barrier to entry without that extra, it's not really specialization, but that extra domain knowledge. So there are probably a lot more, but I think skill-set based and industry based are a couple of different ways we end up with flavors of data scientists. [00:19:27] So for somebody who wants to break into data, what would be a good industry to get into first? Or maybe I should put it two ways. One is, what would be a good industry for a brand new data scientist to break into? That's one way of looking at it. The second way is, if you are not strong in programming or stats or algebra or whatever, what would be a good industry to get into? I guess I'm looking at it two ways. I'm just kind of curious, based on your experiences, what are your thoughts on those? [00:20:06] For the first part, personally, I think product analytics. That's a lot of what you would be looking at; at a lot of the big tech companies that are really well known, most data scientists are looking at their product and not really working on, like, deep learning for self-driving, right? So if you're looking for the broadest range, and probably the area with the most jobs and the most entry-level jobs, those roles will give you an understanding of things like churn rates and conversions, all of the things that are important to a large number of companies. As far as industries that are slightly easier to break into, I'd say that's hard, because it depends on what kind of experience you have prior. I've said this to a lot of people, but take me: I worked in marketing, so I sat as a data scientist in the marketing department and I've done a lot of data science on marketing data.
So whatever your prior career was, or whatever you're transitioning from, that's where I would start: look at the closest adjacent industries, or companies that do similar things, that maybe have openings for folks in data science. [00:21:27] Yeah, and like you mentioned, it's a broad umbrella category; there are so many different pieces to it. To me personally, I consider BI to be part of data science, like that's in the data science umbrella, right? So if you want to find a role where, like you mentioned, you're not doing hardcore coding, you're not doing a bunch of research or having to know all these quantitatively rigorous concepts, maybe that is an avenue you might find interesting, because you get to work with data, you get to help communicate insights and drive the business forward. So that might be an avenue. But data science is such a huge, broad umbrella category. I've seen that people who come to it from a software engineering background are probably more tailored towards ML engineer type roles, and then there are people who come from an academic background or a stats background who then have to learn coding, and those people might prefer a research environment or something like that. [00:22:28] When you say research, Harp, what do you mean by that? Like in academia, or research at a company or a startup? Yeah, exactly. [00:22:37] To me, I would consider any role that doesn't work in a two-week-sprint type of environment a research role, and I personally love those types of roles, because if you're working in those two-week sprints, you're typically on a product team, I would say. Would you agree with that? [00:22:54] Absolutely. I think especially for companies that are smaller, it's easier to do some of this research, in that your data science work is based on experimentation. So you may be spending a lot of time obviously creating models, but you're not really working to: cool, I have a model that's at version one, we need to get version two tested and ready to go into the product for the next update. I'd say that's kind of the difference between more research-focused data science and the rest of industry. [00:23:27] So experimentation, like research, is what you do, and some modeling? [00:23:33] Yeah, yeah. The role that I have right now at Price Industries is very much a research-focused role. They brought me in and said, you know what, we've got a problem that we think would be well suited for data science and machine learning; can you investigate this problem and see if you can come up with a solution? And it involved, you know, quite a bit of research and lead time. I wasn't building a product and having to constantly update that product, add new features to it or remove features or do A/B testing, anything like that. It was just, can this problem be solved, and can I use my methodology to solve it? And once I developed a prototype, OK, great, let's put it into the system that it's going to be serving predictions as a part of. [00:24:17] So you definitely do need coding skills in that area, am I right? [00:24:22] Yeah, because I built a ready-to-deploy model, right?
It wasn't just something I developed in a Jupyter notebook and saved to a pickle file; it was an entire pipeline, from grabbing the raw data from one place, to developing the model, and then after that getting all the metrics to make sure that the model was performing adequately once it was deployed. [00:24:45] So I have another question, sorry, I'm just bombarding questions. I was listening to an MLOps panel talk, I think it was on Friday last week, and I was posing this question. To be honest, I'm still learning Python; I'm not very strong at it just yet. I know the basics and stuff like that, and it will take me a while to get strong at it. So I was asking them about different data-related roles and where I can get in quickly, versus waiting a year for me to get really good at programming and then become a data scientist, which I don't have the time for. I want to get in, get a feel for data, and start working on little projects. Someone on that panel was saying maybe MLOps is one way to get a feel for data and stuff like that. I was kind of confused by that. Wouldn't that be really tough for somebody like me who's not strong in programming? Or is it just looking through the pipeline, seeing that the model is working, and flagging it for the data scientists when it's not working? Is that what it is? I'm just kind of curious to get your feedback from everyone. [00:25:58] I'll hand that over to the one who is going to be the expert here, since Comet is exactly in this space. [00:26:02] Yes, I would say it's still really important, and I can speak as someone who did not have experience with MLOps early in my career. When you start to get data science jobs where you are deploying models or building pipelines, or even working on teams that are at that stage, it gets so much more messy and complicated, I think, than we really expect it to. I can speak to my graduate school experience, where we got to pick our models and then, like, throw them over to somebody else to deal with. But what MLOps allows us to do is, first, documentation. I've been on teams where we have 40 data scientists, and as you can imagine, there are a lot of times when we are overlapping in our work. There are a lot of times when we are human, and so we slack on dataset documentation, we slack on model documentation, and we go through all the steps to train a model, and then at the end we're collaborating with our teammates and, oh, I just can't get it to work on their computer. Those hang-ups that take days and days obviously cost companies a lot of money. So MLOps gets us, first, to better documentation, because it logs all of the environments and the hyperparameters. Whether you're building a neural network or working with any other kind of modeling, it'll save all of your model metrics, and then, at least with Comet's product, you can go and compare how these models did against each other. [00:27:42] So if you're in this research, experimentation role and you've built 15 different models, it's so incredibly hard to pinpoint which is the best model, and exactly when it converges, when we're not using tools that just track this for us. And I've seen some really senior deep learning people who are like, oh, I just use spreadsheets and note things down and copy and paste my model metrics all the time.
I'm like, that's really difficult. That's not a good process, because it's manual and we mess up. I've had copy-and-paste issues myself; I mean, it's common. So despite the fact that you may not be doing a lot of the heavy coding, it's important to understand that a lot of the pain points you would otherwise face, you don't have to, because you can use tools to avoid having to deal with those problems. That's why I think, especially early in your career, it's good to understand at least what MLOps is. So it's documentation and tracking of your experiments, as well as tracking models when they're already in production, because we can have accuracy drift over time, or when real-world data hits our model it's hard to tell why it starts misbehaving. So it helps us pinpoint and debug, first of all, because I think the biggest problem in building machine learning models and deploying them is trying to then debug why they aren't working. [00:29:13] OK, so is this area, MLOps, fairly new? Is it that people are trying to understand the models and trying to document them, and this is a fairly new field? Oh, I see. [00:29:27] I would say maybe about five years, and it's been talked about a lot more recently, because really large enterprise tech companies are putting models into production and they realize, OK, we're making decisions about really important things and we're not able to understand why they're going wrong. And I think a lot of tools like Comet have come up because for really, really important models, for health care, for policing, for all of these things, we should be able to investigate why, and using a tool makes that easier than relying on individuals. [00:30:07] So if you're using a tool for the MLOps, just to see how the model is behaving and so forth, what skills do you need? What skills does a person need? I know one thing for sure: they have to understand the tool. What are the things that they need to be educated on or be aware of? What skills? [00:30:26] That's the biggest piece, and some of this does come with experience: an understanding of what looks right for your data, because it's not going to be the same, obviously, as for someone at a different company. When I say that, I mean it's hard to identify whether something is going wrong if you don't really have domain expertise, so I do think that's part of it. I would also say a good statistical background, to know, you know, whether you want a lower or a higher R-squared or something like that; you have to understand where you want your models to be. It's easy to lump that into overall data science experience, but understanding evaluation metrics specifically, a deep understanding of machine learning evaluation metrics, I would say is the most important thing to then use a tool like Comet effectively. [00:31:24] OK, just as an example of what life looks like without it: when I would do cross-validation, I would be writing everything to the log, right? Which parameter configuration I used, what the evaluation metrics were, and how long it took to run. And then I would literally take it line by line from the log. At first I would just put it in a spreadsheet, and then I got smarter with programming and learned to put it into a pandas DataFrame and do it myself.
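(As an aside, here is a minimal sketch of that manual bookkeeping, with a made-up model, parameter grid, and synthetic data: each cross-validation run's configuration, score, and runtime is collected into a pandas DataFrame by hand.)

```python
# Manual experiment bookkeeping: run a small parameter grid, time each
# cross-validation run, and collect everything into a pandas DataFrame.
# The model, grid, and synthetic data below are placeholders.
import time

import pandas as pd
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

rows = []
for n_estimators in [50, 100, 200]:
    for max_depth in [3, 5, None]:
        start = time.time()
        scores = cross_val_score(
            RandomForestClassifier(
                n_estimators=n_estimators, max_depth=max_depth, random_state=0
            ),
            X, y, cv=5, scoring="accuracy",
        )
        rows.append({
            "n_estimators": n_estimators,
            "max_depth": max_depth,
            "mean_cv_accuracy": scores.mean(),
            "run_seconds": round(time.time() - start, 2),
        })

results = pd.DataFrame(rows).sort_values("mean_cv_accuracy", ascending=False)
print(results.to_string(index=False))
```

(And, for contrast, a rough sketch of the same sweep when an experiment tracker does the bookkeeping, assuming the comet_ml SDK, a hypothetical project name, and an API key already configured in the environment or a config file; each run becomes its own experiment, so the comparison happens in the Comet UI rather than in a spreadsheet.)

```python
# The same sweep, logged to Comet instead of a hand-rolled DataFrame.
# The project name is hypothetical; COMET_API_KEY is assumed to be set.
from comet_ml import Experiment  # imported before sklearn so auto-logging can hook in

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

for n_estimators in [50, 100, 200]:
    experiment = Experiment(project_name="cv-tracking-demo")
    experiment.log_parameter("n_estimators", n_estimators)

    scores = cross_val_score(
        RandomForestClassifier(n_estimators=n_estimators, random_state=0),
        X, y, cv=5, scoring="accuracy",
    )

    experiment.log_metric("mean_cv_accuracy", scores.mean())
    experiment.end()
```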
But a tool like Comet just does all of that for you. I don't even need to create graphs or anything; it streamlines everything. It's quite nice. [00:32:01] OK, and I see a lot of MLOps for machine learning. Is there something like that developed for deep learning? Is there such a thing? Just kind of curious. [00:32:10] I think thus far it's been lumped in with MLOps; I haven't seen much that's deep learning specific. [00:32:17] OK, thank you. [00:32:20] I'll open it up to anyone else who might have something. Good to see you, Mark, my man, it's been a while. Let me turn it over to Tasha, because I know she was the first one here waiting patiently. So if you've got a question, go for it. You are currently muted. [00:32:29] That's been happening a lot, yeah. OK, one question. My question is, what's the biggest challenge you've come across in this whole field, like as a data analyst? What are the things you've learned that could have helped you avoid it? What are the little hacks you've learned along the way? [00:32:46] Oh, man, that's a good question. I might take some time to think before I answer, so I'll flip it over to whoever else wants to go first. So the question you're asking is: throughout your career in this data world, what are some little hacks that you've picked up along the way that have helped make your life a little bit easier? Is that kind of the question? Awesome. Let's see, do you want to start? [00:33:07] Yeah, I would say this is hard, because I know there's a lot. First, it's not really a hack, just a thing I would suggest you do: if you are a typical data scientist or data analyst and someone is coming to you with projects or problems they want you to solve, I'd say first, if you can, spend some time with them understanding where they're coming from before you look at the data, before you really work on it, and ask whether what they're asking for is truly what they need. That's a first step, because I think a lot of people in other parts of the organization don't know how to ask for what they need. And second, kind of in the same vein, I would ask why a lot. You can go with the five whys, or whatever method works for you. I would empower you to feel comfortable challenging some of the assumptions they make, and to understand that your modeling process is not necessarily handcuffed to their ask. I think I might have talked about this before: I've been asked to build a gender detector, to predict someone's gender based off their first name. After understanding that, OK, marketing just needs direction on how to segment users, I turned that into a customer segmentation problem based off their behavior. So yeah, a lot of it is the interpersonal skills and managing the expectations of people when they're working with data scientists and analysts.
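(A rough sketch of that reframing, with made-up behavioral features: instead of predicting gender from first names, users get clustered into segments that marketing can act on. The feature names and the choice of three clusters are illustrative assumptions, not anything from the conversation.)

```python
# Cluster users by behavior rather than predicting a sensitive attribute.
# All data, feature names, and the number of clusters are placeholders.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical behavioral features per user.
users = pd.DataFrame({
    "sessions_per_week": [1, 2, 14, 12, 3, 20],
    "avg_order_value": [10, 15, 80, 95, 12, 60],
    "days_since_last_purchase": [40, 35, 2, 5, 50, 1],
})

# Scale features so no single unit dominates the distance metric.
scaled = StandardScaler().fit_transform(users)

# Three segments is an arbitrary choice here; in practice you'd pick k with
# something like silhouette scores or business constraints.
users["segment"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
print(users.groupby("segment").mean())
```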
[00:34:53] Yeah, love that. I would say the biggest hack I've learned is not to jump right to offering a solution without first understanding the question, and not to assume that I know exactly what needs to happen in order to make this thing work. And also, it's not really a hack, but a subtle mindset shift: just because I know data science methodology doesn't mean that, because the model says it should be this, it's actually right. Intuition and business sense have a huge role in anything that you're building. So operating without that kind of context, what's the business situation, what does the stakeholder know, how could I develop something to make their life easier, doesn't work. That subtle mindset shift has been huge, because once upon a time I'd just be like, oh, the data says this, this must be right, why are you going off intuition when the data says this? That was a mindset shift for me. [00:35:52] You took the words right out of my mouth. Learning how to ask good questions before you start coding or even looking at the data will save you so much heartache and headache. I'm still new to this field, but when I was new, new, new, a lot of my early mistakes were jumping straight in and trying to build exactly what they asked. I'd spend all this time, give it to them, and, oh, actually that's not what I wanted, and I'd be like, but this is what you asked for. And honestly, that was on me for not picking out the right requirements. So for me, a kind of mental hack was shifting my perspective from just delivering as a data scientist to being like, I am a startup within this org delivering products, so I need to figure out requirements: why am I building this, what am I building? But even on the other end, it's making sure that when I deliver a product, people understand it. To communicate, for example, I have a good example of this from my job:
Jill, I see that you're unmetered. [00:38:43] Oh, that's not intentional. [00:38:46] Oh, I thought you had a question about that. [00:38:50] I just I just got on and. [00:38:53] Yeah, no worries. If you got a question, definitely let us know. Betore if you want to share some hacks. So you've picked up along the way in your career just in general on how to improve my own workday. [00:39:04] But I think the key, as mentioned earlier, is like how to to really understand the results. I mean, that's or what people are looking for that's key and understanding [00:39:16] What they're looking for. Sometimes people, [00:39:18] Like I've said, is that they come in to us and say, we want this. But at the end of the day, one time you really talk to them. And to understand it's not what they're asking, what they're asking for, it's not what they're actually looking for. So you have to really try and understand. And that is key. The other thing is don't jump to conclusions. That is a mistake that I made tons and tons. I have my opinion already concluded. I know exactly. I'm always right. But, you know, jumping to conclusions, that's normally [00:39:52] Where I've made a lot of mistakes. I talk to [00:39:55] Someone I think [00:39:57] I understand, I conclude and then I make. [00:40:00] And then you deliver and nobody is happy. There's also been a. Implications and consequences and, well, that's life. What can you do so but the other thing also is like explaining people what to do. I mean, when I'm listening to you guys for David lyrics, I mean, I'm doing that constantly. I'm in my own I'm developing a budget now for a resource project and, you know, with projections and what a situation it's. If I have more users, less users, renewable users and all of these things, to me, the key is really to just really start to the end where you are or at the beginning and then learn as you go. But not that there's no way around it. Don't make conclusions. [00:40:54] I think that it shows that with some good tips for you. Is there anything else that you wanted to dig in on for those responses? [00:41:02] I know that that's pretty much it. Thank you. [00:41:05] That's really a great question so far. I really, really enjoyed these questions. That's coming up. So let's turn it over to the Jill Meagher Kossuth, if you guys have questions. [00:41:15] Ok. [00:41:16] Sorry, I sorry I jumped in late and I didn't hear the original question, but it sounded like it was like Hack's. And just like take me for the answers that I heard, it sounded like, you know, what is what is their goal for the the project. But just like on a smaller scale for like trying to solve a problem. And it's hard to just I I found if I if I'm trying to solve a problem and I don't know, like what algorithm I need or what I need to do that just writing out like what I want in words brings so much clarity for, like understanding what I'm supposed to do. I don't know if that was what the question was about. [00:41:58] Yeah, I see more pulled out. Was that a dingbats notebook. Because that's the AIs. Yeah. Like so writing stuff out man that is super helpful. I've got like no fewer than like five notebooks sitting around that I that I write with and it just helps you think so much more clear that diagraming stuff out. Yeah. That's, that's a huge key. I don't, I don't think this younger generation spends enough time writing. You need to start writing more. Yeah. Mark, if you had a question or anything, if anybody asks question, go for it. 
[00:42:26] Yeah, I'm just really curious, especially for those who have put models in production. I'm in a unique position where I'm one of the first few data scientists at my company, so I'm getting to chart the territory and define what data science looks like at our company, which is really exciting but also overwhelming. We're not doing ML right now, and that's very intentional, because we need to build products and get more data features to actually enable that. But now we're starting to think, OK, that's the future; we want to start building infrastructure today to get the data features that enable us to actually do that more advanced analytics in the future. So we have a plan for when we actually want to start. My question is, looking back to when you were building models and putting them into production, what were the hang-ups you came across where you felt like, if you had tackled this early on, or had a clearer mindset when you first built the infrastructure, it would have made your life easier? [00:43:28] Writing modular code that can easily be swapped out and replaced, and just not writing spaghetti code. I think that's the biggest thing I would advise. Write your code in such a way that, ideally, you don't end up in a position where you build the entire pipeline in a Jupyter notebook and then have to copy and paste everything back into .py files. Starting from developing pipelines right from the get-go has been helpful. I'm not sure if that's answering your question, but that's the biggest thing I've done to make it easier to deploy models.
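(A minimal sketch of that separation, with placeholder file paths and column names: each stage lives in its own small function instead of one long notebook, so individual pieces can be swapped out or reused when the model is deployed.)

```python
# A modular pipeline skeleton: load, featurize, train, evaluate.
# The CSV path and the "target" column are assumptions for illustration,
# and the feature columns are assumed to be numeric.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def load_raw_data(path: str) -> pd.DataFrame:
    """Grab the raw data from wherever it lives (path is a placeholder)."""
    return pd.read_csv(path)


def build_features(df: pd.DataFrame) -> tuple[pd.DataFrame, pd.Series]:
    """Turn the raw table into a feature matrix and a target column."""
    y = df["target"]
    X = df.drop(columns=["target"]).fillna(0)
    return X, y


def train_and_evaluate(X: pd.DataFrame, y: pd.Series) -> LogisticRegression:
    """Fit a model and report the metric used to judge it after deployment."""
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
    return model


if __name__ == "__main__":
    X, y = build_features(load_raw_data("data/raw/events.csv"))
    train_and_evaluate(X, y)
```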
[00:44:02] Yeah. Even going a step before that, before the pipelines, is data creation and data storage; that's where we're currently at. For context, we've got, you know, web data in SQL, very nested data, and when we build a new feature and get it out to a customer, we need to think about the overall picture: what the data story is, what metrics we want. So even before building the pipelines, there's something more foundational than that, if that makes sense. [00:44:32] Oh, man. So you're thinking about what data we should actually collect, because we can collect so much; which of it do we actually need to help inform our decisions, that type of question? [00:44:42] These are still raw thoughts; I don't think I have a clear question in there yet. [00:44:48] Yeah, I was showing these books earlier because I'm working through something similar myself, and I'm pretty sure that at the end of this thing I'm doing at work, I'll have many lessons learned to share with you. But for building out a data strategy, this modern data strategy book has been excellent, so I highly recommend it. It's all about data management, everything from metadata management, where you have data about the data, to data governance and all the other buzzwords related to the data management sphere. But I'm rambling a little bit, so I'll let the others give you better insights than I can. [00:45:25] It's funny, I was actually going to pretty much just double down on everything you said. That is such a difficult stage to be at, especially when it's growing pains, and it really is. Honestly, I can't think of anything specific outside of what Harpreet already mentioned to ease that process. I'd also like to see if I have a book on my bookshelf that is helpful; I may have a recommendation coming soon, but I can't think of anything else off the top of my head. [00:46:01] To give some further context as well, a current project that's really tied to this is that I'm currently exploring how to restructure a data warehouse and build pipelines to enable us to actually do analytics faster and discover our data quicker. So that's the core component I'm on right now. [00:46:19] So you're doing some proper data architecture type of stuff. [00:46:24] When you say discover data through a pipeline, how do you do that? I mean, how does it work? How do you discover data that you don't know exists in the company? I guess I'm kind of curious about that. [00:46:38] Yeah. So essentially, being at a startup, we're building the plane as we fly it. As our product matures and we try to get product-market fit, our approach may shift, our product may shift, and our data therefore shifts as well, including what logs we have. Before, we may have had what were treated as separate product lines, but now we're shifting to, actually, we need to combine these into one platform. So now we have separate data sources that were never designed to be combined, and when that comes into the data warehouse, how do you combine data sources that were originally designed to be separate, now that the product is shifting? The data is there, but, you know, engineers like myself are going and writing Python scripts to pull it. We can do that, but that's a lot of work if you want to answer something like, how many users did X, Y, Z across both of these products? That should be a simple question to ask, but the data discovery for it is really hard, and you end up writing some crazy SQL queries just to answer it. [00:47:43] So, have you looked into dimensional modeling techniques, like the Kimball dimensional modeling type of techniques? [00:47:51] I have not, and that's a nice keyword for me to look up. That's all part of the process right now. [00:47:58] Are you part of the Slack community at all? [00:48:02] I might need a how-to on getting connected to that; I'll have to figure that out. [00:48:05] Yeah, there's a Slack link in there, and I have a pirated copy of the book in the community, but it's Kimball's dimensional modeling, and maybe I can pull it up and give you an idea of what he talks about in there. It sounds exactly like the stuff you're dealing with, because you're going from, how do we take all this raw transactional data, to moving it into a data warehouse so that it's aggregated in a smart way. Let me share my screen; hmm, the location is missing, but yeah, I'll get this book to you somehow, some way. I can look that up in the meantime. [00:48:45] Yeah, my job is super supportive, so I don't even need the bootleg copy; my job would be like, here, whatever you need, we'll buy it for you, just make sure you build something for us that makes it work. [00:48:54] It's called The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling, and it's by Ralph Kimball.
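(A small illustration of the dimensional-modeling idea, with made-up tables: events from two products that were designed separately land in one fact table keyed on a shared user dimension, so a question like "how many users did X across both products" becomes a simple filter and count. Table names and columns are hypothetical.)

```python
# Conformed dimension plus a single fact table, Kimball-style, in miniature.
import pandas as pd

# Conformed user dimension shared by both products (placeholder data).
dim_user = pd.DataFrame(
    {"user_id": [1, 2, 3], "signup_channel": ["web", "web", "mobile"]}
)

# Raw event logs that originally lived in separate, differently shaped sources.
product_a_events = pd.DataFrame({"user_id": [1, 2], "action": ["export", "export"]})
product_b_events = pd.DataFrame({"user_id": [2, 3], "action": ["export", "login"]})

# One fact table, tagged by source product.
fact_events = pd.concat(
    [product_a_events.assign(product="A"), product_b_events.assign(product="B")],
    ignore_index=True,
)

# "How many distinct users exported something, across both products?"
exporters = fact_events.loc[fact_events["action"] == "export", "user_id"].nunique()
print(exporters)  # -> 2

# The shared dimension lets you slice the same fact table by user attributes.
by_channel = (
    fact_events.merge(dim_user, on="user_id")
    .groupby("signup_channel")["user_id"]
    .nunique()
)
print(by_channel)
```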
Tor, I see you've got some insights here. [00:49:05] Referring to my ongoing project, I am in the same spot right now, where I'm basically developing a structure and tools to assist people like myself and streamline the work. Right now I've launched three projects, or three products, but overall they all have to fit into each other. When I design these, over time and by experience, I've started drawing things out, and the thing I always ask myself is: are there any restrictions on what I'm building? In other words, when you are creating a table, or whatever type of storage you're creating with the appropriate language, I always ask myself, what are the limitations of this? And when I start the process, I normally start very broad, look at all of the problems, and then start narrowing down to the things you want. When you have... [00:50:18] Tor, your audio is cutting out, if you want to try again. [00:50:21] There you go. Yeah. So technically, that means that when you're doing this yourself, whatever you're talking about, you look at all the problems that have potential, and then you bring it down and narrow it down to the actual action you're going to take at this particular time. And the next time, two years from now or two weeks from now, somehow those discussions sit in the back of your mind, so it gets easier when you're dealing with new things. This is how I approach it. And with all the tools that I'm developing now, I have to make sure that I don't build in any limitations and restrictions, and if there are limitations and restrictions, I make sure I have a note of that. I do a lot of this discussion with my developer, because for him anything can be done, that's not a problem. [00:51:07] But I am really afraid that if I do something, [00:51:11] it's not going to work in the future. So that's always on my mind. But that's my recommendation: think about limitations and restrictions before you start. [00:51:18] Wrapping up with some sage advice from Tor. Thank you very much. Anything to add here? I think looking at dimensional modeling and taking into consideration all the other advice might be super helpful at this stage. So, any follow-up questions? [00:51:36] No, that was super, super helpful. I'm just trying to make my life six months, one month, or one year out significantly easier, so that when I do start doing more ML stuff, I'm not pulling my hair out. [00:51:48] Yeah, and I feel you; I'm going through very, very similar things right now. So let's see, we've got time for maybe a couple more questions, for either Jill or, it looks like Josh has a question, if you want to go for it. [00:52:00] Oh, my main question is, when you're coming into a company, when you're fresh, for lack of a better word, and even in my time my brain has been closing down: when you're given a job, sometimes there's an easier way to do it, you could quite easily do it with something simpler. But when you come in, a lot of the time you're trying to prove yourself, trying to show that, yeah, I know this, and then you start doing something the hard way when it could have been done way easier, maybe with Excel. So how do you deal with proving yourself once you're in a new company without having to do that? Because I've come to realize that once you start making it a bit more complicated, then you'll have to explain it, and it just goes round and round.
It's a never-ending thing. [00:52:45] I take the opposite approach: instead of trying to show how much I know about something, I just start by talking about how little I know about everything, and say that I need you all to fill me in on some context in order for me to start making something that'll make your life a little bit easier. So I take the opposite approach. I'm completely OK with sounding stupid at work and asking dumb questions, because it's those dumb questions that will set me up for future success. And it also just shows a little bit of humbleness, right? I never, ever want to feel like I have to prove anything to anyone; I just want to feel like I'm doing things that are helpful. So that's kind of my philosophy on that. [00:53:21] Yeah, I think it can be really hard, and it can feel like you need to prove yourself in that way. I would say try to just communicate your work really well. I think, for one, it will help you build relationships with the people you're working with and with stakeholders, but a lot of times we tend to want to do some of this analytics work in a silo. At my old company, I actually had a framework for communicating my work, because I was used to running into problems. So I would get a new request, and they'd say, cool, we want a working model by the end of the week. By midweek I'd send an email or schedule a meeting and really try to sit down and walk people who aren't technical through some of the things that were difficult. I'd structure it with this method: I was challenged here, I had to work with this team to get access to this data, I was challenged here again, but at the end, I have this report for you. I think that sets the stage, and they can have a better understanding that it didn't just take five minutes of work to create this report. Focusing on the ways you are able to communicate what you're doing will be helpful for you, so that it's easy for them to see it as well. This work goes easily unseen, and especially now that we're all remote, it's not as visible as a lot of other kinds of work. [00:54:52] Mark pointed to this book, The First 90 Days. That book is absolutely amazing. I used it when I started this job that I'm at now, at Price, and followed that framework literally for the first 90 days, and it set me up so nicely for success in building trust with my boss. I highly recommend that book. Mostly, what that book is saying is that in the first 90 days you're going to be asking a lot of questions, and those questions are specifically around setting expectations for you. That book is super, super amazing. Mark, do you want to add some commentary around that and how you used it? [00:55:26] Yeah, specifically something within The First 90 Days called the STARS model. Essentially, for a company, or within a given department, there are different stages, and given the stage, there are certain tactics that are the best approach. For example, the acronym's posted in the chat, but if it's like a dumpster fire and you're an expert and they bring you in to fix it, they're going to be super open to you suggesting whatever to try to solve the issue, so you have more leeway to jump in and take action. But say, for instance, and I think the term is realignment, not turnaround,
I'm blanking on the acronym. But say, for instance, you're coming in and you see there's a problem, but it's not a problem right now; things are working right now, and the future might be an issue. If you come in and in the first 90 days you jump in with, this is what we should do, X, Y, Z, everyone's going to be angry. They're going to be like, we've always done it this way, right? So instead, to tackle that problem, you need to build trust and rapport and a lot of credibility before you even make a suggestion. [00:56:41] Specifically, with the problem I'm working on right now, the data architecture, I saw it from a mile away when I came in; like, oh my God, it gives me a headache, as a data scientist this is an issue that needs to be resolved. But after talking to a whole bunch of people, I knew that there was a lot of politics and embeddedness, and not so much politics, really, as conflicting needs. And I knew that before I even tackled that problem, before I brought it up, I needed to build up a whole bunch of rapport. So I waited like six, seven months before I went to my manager and said, look, this is an issue, these are the people I've talked to, this is what I want to work on. So I highly, highly recommend the STARS model. And then, going back to your original question, a hack that I use: the first meetings I do when I get a new job, I don't even talk to technical people. I go straight to customer success and sales, because they're the closest to the customer, and I just pick their brains, because they know what the customers' pain points are, and that gives me a really good overview of the kinds of problems that I'll be seeing in the future. [00:57:44] And just to add to Mark's point, right now I'm working on a project related to a customer complaint database, and I purposely chose that because, one, it's close to the customer, and that's where the revenue is coming from; customer equals money. So I'm trying to get that data, and they have this database sitting there quietly, nothing's happening to it, it's just information storage. And I can tell, I spoke about this very briefly last week with Tor and Harp and everybody else, that as I was cleaning it I could see a lot of problems in the database, where I think the shipping part of things, the shipment, is where the problem is. I can already see it, and they don't know anything about it. So that's why I'm trying to build a case, trying to get a data project going for the company that I work for, so that hopefully, once I showcase it and highlight it to them, they'll say, oh wow, we're spending a lot of money on shipping, paying customers for shipments that we don't need to pay for, and things like that. Anything that saves money for the company, or creates money for the company, is big; it's big for the CEO. Everything boils down to revenue. So if you see a pain point, I'd say go tackle it. And I know in the beginning no one is going to believe you, but if you create a case and maybe show them, take a tiny, tiny project, based on all the mentoring that I received on the Friday and Sunday calls, just pick a tiny project and show the value that comes out of it. That way you can definitely win them over and they begin to trust you. But yeah, I agree, you need to understand the business and the company's mission, to see what they are after and what they want to do.
I think that's key, understanding the customer end and the business. [00:59:38] I'm looking at the notebook I use for my work. I started my job October twenty-first, twenty nineteen, and these are notes from the twenty-fourth of October, just things that I wanted to talk to my boss about at the end of my first week, to give you a sense of the type of stuff to talk about. I was trying to identify the biggest challenges the company is facing or will face in the near future, why we are facing those challenges, and what the most promising unexploited opportunities for growth are in those areas. Then, in a meeting together, I had him work through with me a diagnosis of the situation for the particular project I had been hired for. I had to demonstrate that I understood the key priorities for that situation, and then try to quantify and articulate what a win was going to look like for the project, so we had clarity and alignment on that, and then set up a plan: every week, this is what we're going to do, this is our meeting frequency, these are the things we're going to discuss, and so on. That's all within the first week of working there. I actually have the STARS model written here in my notes: is it a startup, turnaround, realignment, accelerated growth, or sustaining success? So that's STARS. Hopefully that was helpful. So we've got time for a few more questions, maybe two more. Oh, there are people who've been waiting for a long time. Ashi and Jill, I'll give you guys first priority on the questions, so go for it. I know Depeche and Nesrin have joined, but we might not get to your questions, so we'll see how this goes. Either of you can take the floor. [01:01:22] I have no questions. Whenever I'm on, I just always end up hearing something I needed to hear. So thanks, everyone. [01:01:29] I hope you heard something today. I think even I don't have any questions right now. So that leaves it open for Depeche or Nesrin. If you guys want to just unmute yourselves and go for it. Depeche, how can we help you? [01:01:46] It's going well, so I don't have a question as such. I just wanted to get some perspective around executing a data science project that is intended to support the business in doing something. If anyone has any experience they could share: do those insights actually translate at the implementation level? Say there's a novel perspective that comes out of an analysis. From that point onwards, how does that insight actually go through the different phases? Does it make it into the decision-making space within the organization? If someone has some visibility into a project where they delivered an insight and actually witnessed the impact of the change, that would be great to hear. [01:02:36] So, yeah, do you want to take this one? [01:02:38] Yeah, I think that's a good question. At least in my experience, when I've found something interesting in a data set, I've been able to communicate that upwards, so to my manager, then their manager, and then to stakeholders who are making decisions. A lot of times what we've found doesn't actually change the course or direction of our organization, but sometimes it does.
And in those cases, the stakeholder typically asks for a bit more detail, at a more granular level. For example, if we're trying to predict something like app downloads, they might want to see it on a weekly or a daily basis. So the first step, at least in my experience, is that they want to drill down even further. The next step is really trying to decide what changes we can make organizationally to impact this, so that we get more app downloads. Do we change our app icon, or do we change our description? Do we use different screenshots? Those are the things the other teams need to be able to talk through, and then we work across different departments to implement them, asking for things from design. And then finally, having made those changes, I'm typically asked to track how significant they are, or whether they matter at all. So from my experience, that's how the process has worked. I hope that kind of answers your question. Mark? [01:04:18] Anything you'd add? [01:04:19] Yeah, this is coming from a startup perspective, where we're trying to build our sales pipeline heavily. Go talk to sales, because they'll give you key insights into what analytics would help them significantly. For example, we have our own OKRs for the data science team to provide twenty sales factoids from our data, to enable sales to sell harder. Specifically, I'll go talk to sales and be like, hey, you've had a few meetings with customers, where are the hang-ups? Maybe the customer is saying, hey, how do you know your product is working? How do you know people are engaging in this way? So it's more about question building: where are some key insights in the data? It's not that the data isn't there, but many times they have so much data they don't know what to ask. So I use that as a barometer of what the high-priority questions to answer are. Instead of saying, I have this data, what's the value this data can give, I kind of flip it: what's the value that my customer, which is the sales team, needs, and how can I deliver that value with the data assets I have? And sometimes even saying, no, that's not happening, is very helpful, because then the sales team can redirect to a better fit. So go talk to sales, one hundred percent. [01:05:48] Yeah. If you get an opportunity, pick up the book To Sell Is Human by Daniel Pink; that book is amazing. It helps you really realize how important sales is. And Measure What Matters, on objectives and key results, is another book I have and a great recommendation. So hopefully that answered your question. [01:06:10] Yes, I just had one follow-up question that's floating around. I haven't been through the application phase of applying to data science roles so far; this is a new domain I have just started stepping into from an analytical perspective. So can someone give some perspective on how much emphasis recruiters in the data science arena place on the impact of your project versus the process? Say I did a data project on my own, like a pet project, but it didn't have any impact. Does that mean I'm at a disadvantage because I can't show the results that my project delivered? [01:06:57] I don't think so.
I mean, as long as you learn something through the process, right? We all have projects that don't get implemented. I've done a ton of stuff that just didn't get deployed or implemented for whatever reason and ended up not delivering value. But we learned something in the process. So I don't think that necessarily puts you at a disadvantage. What do you think? [01:07:20] I tend to agree with you. I think what we often see is that people who are more experienced have maybe 80 percent of projects that don't get implemented, and the 20 percent is what you see on the resume. So it looks like every single piece of work got implemented and actually moved the needle. But when we're being realistic, you know that's not always the case; there's a lot of data work that doesn't have a business impact. So I don't think a recruiter has enough insight to really have that be a disadvantage for you. Hopefully that was insightful. Mark, anything to add? [01:08:01] I'll come at this from the other end. Last August, as a junior, I was laid off and went through the whole applying-to-data-science process again. I actually really enjoyed it; it was really fun and I learned a lot. I think the biggest thing is, if you only rely on your resume, you're going to lose many times. If anything, I took a different approach, and this is coming from a very US-centric angle, so I don't know how applicable it is to other arenas for other people listening. The best way for me to get around that is to go to the hiring manager or the recruiter and specifically ask: what are your pain points, what are you trying to solve with this data? Because maybe my project didn't get implemented, but I have the skill set to solve their problem. So the emphasis isn't then on whether my project worked or not; it's more on how my project built the skill sets that enable me to solve their problem and drive value for them. [01:08:58] And to add to that, if you don't have work experience doing data science projects at work, I think having personal pet projects can make up for that. As long as you're doing a good project that is well executed, well thought out, and high quality, it can definitely serve as a virtual internship of sorts. You don't necessarily need to show a huge impact in terms of dollars and cents from those personal projects, but you can demonstrate that you were able to go from data to decisions in a principled fashion with well-written code. So hopefully that was helpful. Any other last-minute questions from Nesrin? No? No problem. All right, cool. So I'll drop a link in the chat right here. This is a link to register for the DSGO conference, which Odelia and I will be a part of. I'll be hosting a panel discussion along with a couple of other really, really awesome people. Go ahead and register for that. I believe we are on deck for April 11th, early in the morning, and then immediately after that we'll be having our office hours session, I believe, so it'll be a fun day. Definitely go register for that; it'll be included in the show notes for everybody listening as well. Also, don't forget to vote for the Data Community Content Creator Awards. Guys, we need your help making this a big event; we can't do it without you. It'll be awesome, a lot of fun. Anything else to add, Odelia, before we call it a day?
[01:10:32] No, I don't think so. I think we covered a lot, a good range of questions today. [01:10:39] Yeah, I really, really enjoyed today's questions, guys. So keep an eye out for this episode; it will be released, I believe, on Thursday. We'll clean everything up, get it transcribed, and get it out there for you. It'll be available on the podcast and on YouTube. Thank you guys for hanging out. Take care, have a good rest of your weekend. Remember, you've got one life on this planet, so why not try to do something big? Cheers, everybody.