LaunchPod - Robert Henkhaus === [00:00:00] Jeff: Alright, Robert, welcome to the show, man. Thanks for joining us. Robert: Hey, thanks for having me. Jeff: You've got an interesting background here, right? For a start, you've been in the oil and gas industry since high school. You're VP of Product at Enverus now, but you've spent time at everything from giant companies like ConocoPhillips to more innovative companies like Petro.ai. And all of that came after a run in the Army as a sniper. I'm certain you have some slightly non-traditional learnings to bring to product from that. I guess let's just kick it off: do you wanna give a little insight into Enverus and what it is y'all do, and also your background, how you landed where you are now? What's the 90-second TL;DR on Robert Henkhaus here? Robert: Yeah, sure. I grew up in Midland, Texas, which, if you don't know, is basically the center of the universe for oil and gas in North America. I went to school for my bachelor's twice, the first time for architecture. And I'll admit it, I did not value the education that I was there to get. And so I had it in me enough [00:01:00] to say, you know what? I need to take a break a little bit. And so I went to the Army, became a sniper. I was a Boy Scout back in the day, and in the Army I literally had a map taped to my leg for like six years out of the eight. And I just really enjoyed geography, and geoscience as an adjacent subject. And so that's what I ended up studying. Right outta school, I went to ConocoPhillips as a GIS analyst and developer, and I really enjoyed that. I ended up in a space where I was developing applications as my own dev team. And I really enjoyed talking to what were my customers at the time, which were people at ConocoPhillips who just had urgent needs for these things. I actually really enjoyed that. So much so that I said, you know what?
I need to do software full-time. And so I moved on from ConocoPhillips. I found a small startup that was basically pre-revenue, called Petro.ai, that did some really innovative data science stuff in the oil and gas space, for about two years. And I did find that as we hired people, while I can code, I'm not a [00:02:00] CS major. What I did find is I can talk to people, and I really enjoy developing solutions for people's pervasive, compelling, urgent problems. That's product management to a T, in my opinion. So that's where I cut my teeth as a product manager. Eventually I graduated from Petro.ai into a company at the time called Drilling Info, which then became Enverus. And Enverus is the most trusted energy-dedicated SaaS company on the globe. A high-level summary of what we do: we unify proprietary data, analytics, software, and research into a single decision platform built for energy professionals. It's not just oil and gas, but also solar and wind and renewables and power, and so transmission and all of those types of things. It's a really interesting company to be at, and I'm proud to represent Enverus daily. It's really rewarding. Jeff: I mean, basically it sounds like you give teams in multiple energy sectors the data to [00:03:00] help them deploy millions, tens of millions, hundreds of millions of dollars in capital, in very important decisions, right? Robert: Absolutely. This could be: hey, I'm a wind farm developer, and I'm running a siting workflow in the application, trying to determine the best places near areas where I can take the power that I've captured and put it into the grid. And I'm deploying tens of millions of dollars doing that. Or: hey, I'm doing some science because I'm drilling a bunch of wells, and I'm spacing 'em this close and seeing this amount of output.
But if I increase that spacing in such a way, and complete the wells in a different way, I may see a different output, maybe a more capital-efficient output. Jeff: So I feel like I might be an expert in this field now, 'cause I've been watching Landman. Robert: Oh, there you go. Yeah. Jeff: Right? I think that's in Midland, Texas, right? Robert: It is. Jeff: Yeah. So I know this area. But all jokes aside, that's the level of expertise: you can't just walk into those decisions; you need to actually know what you're talking about. And I think this is an interesting thing, because it's not just getting it right. You can't just come in with a magic answer; you kind of need to be able to persuade people that you have the right answer and you have evidence. People are not just gonna willy-nilly throw tens of millions of dollars here and there just 'cause some black-box SaaS solution says to. Do you focus on oil and gas, or do you [00:04:00] focus on a specific area? What's the area that your remit is in? Robert: I support an application called Prism, which is one of the premier applications at Enverus. It is mostly oil and gas focused, although over the past few years we've been getting into a lot of the power and renewable workflows in the space. So it's a little bit of both. One of the really interesting things that AI is very good at, that we've actually implemented (it's one of our more important models right now), is having AI explain the past, right? So all of these leases that an oil and gas operator might take, or that a renewable developer needs to take, all of these leases and land descriptions from the 1800s or before in some cases, and it's all in cursive, hard to read, and they all have like weird directions on them and stuff.
AI is very well positioned to take those, extract that data, and then put it into a more modern format that we can actually start [00:05:00] making decisions against. Jeff: And so that's where this product Prism sits, right? It's able to give the insight of where and how you should deploy capital, based on, I assume, some of that data, and also other data you can ingest, around ground scans and subterranean mapping and stuff, right? The thing I thought was the coolest goes back to: no one's just gonna trust you when you say, oh, the software said do this. But for so many things, that's what AI does. It just goes, oh, here's the answer. So how did you guys get around this? 'Cause I assume you can't just go to some investor and be like, yeah, I need 40 million, and the AI magic box over there says it's gonna be good. Robert: You need explainability. Let's look at one persona that we serve: a geologist. A geologist's entire job is interpretive work, right? They take subsurface data that they have access to, they interpret what's happening there, and they make assumptions based on the data that inform these million-dollar decisions, potentially. Well, guess [00:06:00] what: if you do interpretive work based on something you can't really see, you have to explain yourself in great detail. If someone agrees with you, you have to explain yourself. If someone doesn't agree with you, you also have to explain yourself and defend those decisions and those interpretations. So a geologist, their whole career is just defending themselves. Well, AI gives them the same thing. AI gives them an answer, and, well, guess what? You have to give them the plan: how did you arrive at that answer? And so explainability is huge. And it's not just a tooltip or in a sidebar.
The explainability has to be baked into the workflow. Anyone can toss a chat out there; chat's cheap. So everyone did it, and we did it too a couple years ago. We learned really quickly that people need the explainability, and again, not just in a tooltip or anything like that, but actually as part of the workflow that you're executing. Jeff: What did the path look like, getting there and then figuring out that this was gonna be the most important thing to making people take up this new way of doing things that can help them be better at this? Robert: Most B2B software looks like a 747 cockpit, in my opinion. And it's because of years of: hey, we need this, and a customer is important and needs that. And so you end up with this huge thing with a [00:07:00] bunch of buttons on it. And this was a clean opportunity for us to practice progressive disclosure, which is basically: hey, don't get rid of those things necessarily, 'cause people do need them, but hide them in a way that's intuitive for the people who don't need them. Right? Because not everybody needs the hardcore explanation and stuff. They just need the quick answer, and that's good enough for now. But we found that a large cohort does need some of that background and the explainability. So the result was, in the chat at least, a couple things. We have a team called Intelligence that writes a whole bunch of research that has broad and wide trust across our customer base. For a lot of questions, we reference the research that's been written, right? That's gold data for us to have as sources. The interesting thing is, when you pull that in and you write a summary that [00:08:00] pulls from a couple of those research reports, if you don't have the links back to them, trust is gone. Right? So we have a place where you can say, yeah, I do want all of those links.
And boom, here you go, you've got 'em all. That's one area. The other one was: when you do the deep research, you see all the analysis happening and things like that, but it's collapsed in the chat. We provide the plan that the AI has taken to get to the answer. You can copy it out and look at it. You can actually look at all the SQL statements that are being made on the backend. And so if you need that level of explainability, well, boom, here you go. Another way we've done that is we actually link into the dashboarding area, 'cause the chat is related to the dashboarding area in the actual application. So you can say, yeah, make a dashboard out of this. You click a button, it opens up the dashboard, and now here's all your analytics based on your question. The issue with that, though, is that before we had this interaction, we didn't have to think about how AI was going to change the dashboard underneath you. Right? And so we've adopted this AI-first user experience. It's kind of a philosophy, which is: if the AI is making those [00:09:00] changes for you, you need to present those changes to the user when they open the dashboard. So the result for us was, instead of having our filter panel and a couple other key drivers for the whole application hidden, we actually do need those out in front of your face. And so when you hit the application, you see exactly what AI's done for you. Jeff: And those can be changes to... I guess, what are people tracking there, or what do those changes look like? Robert: Yeah, so it could be: hey, instead of looking at the entirety of North America, I just wanna look at Texas, and inside of Texas I only wanna look at a certain vintage of development, and inside that vintage of development I only wanna look at this operator and that operator, 'cause I'm doing a comparison between the two.
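That "present the changes to the user" idea can be sketched very simply: diff the dashboard's filter state before and after the AI acted, and surface the diff when the dashboard opens. The dict-of-filters shape and the wording below are invented for illustration; they are not Prism's actual data model.

```python
def diff_filters(before: dict, after: dict) -> list[str]:
    """Human-readable summary of what the AI changed to a dashboard's
    filters, shown to the user instead of silently mutating state."""
    changes = []
    for key in sorted(before.keys() | after.keys()):
        old, new = before.get(key), after.get(key)
        if old == new:
            continue  # untouched filter: nothing to surface
        if old is None:
            changes.append(f"added filter {key} = {new}")
        elif new is None:
            changes.append(f"removed filter {key} (was {old})")
        else:
            changes.append(f"changed {key}: {old} -> {new}")
    return changes
```

With Robert's example, going from a continent-wide view to Texas plus a vintage and two operators would yield one "changed" entry and two "added" entries, which is exactly the list a user needs to avoid the "dropped into it, quite jarring" experience.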
And so AI can apply all of that and zoom into that geography for you. But if you're just dropped into it and you don't know what's been done, it's quite confusing and pretty jarring. Jeff: I think historically, one thing across product that we all looked at [00:10:00] was that there were multiple ways to gauge engagement: are people using it well, and things like that. And one of those may have been, right, they go and edit analytics boards to drill into the question they're asking, or they maybe go and read our research. It obviously makes sense to have the UX change with how you're presenting value, but I guess, from your end on the product side and your team's side, how are you showing your managers and your leaders, hey, this is working and we're getting the behavior we want? What does that look like on your end? How have you had to change on that? Robert: Yeah, that's a great question. One thing that we've implemented over the past, I don't know, two and a half, maybe three years, is a thing called product engagement score. So we can actually measure the specific value moments that a user crosses in their session. We can use those to say, okay, in this cohort, success looks like this, or the rate of success looks like this. A lot of that stuff [00:11:00] is about the ultimate outcome, right? It's not, did I create a filter, or did I use AI to make a dashboard, or anything like that. It's more like, can we infer that you made an insight on that thing, because you saved it? You know, if I go through an AI workflow and I open up a dashboard and then I hit X, well, was that successful? Well, maybe for AI it was. Jeff: It did something successfully. Robert: It did, yeah. The thing worked, right?
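The product engagement score Robert describes could be caricatured as a weighted sum over value moments, where funnel steps count a little and inferred-insight moments like saves and shares dominate. The event names and weights here are invented for illustration; the real PES and its value moments are Enverus's own.

```python
# Hypothetical weights: funnel steps score low, inferred insights score high.
VALUE_MOMENT_WEIGHTS = {
    "opened_dashboard": 1,   # the thing merely worked
    "edited_filter":    2,   # active exploration
    "saved_artifact":   10,  # inferred insight: worth keeping
    "shared_artifact":  15,  # inferred insight: worth putting your name on
}

def engagement_score(session_events: list[str]) -> int:
    """Score a session by the value moments it crossed. Unknown events
    contribute nothing rather than raising, since telemetry evolves."""
    return sum(VALUE_MOMENT_WEIGHTS.get(e, 0) for e in session_events)

# A session where the AI "did something successfully" but the user bailed:
ai_only = engagement_score(["opened_dashboard"])
# A session where the user got value and shared the result:
real_win = engagement_score(["opened_dashboard", "edited_filter", "shared_artifact"])
```

The asymmetry in the weights is the whole point: opening an AI-generated dashboard and immediately closing it barely registers, which is the "maybe for AI it was successful" distinction in the conversation.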
But did it work where the user actually got the value out of it? So we hang our hats on things where we can really infer that the user got the value that we think they got: saving it or sharing it. Hey, if I save or share something, Jeff, and I share it with you, I probably did something worth sharing, right? And so that's how we're starting to measure success here. And when we talk about adding AI to the mix, all we're doing is accelerating your time to success, right? It's not AI for AI's sake, it's AI for the user's sake, so they can get to success faster. Jeff: So [00:12:00] were you guys already measuring, like, a save as a success point, or a share as a success point, before? I feel like you guys were more prescient on that than many. Robert: Yeah. Like, if I see a post on LinkedIn and I repost it, it probably means something pretty deep to me to want to do that, you know? And that might be the kind of north star for my product team, right? All the things leading up to that, like scrolling X amount of scrolls or whatever (I'm not a BSC person), all of those things are the funnel to that, right? And so we see that if AI can open up a dashboard and make a dashboard for you really, really fast, that just widens the funnel for us to get you to that share, right? Or get you to the point where you're extracting the data that you actually need, or saving that thing so you can come back later. And so we're looking at how we can change the application in a way that drives user behavior in a way that hits our outcome, and those are our [00:13:00] outcomes. Jeff: What we do on our end is obviously session replay, enabled by AI now, and the biggest thing is we have agents that watch every session and tell you what's important. And it's interesting, 'cause one of the things we found back during... you know, we have a lot of
proprietary data that we've produced on our end to kind of make this better, right? And one of the really interesting, to me at least, early takeaways was that humans act like humans across a lot of different verticals. And this is a great instance of that again. With LinkedIn, if you're sharing something, it's probably the same reason it's important if you're sharing something from Enverus: you're putting your name on it, right? You're telling other people, I trust this thing, and you're using it to say, this is a valuable data point. It says Robert Henkhaus, or Jeff Wharton, and so you're putting your name on it. And that's always been the thing, right? That's why referrals back in the day were so important for, like, Robert: There you go. Yeah. Jeff: any kind of e-commerce store. It's really interesting to see how that all comes [00:14:00] together, where human behavior, at some level, matters whether you're, you know, the most B2B I think you can almost possibly get, or B2C like LinkedIn. I want to circle back real quick, 'cause you mentioned Akash really quickly, and I think this is a really interesting model that you guys have over there, where Akash basically runs your AI center of excellence. And so you have a bunch of these different product teams, and then you have Akash and his team kind of driving AI innovation. I guess, what bred that model, and was it always like that? Or did you guys come to it through other means? Robert: It wasn't always like that, but what we've seen is that this model, this COE model, lets the experts stay experts, and at the same time doesn't force every product manager to be an AI researcher. And if you do it right, which I think we're doing, it allows people like Akash and his team to develop some really out-there stuff, bleeding-edge type of things for our industry.
Then I consume that, [00:15:00] right? You know, hey, this is some proven tech, let's go use it somewhere. And we go, okay, great. And that just helps us quite a bit. So I think we've got a really good thing going here that lets us act really small and agile and startup-y, but at the same time reap those rewards and get that return on investment using a model like this. Jeff: If you look at your team, then, what does that engagement with AI look like, at an application-to-product standpoint? Is anyone on your team looking at fundamental research too, or is it really centered around the center-of-excellence model there? Robert: I'm a proponent of AI and of getting deep with it. I think it's gonna be part of our future; we're already seeing that. Because we have the COE model, I think we allow that team to explore a little further. For us, we absolutely baked some of that into the strategy that we've put together. Strategy, to me, is basically: in [00:16:00] what way do you want to be successful? Well, how we wanna be successful is with AI. We want to add those types of smarts into the application. And so the research isn't necessarily how DeepSeek is outperforming some other model. It's more like: hey, from a UI/UX standpoint, how might we change the UI so that AI feels more natural in the application? So it's more of that type of design-ish, if that's a word, research. Jeff: I've talked to teams across the board who have all sorts of different models. Everything from: only that one team thinks about it, even, until they design something and give it to us, to: everyone does it.
There's a couple people where really their entire job is thinking about innovation here, but then across the board, everyone's thinking about it at lesser or greater levels depending on the project they're on and what they're doing. Because to really, I guess, best leverage what a team like Akash's is doing, you probably need some [00:17:00] context on it outside of just what they tell you. But like you said, you don't need to be measuring DeepSeek against ChatGPT 5.2 to see which is... Robert: We have people thinking about that, right? And I think if we're gonna be leaders in the space, we need it. But it's a happy medium. When you look at the whole organization, not everyone needs to be doing that. My job and my team's job is to take some of that awesome intelligence and turn it into repeatable product. And so we've got our focus areas. Jeff: Right. Like, at heart (this is my soapbox on product), we're not here to build software, right? None of us are actually here to build software. That's not the goal. So far, the best way we have found to deliver a lot of these solutions is via software, because it enables a lot of different things: compute, and insights, and understanding. And AI is another tool in that. Robert: Absolutely. And I think that can be extended. This might be a bit of a hot take, but there's analytics everywhere, right? The bar charts and pie charts, [00:18:00] oh my gosh, maps, all of these types of things that help. Those things exist because they help you derive insights. That's why there's a bar chart out there, and a time-series graph, and all of those things. Now, if AI can skip that step and just give you the insight, and somehow allow you to trust that thing, then guess what? We don't need the analytics anymore. Dashboarding might be going away.
Now, that's maybe scary for some people, but that could be a reality we see soon. And of course, the trust piece is the big caveat there, because that's why you still have analytics now. So the role of analytics, I think, and of dashboarding, changes a bit to a supporting character, where AI is giving you the insight, but then saying, hey, look at this dashboard, this is how we arrived at the answer. Instead of: you don't have any of the AI, and you have to come up with that stuff on your own. Kind of an interesting switch, or a change, that [00:19:00] I see coming. Jeff: It's kind of the same way, right? If you think about it, charts made it so you didn't have to raw-read. You have a graph so you don't have to raw-read a spreadsheet of 10,000 rows over time. Now you can just look at a picture and go, oh, it's going up. That's good. Robert: Yeah. We're gonna go through that same cycle again, but with AI being in there. Jeff: Right. No, I don't think it's a hot take. I think it's a pretty prescient, well-put take. I like it. I'm here for it. Alright, now we can maybe move on to something where there are hot takes, or more hot takes. Because all this is fine and good, but you wanna innovate, you wanna go quickly, and Enverus was recently acquired by Blackstone, right? A PE company. Robert: That's right. Yeah, a PE company. Jeff: Yeah, definitely one of the top ones, at the very least. Right. And there's a lot of people who will talk negatively about PE. I've had good experiences with the right companies, and, like anything else, some do it well, some are less good. But I think the one thing that's constant, regardless, [00:20:00] is anytime you have uncertainty, it can breed slowing of work, slowing of innovation.
And this is where people at the top of teams, like in your role, really matter on a day-to-day basis: how do you keep things going? How do you manage through that kind of change, keep confidence going, keep work going? Because the same way you don't need an entire team thinking about AI 24/7, really, most of these people don't need to think about the machinations of this event 24/7. It's not what's important. There's not much you can do about a lot of it on a second-to-second basis. How do you keep people focused? How do you drive that energy during... it might be this, it might be other times of change as well, but how do you manage a team through that? Robert: It's a tough place to be, just to be candid about it. I've been through this a couple times now, and when these changeovers happen, I think teams, because of the uncertainty in the near-term future, like what's gonna happen in six months, [00:21:00] because of that, things start to shut down. Decisions start to shut down. And it's not because the teams are lazy or anything like that. It's not that at all. It's just 'cause we don't know. We don't know if this decision is gonna be right for us in six months. And so that makes it really tough to operate, especially if you've had a strategy for a while and now there's some strategic ambiguity. What I've found to be successful is, instead of thinking about what's gonna happen in the next year, or two years, or even just six months, reduce the horizon. I think the short-horizon framing preserves a lot of the trust and preserves a lot of the flywheel motion that you've already had going. In practice, what that means is you focus on the known knowns, like: what will still be true 60 days from now?
Well, probably there's a bunch of customer truths that we've uncovered that will still be true. The constraints that we're operating under are probably the [00:22:00] same; maybe they've changed a little bit. The theories of how we're gonna grow or retain or whatever, those probably all stay the same. And so, given all of that, what's the strategy for, literally, just the next 60-day window? And I think that if you frame that in the right way to a team, it inspires the flywheel to keep moving, right? And at the same time it gives your leadership, or my leadership, something to look at and go, okay, this is how the teams are thinking about the world today. So you as the leader, or me in this case, become the interface: the teams still practicing in the right ways and focused on the 60-day goal and what those metrics are, and at the same time using that same tool to communicate upwards and say, hey, everything's okay. This is how we're operating. This is how we think about the world. These are the decisions we're making. Here's where we need some help. I've done that a couple times now, where you establish this bit [00:23:00] of interface, and I've seen it once in the past go really, really well. And so now we're going through that again right now. Jeff: There's a couple reasons why you can get acquired, and they're very diverse. It can be everything from: the company was mismanaged, and they're coming in because they think there are high levels of efficiency to be gained. In that case, probably fewer of those things you cited are going to stay the same. And in the other case, the company's doing well, and they see an opportunity to move even more, or faster, into a bigger market. In which case they're probably acquiring partially because of what you were already doing, in which case, lean in and double down.
And, you know, looking at Enverus's progress, it seems, from a little outside knowledge on my part, like you guys weren't doing badly. So it's probably not the first one there. And in that case, the worst thing you could do is actually slow down too much because of that uncertainty. So you wanna figure those things out. And hopefully you're getting some intel on at least some of, like you said, the shorter-term priorities, certain context, certain constants. Can you give us an [00:24:00] example of some constants that you think can be publicly stated? Robert: You know, we've been talking about it basically this whole episode: the AI piece of things, and the integration of AI into the user experience. That's not gonna go away. Prism's not gonna go away, which is the application we've been talking about. And there's a high degree of alignment, I think, between what Enverus sees as the future and what Blackstone sees as the future for us. So, to your point, it's a fuel-to-the-fire kind of situation. Blackstone didn't buy us because they wanna change the model. They like what they see, which is why they bought us, you know? And so, yeah, I think AI being baked into the UX in a really intuitive way, that does not change. There are some other things I can mention around enhanced collaboration inside of the application. Those are signals that we've heard for a while, and absolutely things that we just need to execute on. So that doesn't change either, certainly not in the next 60 days. [00:25:00] And so some of those things are on the short-term roadmap and stuff that we're actively exploring. Jeff: The AI, I'll be honest, was kind of the one I assumed was, like, no matter what you wanna do, that's not gonna change.
So that's almost an act of God at this point. Robert: And it's not even just, hey, AI for AI's sake. The feedback we've gotten is incredible. People are making really big decisions using this as a tool, and it's like, let's continue on that successful path. So that's something that definitely is not gonna change for us right now. Jeff: It seems like the acquisition was triggered because of success on Enverus's side. So with that in mind... because I've been through this. At a prior company, we were acquired (well, we were part of a larger group, but they basically acquired the whole thing) because one side was very cash-cow-y and one was seen as a growth-industry kind of piece. It was interesting, because what I saw was: I knew we wanted to keep going down the path we were on, like, we wanted to double down, 'cause what we were doing was right, and that's why we were here now. But what you saw at the slightly more junior level was maybe [00:26:00] people, not even worrying about their career, but feeling like they had to ask at every step, every decision. And that alone can create a lot of friction. And that can also change culture and cause longer-term problems if you don't nip it. So how have you been addressing that? Not just, I guess, people's ability to feel like they can move quickly, but that they don't have to ask at every step, and just the trust level of: what we talked about yesterday is going to hold today and tomorrow as well, kind of thing. Robert: I think the shorter time horizon helps aid a lot of that. It's like, oh, we're just making decisions for the next 60 days here, and you can reassess that every 30 days if you need to. And it happens. Absolutely, it happens. If you define some clear lines in that strategy, you know: hey, here's the strategic intent, and here's how we're gonna measure that outcome. Right?
And anything that you need to do in this space is good. This is, like, fair game. You can make decisions here. You literally have to have conversations like that sometimes to make sure that people can still [00:27:00] make decisions. One interesting thing I've seen, not recently here, but it can happen at any time, not just when there's an acquisition coming: when there's some type of strategic ambiguity or uncertainty, or we need to escalate a question to the executive team or something like that, that can often turn into a habit. Like: hey, it looks like the team needs help making these types of decisions, so we're gonna lean in and help. And it's not necessarily coming from a bad place. It's: I want to make sure the team is making the right decisions. Well, the executive leaning in can signal a bit of a trust misalignment there, right? A lot of times when this stuff starts, that's the first thing you see: decisions that should be made at the product-manager level are being escalated [00:28:00] up a level or two, and that turns into a bit of a slowdown. That is your first tell that some of the empowerment is eroding a little bit. Jeff: It seems like a very interesting time over there, between the rise of being able to use AI to fuel these multimillion, multi-ten-million-dollar insights, and at the same time having a company as big and ubiquitous as Blackstone coming in and having this kind of change going. There's a lot to be managing over there, both up and down. And unfortunately, with that in mind, I can't keep you here all day. I really appreciate you coming on, Robert. This was really insightful; I appreciate the time.
If people wanna reach out and learn a little bit more about how you guys are looking at this kind of white-box, you know, clear-box strategy of AI, is LinkedIn a good place to reach you, or is there something else? Robert: LinkedIn is perfect. And Jeff, thanks so much for the time. This has been incredible. And I really enjoy the podcast. I was a listener, or I am a listener, and so, yeah, I appreciate the opportunity to jump on. Jeff: Awesome. Glad to hear that. Thank you so much for coming on. Appreciate it. Have a good rest of your day. Best of luck continuing with Prism, and yeah, hopefully talk to you again soon, man. Robert: All right. Thanks, Jeff. Jeff: Cheers. Bye. Robert: Bye-bye.