James: 00:09 Frank, you and I have not spoken to each other in weeks, but no one would know, because that's how podcasts work. Well, I still had you in my ears, because I listened to your great little episode with Mads about C#. What are we up to — eight? Nine? It was a good episode. I wasn't on it, so it was a terrible episode. But aside from that, I actually rather enjoyed listening to it, and there weren't too many times where I wished I could have interjected or screamed at you to ask a certain question. Nice. Frank: 00:45 I tried to throw in some, you know, C# and machine learning and AI topics that are just for you and I, and I think Mads obliged, and that was a really fun, fun interview, and I hope people go back and listen to that one. Because we don't do interviews that often, so it's a very rare occurrence. So I'm glad that you liked it. James: 01:02 Yeah. Next time I'm going to ask him about events. I've decided I want to ask if we're ever going to get love for C# events in C# 9. I want it to be called "love for C# events," which fixes events. Nice. All right, well, if you haven't gone and listened to that, definitely go and listen to it. But I was gone for a few weeks and I left a nice and easy layup for us this week, because it's lightning talk week. That's right, James. Instead of picking one topic, we have to pick six. So something that takes us a half hour now takes us 18 hours. I think I did the math right there. That's pretty true. Pretty true. In fact, yes. Frank: 01:44 Even though it seems as though six topics, five minutes each, wouldn't be that long and we would just crush it, we've already spent over 30 minutes just debating which topics to start with. But we went out to Twitter and onto our Discord, and thanks to everyone for submitting topics, and to the people that emailed us too. We always have so many topics in our topics list that we want to cover, and often there are things that you, our listeners, want us to do, but sometimes we're like, I don't know if we can fit a whole 30 or 40 minutes in it — but five minutes feels just right. James: 02:17 That's right, Frank. And usually we hit it. We've been getting pretty bad lately. I'm curious to see if we're going to keep extending them out or if we're going to be strict with the timer. Who knows? We don't know until we do it. Frank: 02:32 We'll find out. So let's kick it off with a little email on DevOps — follow-up from our DevOps, DevOps, DevOps episode. Glenn R. wrote in and said, I want to comment on your DevOps podcast. He says, one, I can see no reason why a Xamarin.Forms developer would not use App Center instead of Azure DevOps for all operations, and use Azure DevOps only for the repo that App Center builds from — maybe just use it for source code, for instance. He also says that for an organization, it seems like Azure DevOps is really good for web apps and similar things, but a little bit less so for mobile. He says it's really the simplicity of App Center that attracted him to it, so it kind of caught him off guard to hear me say I'm all in on Azure DevOps compared to some of my earlier episodes with Abel on App Center and both of them together. Frank: 03:30 So he says, please clarify a little bit, and just wanted to give you that feedback. So, all right, let me clarify this a little bit. I love App Center and I love App Center Build.
In fact, I use App Center for a lot of my applications that are just really simple, standalone applications that need nothing else. The big draw that I have to Azure DevOps for the build and release is that I have a lot of fine-grained control over all of the different steps and commands that go into building my app without having to write, like, bash scripts. And I really like having one central location where I can build my mobile apps and my backend apps and deploy my services all together. So take Hanselman.Forms, where I could super easily build that in App Center, no problem at all. I'm also now building a Blazor app. Frank: 04:27 I'm going to be building Azure Functions and my mobile apps, and I want to do those all together and orchestrate the rollout. Now I will say, though, App Center — I use the distribution, analytics, crash reporting; I use all those services that are there. So I use them in harmony together. It just really depends on what you need. And I will say, yeah, App Center is like the drop-dead easy solution, whereas Azure DevOps is a larger, more granular approach to your DevOps process. So I love both of them. I don't want to deter anyone, but it's really based on what you're building. How did I do, Frank? James: 05:05 Yeah. During all of that spiel I was writing down my thoughts, and you pretty much got to every single one of them. So that was a pretty good — I won't call it a rebuttal, but a description of why you would engage in this seemingly more complicated thing. And I think that was my hesitation: DevOps seemed more complex, if nothing else, simply in user interface. You know, App Center is pretty darn nice: click on your app name, click on builds, there's a list of your builds, click on a build, see its status, and distribute it if you want to send out that app. Pretty great. The thing is, a lot of that distribution is manual unless you turn on, you know, continuous distribution, but that just annoys everyone, so you don't want to do that. And some of my apps it can't build. James: 05:54 So I have to do stupid, crazy, custom build steps because I wrote my apps in a weird way, and so I have to deal with that kind of stuff. So I think that the best thing that I learned, though, was that the DevOps UI isn't as complex as I thought. It can be simplified down to a more reasonable level. And then I really do like the level of control you get. But to cap that all off, James, my DevOps and builds are just in complete flux now. I'm spread between Bitrise, GitHub, DevOps, and App Center — I use all four, and it's kind of crazy. And so I think I would like to settle down at some point and decrease that number. Frank: 06:43 Yeah. And I mean, I would honestly say that some of my applications are definitely still built in App Center, because there was just that simplicity at the time, and my app remained simple, and I had everything there, and I was like, I don't need anything extra beyond what this is doing. So I just sort of kept it. And I mean, for a long, long time I had everything in Bitrise too, right? And I've been interested in GitHub Actions, but I'm like, I don't need yet another thing. So it's really what fits your application the best. I think, Frank, it's okay if you have them in all four of them, because you're usually working on one thing at a time, so you could just go to that distribution channel. But it would be nicer if they were all together.
But again, use what you need — I think that's my summary. James: 07:30 Yeah. And it's just for UI consistency, because you're just like, how do I do that thing in this one? Oh yeah, what's that step called in this one? Oh yeah, what parameter do I always change in that one? It's that kind of stuff. And so I think I'm trying to settle down on GitHub Actions, at least for all my open source projects, and then it'll just be my apps Frank: 07:51 that are kind of spread around. Yeah. I think for libraries, or if you create a lot of something like me — I have tons of libraries, you know — I have everything in one Azure DevOps project, which has a bunch of pipelines for everything. And I'm like, I standardized; this is it, right? And maybe GitHub Actions would be the next bet for the next thing. I'll be interested to see how that goes. I have not played around with it. But we're at our five minutes, so let's follow up with something from Damien in our Discord chat. He says: Frank talked a little bit about his experience submitting the Metal Performance Shaders neural network updates for Xcode 11 to the Xamarin iOS/Mac repo. He said, what would you do differently? What are your lessons learned and suggestions for other people submitting PRs to this type of large project? James: 08:44 Ooh, this was quite an experience for me, James. So way back in the day, when Xamarin was called MonoTouch, I used to contribute to it in tiny little bits all the time, because it was pretty easy to download and build and add a little something here or there. But these days it's so big, and so I was quite intimidated. But to give everyone context, this is an API that hadn't been bound yet by Xamarin — not because they weren't intending to, but just because it was pretty much at the bottom of the priority list. I think I might be the only person out there that wants this API, and even in the Apple world, I think not too many people are using this API. Either way, I wanted it. So I was crazy, James, and I volunteered. You know what they say: don't ever volunteer for something. James: 09:38 Right? Unless you've got a lot of time and you want to see it all the way through — I mean, if you're going to volunteer, you've got to see it all the way through, Frank. That's the worst part. So thankfully I didn't see it all the way through, but someone helped me see it all the way through. So let me first start by saying thank you, Rolf. Rolf has been working on Xamarin and MonoTouch ever since the beginning, and he is amazing, and he helped get this thing through, because I'm incompetent. But thank you, Rolf. Thank you, Ross. Thank you, Ralph. Okay. Pros and cons, pros and cons. It's wonderful once you get Xamarin building on your own machine, because you realize you can change any API and bind anything and edit and fix whatever you want. James: 10:27 So step one, you've got to stop yourself from changing everything, because, gosh, I wanted to. Wouldn't you like to just go in there and change something? Yeah, so tempting. Yeah. So number one, absolutely do PRs. It is a bit of a bear setting up a machine — you've got to get the right Xcodes installed — but there's a document to walk you through it, and there are very helpful error messages, and you're going to get a lot of them, and you'll get through it.
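For listeners who haven't seen one: a Xamarin.iOS binding is usually written as an attribute-decorated C# interface in an ApiDefinition file, which a generator then turns into real, callable classes. Here is a tiny, hypothetical sketch of what those "few class definitions" look like — the type name and selectors below are illustrative, loosely shaped like a Metal Performance Shaders layer, and are not the actual code from the PR:

```csharp
// Hypothetical ApiDefinition.cs fragment for a Xamarin.iOS binding project.
// The class name and selectors are made up for illustration only.
using System;
using Foundation;
using Metal;
using MetalPerformanceShaders;
using ObjCRuntime;

namespace MyMpsBindings
{
    // [BaseType] tells the binding generator which native class this wraps.
    [BaseType (typeof (NSObject), Name = "ExampleNeuralNetworkLayer")]
    interface ExampleNeuralNetworkLayer
    {
        // Maps a C# constructor onto the Objective-C initializer selector.
        [Export ("initWithDevice:")]
        IntPtr Constructor (IMTLDevice device);

        // A simple property bound to a pair of native accessors.
        [Export ("edgeMode", ArgumentSemantic.Assign)]
        nuint EdgeMode { get; set; }

        // A method with its full Objective-C selector spelled out.
        [Export ("encodeToCommandBuffer:sourceImage:destinationImage:")]
        void Encode (IMTLCommandBuffer commandBuffer, MPSImage source, MPSImage destination);
    }
}
```

The generator reads the [BaseType] and [Export] attributes and emits the real C# classes plus all the Objective-C interop plumbing — which is also why, as explained next, "a few class definitions" turns out to be only the start of the work.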
Something I didn't realize, James — which is classic Frank — is that it's a lot of work to bind an API. I'm just like, oh, I'll write out a few class definitions, hit compile, pop up this PR, it'll be approved in an evening, and we'll move on with our lives. No. This has taken months of work. That's pathetic. James: 11:17 It's not months of continuous work — it's probably like a week of continuous work — but it's taken months in real time. What else to say? Where else should I go with this? Well, I mean, I guess the one thing that I'm very fascinated with is: now that you've done it once, do you think you're more likely to do it again? Yes, absolutely. Number one, because it's still unfinished — I still haven't gotten to every API — but there are still a lot of lessons I had to learn. There are a lot of tests that Xamarin runs against things, and I'm still not clear why I was breaking some of those tests. So there's a lot of infrastructure stuff and DevOps stuff that I still have to understand. But from an API perspective, it was all pretty reasonable and easy to understand and doable, and the builds, once they got working, work just fine. I had it running on devices and everything. So I would say from a developer's experience, it's fine; from a DevOps experience — you know, getting the PR through — it's a little bit more of a nightmare. Frank: 12:29 Got it. Got it. That makes sense. It's such a thrill to build the product, change anything — and you now shipped code that everybody else will be using. Like, you did that. That happened. James: 12:44 Yeah. Yeah. Except the API that I bound is kind of terrible. So I'm working on a library that wraps it and gives it a friendlier, very much more .NET-style API. So I hope we'll talk about that on this show if I ever finish it. But this is a multi-step project, and I'm glad it's finally made it through this step and can keep moving up. Frank: 13:09 I like that. I like that. All right, well, on to the next topic, which came from Twitter, I want to say — did it come from Twitter? No, it came from our Discord as well. It's always hard to remember where everything comes from; we have so many different things coming in. So, from Mike, who simply said: the System.Threading.Channels API. And I said, I don't even know what that is, Frank. And then you're like, I know what that is — I know everything. And this was a brand-new blog post that came out from Stephen Toub — I'm going to say that's his name; Stephen, if you're listening, I'm so, so sorry. And this is something that is very similar to another topic that we talked about, which is System.IO.Pipelines, and specifically people in the discussion here Frank: 13:58 were asking, is this the same thing or not? We were very unsure. So, what is System.Threading.Channels? I'll tee it up while Frank goes into the deep dive, because after I tee it up, I'm basically lost. I'm never going to use this, whereas Frank does work where things like this actually are useful. So System.IO.Pipelines was really about producer and consumer models for handling streams of data, and this is used a lot in ASP.NET — I think, like, SignalR-ish types of things. And the idea is that it really helps with working with streams of data: being able to move those segments of bytes around between different components of your application.
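To make that concrete, here is a minimal sketch of the System.IO.Pipelines shape — one task writing bytes into a Pipe while another consumes them. It loosely follows the patterns from the official docs; the message text and buffer size are just illustrative:

```csharp
// Minimal System.IO.Pipelines sketch: a producer writes bytes directly into
// buffers rented from the Pipe, and a consumer reads them back out.
using System;
using System.Buffers;
using System.IO.Pipelines;
using System.Text;
using System.Threading.Tasks;

class PipelinesSketch
{
    static async Task Main ()
    {
        var pipe = new Pipe ();
        await Task.WhenAll (ProduceAsync (pipe.Writer), ConsumeAsync (pipe.Reader));
    }

    static async Task ProduceAsync (PipeWriter writer)
    {
        for (var i = 0; i < 3; i++)
        {
            // Ask the pipe for a buffer, write into it, then tell it how much we used.
            Memory<byte> memory = writer.GetMemory (64);
            int written = Encoding.UTF8.GetBytes ($"message {i}\n".AsSpan (), memory.Span);
            writer.Advance (written);
            await writer.FlushAsync ();   // makes the bytes visible to the reader
        }
        writer.Complete ();               // no more data coming
    }

    static async Task ConsumeAsync (PipeReader reader)
    {
        while (true)
        {
            ReadResult result = await reader.ReadAsync ();
            ReadOnlySequence<byte> buffer = result.Buffer;

            // ToArray copies here just for printing; real consumers parse the
            // ReadOnlySequence in place to stay zero-copy.
            Console.Write (Encoding.UTF8.GetString (buffer.ToArray ()));

            // Report how much we consumed so the pipe can recycle that memory.
            reader.AdvanceTo (buffer.End);
            if (result.IsCompleted)
                break;
        }
        reader.Complete ();
    }
}
```

The producer never allocates its own byte arrays — it writes into memory the pipe hands it — which is where the zero-copy performance angle discussed next comes from.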
So that's the consumer/producer model that's there, whereas System.Threading.Channels is more about handing off arbitrary data between them — that's, like, the high level, I guess. And even reading through it, it's really just about reading and writing data. Whenever I see "read" and "write," I'm like, oh, this must have something to do with the file system or something like that — that's always my cue. But maybe you can go a little bit more in depth about this producer/consumer problem that people run into, Frank [inaudible]. James: 15:17 Yeah. Your build system is a producer/consumer problem: something finds something, produces something, queues something up, and something else has to process it. Where do I go with this? I like how you were talking about Pipelines before — before we even talk about producer/consumer, the distinction I draw is that Pipelines is about performance: making sure that there's basically zero overhead when I'm doing something, you know, IO-heavy; I'm trying to do zero-copy kinds of operations. Whereas this new API, System.Threading.Channels, is still about the producer/consumer thing that Pipelines is doing, but it's about making it task-friendly and programming-friendly. So it's not about performance — though, of course, they're trying to make it performant — it's more about, let's just give it a nice API. So I think that's the distinction you can draw between the two libraries. James: 16:16 But yeah, fundamentally they're both about this producer/consumer idea. You could think of it as queuing: someone is enqueuing something and someone else is dequeuing it. This happens a lot if you build pipelines — like, everything goes back to machine learning, right, James? So you always have to do some preprocessing of your data: well, I run it through a converter, then another converter, then another converter, then another converter. Each one of those is, you know, blocked, waiting for the others; they're all tied together in this pipeline — and sorry for using that word — and, you know, some are submitting data and some are pulling it down. So it's a very natural concept. I don't do it so much in apps, but in scripts and things that are coordinating many different activities, it comes up all the time. I think we both did a terrible job of explaining this. It's a good article, though. Everyone should go read it. Stephen Toub. It's a good article. Frank: 17:22 Yeah, pretty much what you're going to need to do here is go read it. But I will say, here's the really cool part about that: while it's part of .NET Core itself, the shared runtime, it's also delivered as a NuGet package. And I think this is something that you were hinting at that is really cool about how the team is working. James: 17:42 Yeah, it was just an off-the-cuff compliment to the .NET Core people. They often say this library is a .NET Core library, and that's very much true — it was developed by that team — but they've had this great pattern, this great history, of releasing those libraries as .NET Standard, and not even just .NET Standard 2.0; in this case they're going to .NET Standard 1.3, I think. So you can use this library on some old stuff, and I just am so happy that we're living in this world now.
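And for anyone who wants to see what that "nice API" looks like, here is a minimal sketch of the producer/consumer hand-off with System.Threading.Channels — the bounded capacity, item type, and messages are just illustrative, not taken from Stephen's post:

```csharp
// Minimal System.Threading.Channels sketch: one producer task writes items,
// one consumer task reads them, and the channel coordinates the hand-off.
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

class ChannelsSketch
{
    static async Task Main ()
    {
        // A bounded channel gives you backpressure: writers wait when it's full.
        var channel = Channel.CreateBounded<int> (capacity: 8);

        var producer = Task.Run (async () =>
        {
            for (var i = 0; i < 20; i++)
                await channel.Writer.WriteAsync (i);   // waits if the channel is full
            channel.Writer.Complete ();                // signal that no more items are coming
        });

        var consumer = Task.Run (async () =>
        {
            // ReadAllAsync keeps yielding items until the writer completes.
            await foreach (var item in channel.Reader.ReadAllAsync ())
                Console.WriteLine ($"processed {item}");
        });

        await Task.WhenAll (producer, consumer);
    }
}
```

That's roughly the whole surface you touch — WriteAsync, ReadAllAsync, Complete — which is the task-friendly, "just give it a nice API" point made above.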
Just remember, like, five or six years ago, before .NET Standard, how every library didn't work anywhere else and no one could agree on any platforms. It's just so nice to see that this new library works on pretty much everything. Yeah, it's super duper rad, even if I don't understand it at all — I will never understand it, or maybe ever use it. James: 18:36 But maybe it does make sense for you, because there are people in the comments that are like, this is perfect, this is amazing. So I need to actually read this full thing and then meet up with Stephen, apparently, and figure it out. So basically, anything that makes multi-threaded programming easier is good, and this is one of those things. It's a good primitive to have if you're trying to do super high-scale stuff. And I feel like where this will get used a lot is by library creators, because that's sort of how I saw Pipelines — more like, I will use your API that will sort of hide this stuff from me. That's sort of my assumption here, but I could be wrong. Yeah. And this library is more about safety. I mean, you can implement what this library does in just a few lines of code. James: 19:22 That's what's so neat about the article — he walks you through that. But this one's, you know, approved by engineers who spent a lot of time with tests and all that stuff, so we can just drop it in and use it and not worry about having too many bugs in it. Good stuff. So, James, now a topic I'm very excited about: GPUs. Wow, we're really doing some machine learning in this episode. I'm very excited. So I was on Amazon today — well, no, let me start over. I was complaining on Twitter earlier this week that I was paying for cloud computing, because I run Macs and you can't do any neural network training on Macs, because life is terrible — that's why I'm working on that library. So I ended up spending 100 bucks, and I was like, you know, what is the difference between spending a hundred bucks or just going and buying a super ridiculous video card? And that got me onto Amazon. Do you ever have these moments of weakness on Amazon? So about three days ago — two days ago — I was sitting at my computer. I have two desktop computers. One has a nice GPU in it, an Nvidia 980, and my other computer has the built-in Intel. Now the built-in Intel — Frank: 20:48 like, this processor is way better than the one with the GPU. I have, like, an older CPU with a great GPU, and then I have another computer that has a great CPU but just a built-in, crappy GPU. And it had me thinking, man, I want to play some games. And unfortunately, the computer with the good graphics card has a bad CPU, so I can't really play games on it, but my streaming computer, where I do my development and stuff like that and where I'd play games, doesn't have a GPU — and I haven't needed one, because I'm just doing normal mobile development without GPU stuff. So I then went down the rabbit hole of Nvidia versus AMD, versus all these: four gigs, eight gigs, this version, that version — the rabbit hole of, oh, do I get this one, or the 560, or the 570, or the 5700? And I'm like, oh, there are so many version numbers, right? And then I bought a graphics card. That's what I did. James: 21:48 Ooh, on Amazon. Has it arrived? Frank: 21:53 No, it will be coming in tomorrow, I believe. I wanted a relatively — not what you wanted — I wanted a relatively introductory, mid-level card,
good for 1080p to 1440p resolution gaming. And I landed on the RX 570, which is maybe a generation or two back, I believe. James: 22:18 Classic card, classic card. Pretty much every laptop has that chip in it. Yeah, $150. Nice. That is significantly cheaper than the one I want. Yeah. I'm starting out in a better position than you: I have a GTX 1070, and it's good. It's fine. It does stuff. It moves things. But the cloud GPUs that you rent are these amazing things called V100s, and in a real moment of weakness, I went on eBay to see how much a V100 would cost, and the cheapest one, from the dodgiest person, was four grand, and they went up to 10 grand. I was like, oh, not going to buy one of those. So I got depressed and moved on and did what I said about the cloud computing Frank: 23:13 [inaudible] but then James: 23:16 I did some research. I'm like, I know they came out with some new video cards; time has progressed. I wonder how these new video cards are performing, and whether anyone's done any benchmarks. And it turns out the people that are my cloud provider also have GPU benchmarks, so you can go see exactly how fast these video cards are at the exact task that I want: training neural networks. And it turns out the — excuse me here — RTX 2080 Ti, all those letters are important, I guess, runs at about 80% of the speed of the V100, and that's pretty awesome, because this is a gamer, consumer-grade card, not a crazy high-end card. So that's pretty awesome. How much do you think it costs? $1,000? Only $1,200. Oh my goodness, I was close. I was close. Okay, it's a ridiculous amount of money, but I also spent $100 in cloud computing in essentially one evening. James: 24:22 So I'm like, hmm, if I do that 12 times a year, the card would justify itself. Plus, James, I was recently listening to John Siracusa on ATP debating buying the Mac Pro with the Pro Display and Pro Stand. He is in for so much money that it made the RTX 2080 Ti seem not so bad. That's true. That's true. Yeah. I mean, that's where I was sitting, right? I could totally play my games with this thing, or I could, you know, figure out something. And you know, the AMD 570 — actually, that one is equivalent to somewhere in between a 1060 and a 1070 on Nvidia, so it's very close to the graphics card you have, by the way. Oh, okay. It's just AMD, so the numbers are weird, right? Because they're all funky. So the RX 570 is very similar to yours. James: 25:17 But I think the question I have for you is: you were experimenting with cloud GPU compute — do you think you used too powerful of an instance, so you got the results back too soon? Like, what if you had waited longer? And also, do you think if you buy this graphics card, you'll get the same exact results as what you paid $100 for? Yeah, let's start with that last one. I don't know, because they have crazy setups in these cloud things. Specifically, the machine I was using had four GTX 1080 Tis — so four last-generation cards — and that, by the way, is still a very expensive card. I was looking it up; it's still like $700 or $800. It's very expensive. Yeah. Actually, that's kind of what put me over to the 2080 Ti, because, like, in for a penny, in for a pound — if you're up to $800 already, just pay the extra and get the newest. Go all in. James: 26:19 Go all in. Go all the way.
Um, yeah. Yeah, yeah. I don't know the answer to that. I forgot the previous question now, sorry. Oh — if you think that you were using too powerful a machine: did you actually need that powerful of a machine? Like, was your hundred-dollar spend actually overkill? Oh God, no. And once you get started on neural networks, you just keep making them bigger — bigger and bigger, and they do more and more, because it's functional. So you just call the function again, and then again, and then again. So, no. So this network I was training on my iMac Pro here using its CPU, and it took a week to train. I trained it on the iMac's GPU, and it took five days to train. I trained it on the cloud thing, and it took half a day to train — and you pay by time. James: 27:21 So the faster the machine, the better. This is pure power, pure big-iron kind of stuff. The faster the better. So are you happy? So we're extending this one into two lightning talks, I think. Sorry. So do you think that — I mean, in this regard, time is money, and I know that some people on Twitter were going back and forth with you on there, right? You were tweeting about your spend, how it was going up, and X, Y, Z. But I mean, it sounds like it saved you six and a half days. Yeah, for $100, you know, which seems like a small investment. If you break that up over, like, how many coffees is that a day? That's like two coffees a day, right? Sure. James: 28:11 Yeah. Okay. Or it costs $100, however you want to look at it. Okay, is your time worth anything? So I was trying to make a weak argument that this is a hobby; therefore my time is free, in some ways — it should just be a background thing. What I'm paying for is anxiousness: I just want the answers now. So I'm paying money to get answers now, because I want the answers now. And the question is, how often am I anxious? Am I anxious 12 times a year, and is it going to cost me $100 12 times a year, or am I anxious once a year? And you just kind of don't know, and it's hard to predict yourself. All I can say is I didn't like having the ticking clock — the $1.50 an hour every time I wanted to run something — because it just makes you paranoid. James: 29:04 I made the argument: would any of us be programmers if we had to pay $1.50 an hour to do that? I don't think people can get into machine learning and neural networks with the industry in this state. It's bad. Yeah. I think that's hard when you don't know how long it's going to take to finish and you're just sitting there waiting and waiting and waiting. No, that's actually fun. That's fun — like, watching the neural networks train is always fun, but then you're watching the dollars. Yes. Yeah, the dollar sign, yeah. Yeah. I think Frank: 29:41 that is the difference: you are watching it, but you're also watching your dollar ticker go up and up. It's like filling a gas tank — you don't know when it's going to stop. You're like, okay, it's going, it's going. I was filling up in New Zealand, and it's in liters, which is kind of what happened here: I don't know how many liters to the gallon it is, and also it's extremely expensive. So it's going, and then it's also New Zealand dollars to US dollars.
So there are all these conversions going on in my head, and I'm, like, excited but then not excited, and like, that's way too expensive — but then, is it expensive? I don't really know. And then, you know, I've got to get gas to move, so there it is. All right, last topic, because we're skipping one, which is actually a Blazor topic. It's so big we're going to do a whole episode on it — I think that'll probably do it justice. Yeah. So, okay, here is the last thing that came in, from Twitter, from Arlene: what are the next hot topics in technology, and where are they coming from? Like, what do you think the next hot tech topics are and where are they coming from? James: 30:44 Oh, that's tough. I think — what is the current trend we're living in right now? I feel like we're in the IoT bubble right now, so I guess, what's coming after the IoT bubble? And it's always hard to kind of predict that stuff, but I think we're going to see smaller, lighter, smaller — it's just what we always do — and make things even more and more mobile. I'm going to start with that and toss it off to you while I think some more. Frank: 31:13 Okay. I think you are right. I think for me it's an IoT-slash-AI-and-machine-learning buzz currently — like, there's tons of buzz, and I don't think that's going to go away anytime soon. However, what I do think we're going to see, hopefully, is a transformation in our personal computing, because it really hasn't changed since the introduction of the smartphone. Things have gotten smaller, they've gotten faster, they can do more things, the apps have gotten better, we've redesigned things here and there. For me, I do get really excited about any change in any form factor that will change how I work on a daily basis. I obviously work at Microsoft, so I'm a little bit biased, but I think some of the stuff we're doing on the Surface, with, like, the detachable keyboard, where I can have this really lightweight machine that is a tablet — Frank: 32:08 and also the iPad Pro, right, it's very similar: can I transform what is a traditional device into something more? I also get really excited, in that vein of thought, about the Neo and Duo devices, the dual-screen devices from Microsoft that were announced in the last few months, and whether those will really transform some of how we work on a daily basis, or even how we rethink building and shipping our development. It could enable some new types of experiences that maybe only target those devices and are really only specific to those devices. That really interests me quite a bit. It's very similar to how VR and mixed reality and augmented reality were sort of a "look at this technology" that sort of kind of existed — but how do you bring it mainstream to transform how we engage? And I think maybe these dual-screen types of devices could start to shift things a little bit, in general, as something new. Right? It's not brand, brand new, but I think there's a push — at least sort of like IoT has always been there, like you said, and it has kind of this huge rush, right? Will we see this big rush? I'm not sure if we will, but it does excite me at least a little bit. James: 33:28 Yeah. The thing with IoT — and maybe everyone hasn't digested this yet — is that we're just at the beginning of it. There's going to be WiFi, or some form of it, in literally everything, and it's going to be freaky. The future is going to be freaky.
So all these silly, nerdy things, all these ridiculous devices that we have in our houses, are just a precursor. Things are just going to get more and more Black Mirror on us. But the good thing is we're going to get flexible screens out of it, and I'm really looking forward to that. I was just thinking about it when you were talking about your dual-display devices: I'm not even sure I care about dual displays, but I do care about flexible screens. So anything that makes my device more rugged, I'm totally in for. But I think, in general, I'm most excited to see what the next few generations do with technology. James: 34:22 I had a five-year-old kind of typing on an iPad and drawing with a Pencil, and I had a two-year-old baby kind of drawing on the iPad too. And I want to see what devices and apps and expectations and technologies that generation has, because they've had access to the internet since babyhood, they've had access to touch devices since babyhood — portable knowledge from the entire world. I'm just so excited to see if anything comes of that, or if they completely squander it, like we all do. So I guess I shouldn't have any expectations. Yeah, and I would say that Frank: 35:04 there is one other piece of technology, or set of things, that I'm really, really fascinated by, which is kind of near and dear to my heart, but I think everyone tries to pay attention to this type of stuff: the sort of global impact on the world, right? Like with trash and energy and ocean cleanup. And I think the one thing that has really drawn my attention, as far as new technology goes, is how do we create new tech, and use some of these things that already exist, to help clean up and improve the environment. Like, The Ocean Cleanup project is something that I've been following for so long — I think we might have even talked about it on the podcast — but they have this amazing, brand-new river cleanup vessel that they've put out that's cleaning up these rivers, using IoT and all these sensors to funnel it in there, creating new tech and understanding better how water and rivers and oceans move, to help clean up this mess that's out there. So I think there's going to be a huge push for new technology in that space, and that really excites me a lot, because that's stuff that combines a lot of the things I really geek out on: it's big machinery, things that are doing positive things for the environment, and all this crazy tech that we've been in and around for many years. I'm like, oh man, I almost want to be part of that. It really, really interests me. James: 36:30 Yeah. Yeah. That, and I want that Boston Dynamics dog, that yellow robot thing. Yeah. I think that we're finally getting to the stage where we're going to have the robots we were promised in the 1950s. Mechanically, thanks to battery technology and modern motor technology, we can finally add some strength to these robots, and they can actually perform some decent maneuvers. Plus, on top of that, we have our machine learning being applied to all of this stuff. And so I'm really looking forward to the next stage of robotics, honestly. The field has been pretty stagnant for the last 15 or 20 years — I mean, the Boston Dynamics book was written in 1983, so call that 35 years of stagnation in that industry. So I'm very excited to see that happen.
And then — I'll probably play absolutely no part in this, but I can't wait to buy an RV that drives itself, that is solar-powered and, you know, has zero impact on the environment, and just drives me around the Grand Canyon all day. That's the future I want. Frank: 37:42 Cybertruck, um, and Baby Yodas. Baby Yodas and Cybertrucks and Boston Dynamics robots. Boom. All right, cool. Let's get out of here. We went way over — five topics, five to ten minutes each. We totally did it, Frank. We will be back next week, right before the holiday season. I hope that everyone is having a great December. Of course, please send in your topics: go to mergeconflict.fm — you can write us an email, there's a contact button, and there's also a Discord button if you want to come chat with us. And you can also hit us up on Twitter at @MergeConflictFM. Some of you, you know, may have at any time put in lightning talk topics — we put those in our list all the time, so definitely hit us up on there. Frank, thank you so much for podcasting with me. I've missed you. It's been a long time, and I'm glad that we're back at it every single week now. James: 38:36 Yeah, we should talk about your trip on the next episode, because I have not heard anything about it and I'm sure it was awesome. Also nuclear fusion — I forgot, I'm excited for that. There you go. Perfect. Oh, also, did you watch the Bill Gates Netflix documentary? Oh no, no — that one, I feel, is a little too close to home. Should I do it? No, no, it's super good. It's super amazing. Also, the third one is about — spoiler alert — nuclear fusion. So you'll totally watch it. Gah, now I have to watch it. Okay. It's super good. Super good. Let me know what you think. That's your weekend. Hope everyone has a great weekend. Until next time, this has been another Merge Conflict. I'm James Montemagno. Frank: 39:17 And I'm Frank Krueger. Thanks for listening.