mergeconflict259
James: [00:00:00] Frank, Frank, Frank. Right now, I want to begin this podcast with something a little bit different. We normally do a Patreon bonus episode — shout-out to our patrons — and we usually read comments. But in the last Patreon episode, which none of you can listen to (sorry, you must become a patron), I thought it would be really cool, Frank — well, you said you wanted to start a newsletter. And we actually got comments. We never get comments on anything we do. Well, we do get some comments on the website — we read those, and you can email us at Merge Conflict — but everyone wants to see a newsletter. So I want everyone to implore Frank to start a newsletter, because I think you said it would be about your stories, your journal of — what was it you were going to do? History or whatever?
Frank: [00:00:53] Oh, we had, uh, the terrible Bronze Age, but I think we did a mashup of another topic. Oh my gosh. No. Did anyone actually write in? Are you gaslighting me?
James: [00:01:04] No, no, no. Literally, multiple people on our Patreon page said that you should start this Bronze Age newsletter. So I want to implore our hundreds of thousands of listeners — which there are not, but there are plenty of you — if you want to see a weekly or monthly newsletter, whatever the cadence is: tell Frank on Twitter. There's a Twitter link in our show notes you can click on, or you can go to his page. Wherever you want, spam Frank to start it. It doesn't even cost him anything — Twitter is a free service. If everyone wants these new Bronze Age newsletters, tell Frank. What else is he doing? He's not even writing any emails. What are you doing, Frank — taking photos of, making 3D renderings of shoes or something? You've got time.
Literally, you've got time — I've seen your Twitter feed, Frank.
Frank: [00:01:54] I mean, I'm going to have to fit it in between waking up, staring at a wall for what, 16 hours, as humans do, and then going to sleep again. I guess I could squeeze it into that schedule. Um, nothing would make me happier than to write one of those, but I don't think I could do it. Uh, okay. But I appreciate all the comments — er, the one. That is very lovely of you to say, and I'll consider it. And then I'll sound like a moron in my own newsletter. Why did I pick that topic? Because I love that topic. That's why.
James: [00:02:27] That's why we do newsletters about stuff. Just like my newsletter — I just talk about coffee and then WWDC. Um, you know, I went to your Twitter feed and there was a shoe in three dimensions. And there was no bottom to the shoe, Frank, so I was a little dizzy. But there was a shoe on your Twitter page.
Frank: [00:02:51] There was indeed a shoe. I was very proud of that shoe, because that shoe was the culmination of me installing two different operating systems to get it working. What that was — we talked about WWDC last week, and Apple added a photogrammetry API to macOS. And I'm very proud of myself for pronouncing "photogrammetry" correctly right there, because wow, what a mouthful of a word. If you are unfamiliar with it, photogrammetry is the wonderful technique of reconstructing 3D models from a bunch of 2D pictures. It's not the newest tech out there, but now it's baked into the operating system, and I think that's kind of amazing, because this technology has a lot of applications. I thought we could spend an episode talking about it, because I'm excited.
James: [00:03:55] I think it was one of the coolest. There were a few cool demos that they did, and the first one was, I think, the
Continuity thing, where it went across devices and you could drag and drop a thing — but that's not really a developer thing, right? This one, though, is a developer thing that you can use. I thought it was probably the second coolest demo out there because of how you can use it. So here's the real world: imagine you are somebody that is building an application to help you place a monitor and set up your office. You can scan 3D objects in real time, put them into the application, and place them in an augmented reality space. Or imagine you're Wayfair, right? Wayfair could literally scan in every single item, and you could get a real — not a faux three-dimensional thing, but a real three-dimensional object — inside of that, and you could place it and really do real stuff, even more real than before.
Frank: [00:05:08] Right. So the wonderful promise of photogrammetry is that mere humans who don't want to spend days and weeks and months inside CAD software can create 3D objects — and good objects, texture-mapped to the point where, yeah, I think really all storefronts should have 3D objects in them now, because I'm pretty tired of the five pictures from five random angles. I really want a free-form model that I can just rotate around in my browser. Oddly enough — golly, I'm blanking on the name — there's a very famous hardware supplier out there, and they have this wonderful online store. I think it's called McMaster, and someone's screaming into their podcast player right now, and I apologize. They have one of the best web stores, and it's one of the best because they actually have a 3D model of everything that they sell. Good for them — mostly because they're dealing with suppliers, and the suppliers can pay to have artists model things for them. But it's not something any mom-and-pop shop can do.
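The reconstruction trick underneath all of this can be shown at toy scale. This is not Apple's algorithm — the real pipeline also has to recover the camera positions from the photos themselves — but once you know where two cameras are and which direction a matched feature lies in each photo, recovering the 3D point is just triangulation. A minimal Python sketch:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    """Midpoint of closest approach between rays p1 + t*d1 and p2 + s*d2.

    Each camera contributes a ray (origin, direction toward the feature);
    with noisy pixels the rays rarely intersect exactly, so we return the
    midpoint of the segment where they pass closest to each other.
    """
    w0 = [x - y for x, y in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # zero only when the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = [p + t * v for p, v in zip(p1, d1)]
    q2 = [p + s * v for p, v in zip(p2, d2)]
    return [(x + y) / 2 for x, y in zip(q1, q2)]

# Two cameras a couple of units apart, both sighting a point at (0, 0, 5):
point = triangulate([-1, 0, 0], [1, 0, 5], [1, 0, 0], [-1, 0, 5])
assert all(math.isclose(p, e, abs_tol=1e-9) for p, e in zip(point, [0, 0, 5]))
```

Repeat that over thousands of matched features across fifty-odd photos and you get the point cloud that the mesh and textures are built from.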
You go out there, you make a new shirt, you take a few pictures of that shirt, and you sell it on Etsy. Well, what if you could put that shirt on a dummy, walk around it a few times, take about 50 pictures of it, and put a really nice 3D model onto your webpage, your app, your VR experience, your AR experience? I don't know, man. I really hope that all storefronts have 3D models from now on. Yeah.
James: [00:06:48] You know, I envision right now — I think Unity is a good example. Unity has this whole asset gallery, and there are other things that have assets. Imagine you're a game developer. Right now, how people create these objects is that they literally have to build them in these tools, and this may have sherlocked every single graphic designer in the world by creating these 3D models. But — I'm just kidding, there's always a need, because not everything is real, right? But imagine you're out there and you're like, oh, I need to create a bunch of shoes for my characters. Well, I just go to the store and grab a bunch of textured shoes; I don't have to pay for anything. Or imagine how the big asset galleries could incorporate this into their systems, where now anything that exists in the real world can be mapped into a three-dimensional object that you could place in games, and you could really have some fun stuff — especially putting in Easter eggs of really nifty things. Everything physical becomes digital instantly.
Frank: [00:07:53] Yeah. And you mentioned Unity — did you see that Unreal 5 is out? Yes, yes, yes. Unreal 5 has this amazing feature where it adaptively down-samples all the geometry — and up-samples? Actually, I don't think it would ever up-sample.
Well, it down-samples the geometry so that artists can basically throw in as many triangles as they want, and it finds a way. This has been a promise since the nineties — game engines have been trying to do this, and it just hasn't been working. The game engine will adapt itself and render just enough triangles to give a pretty much perfect image. It's down to about one triangle per pixel, basically, but at the same time, that's still far fewer triangles than there used to be. Anyway, all that's to say: geometry is becoming free in the artist world, and so we need to get to scanning everything in this universe. So yeah, I started with my shoe because — I mean, what are you going to start with? I thought about it, and I'm like, I have a million cute little Apple products I should have started with, but for some reason I thought, what's a little object? Why not my beloved shoe? I actually did have the wonderful thought of — wouldn't it be fun if I could 3D print my shoe? But then I found out my 3D printer isn't big enough. I still really like the idea of being able to scan something and then print it. I don't know why; I actually don't have any real use for that. The engineer in me just absolutely adores it.
James: [00:09:29] Now, on the scanning — I've been in your apartment. Okay? And let's just say: not a lot of room. Let's just say it isn't like a studio where every wall and floor is white and everything's on a podium. Okay? Let's just say it's not free of all dust particles, if you will.
Frank: [00:09:57] You live—
James: [00:09:58] It's a place that people live in. The place where I live is not spotless either. So the question is: how did that impact the scanning process? Because I saw the demos they did, and they were in this void of non-real life.
Frank: [00:10:14] Yeah, it affects it hugely. Of course it affects it hugely.
Uh, in fact, I got a little bit lucky with that shoe, because I tried to scan a few other things afterwards and it wasn't going that well. The problem is the process is quite a bit manual right now. Apple released two samples out there — you can find them on their developer pages. One sample is how to capture images, which isn't really rocket science; we've all built camera apps at this point, we know how to capture images. But there are recommendations when you're doing photography for this kind of stuff. One of the big ones is: don't change the focal length of your camera — gosh darn it, this algorithm is already trying to figure out enough things. So no zooming; none of that kind of stuff should really be allowed. So you use this app, and it follows a few of those rules. But this app doesn't really have a great way to get files off of it, so you've got to plug it into a computer, use the iTunes file transfer, drag the files over — and then they have another sample app that you can compile up and run. So basically, that whole process is ridiculous and too long, so I started writing my own app to try to help me along there. And it really wasn't until I had my own app that I started scanning a bunch of things and getting a better feel for what it's good at and what it's bad at. So this absolutely did not sherlock any artists out there. The models it produces, I think, are quite excellent and definitely state of the art, but they are not perfect by any stretch of the imagination. For, you know, the gobs and gobs of background scenery in a modern video game, or if Apple has an AR experience — it's perfectly fine for that. More about the camera setup in a bit, but I've been talking a lot.
James: [00:12:10] Okay. So here's a realistic use case, and I want to know if it is practical or not for this technology. I look at an application like iCircuit 3D, okay,
where you have recreated three-dimensional versions of a battery, of circuit boards, of circuits, of little lights. Would something like this have helped you in any way? Like, would it have sped up your development, or do you think that those models, in your use case, wouldn't have been helpful at all?
Frank: [00:12:45] Oh, it's such a great question. I think that's why I'm so interested in this stuff — because I spent two years modeling things, you know? Yeah. I don't think it would have helped. This is something I kept running into with electronics: a lot of the default stuff is not set up for that scale. A small resistor is actually pretty darn small, you know — the size of your fingernail, the length of your pinky, the smallest fingernail you've got. And there is an aesthetic difference that comes out. When I hand-modeled all mine, I was going for a weird blend between realistic and cartoonish. I know those sound like two different ends of the spectrum, but you can do cartoonish-cartoonish, realistic-realistic, or you can do cartoonish-realistic, and that's the one I was going for. If I brought in a bunch of these hand-scanned things, they would have a much different aesthetic feel. So if I did do that, I would have to do the entire app that way; otherwise things would feel really out of place — my perfectly mathematically correct 3D models versus this thing recorded in a dusty apartment full of carpet and all that kind of stuff. Because that noise makes a difference in the end. You are getting back slightly noisy textures with a little bit of light information baked in, even though there shouldn't be, ideally — for a video game, you don't want any light information baked in, but it's there. And that would come up in the renderings and things like that.
James: [00:14:26] Got it. That makes quite a lot of sense, because I did notice that about your application.
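Backing up to the capture guidance Frank mentioned a moment ago: the "don't change focal length" rule is easy to enforce mechanically before you waste minutes on a doomed reconstruction. A hypothetical pre-flight check in Python — the function name and tolerance are my own invention, not part of Apple's samples — over the EXIF focal lengths of a photo set:

```python
def constant_focal_length(focal_lengths_mm, tolerance_mm=0.01):
    """True if every shot used (effectively) the same focal length.

    Hypothetical helper: feed it the EXIF FocalLength values read from
    your capture set; reject the set if anyone zoomed mid-session.
    """
    if not focal_lengths_mm:
        return False
    first = focal_lengths_mm[0]
    return all(abs(f - first) <= tolerance_mm for f in focal_lengths_mm)

# A session shot at a fixed 26 mm passes; one zoomed shot fails the set.
assert constant_focal_length([26.0, 26.0, 26.0])
assert not constant_focal_length([26.0, 26.0, 52.0])
```

The same shape of check works for the other capture rules (consistent exposure, enough overlap between frames): validate cheaply up front, reconstruct once.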
Like, the battery — it's realistic, but it's not photo—
Frank: [00:14:34] —realistic. Yeah. I didn't want it to look cartoonish; I wanted it to look realistic, but I knew I couldn't get to a hundred percent photorealistic. And if you can't do a hundred percent, you go in a different direction, and that's what I did. That's where the cartoonish part comes in. So, what are ideal conditions? Well, you really do want a museum room. You want to live in Jony Ive's little white void — that would be ideal. So, what I found: I tried shooting on beige fluffy carpeting, and that mostly works. The hardest part is getting things well lit, because everything is photography-based. It does use depth maps — and I'd like to talk about that in a little bit — but this is predominantly a photography-based algorithm. It's really just looking at the pixels in the 2D images, which means if you have a black surface that absorbs a bunch of light, guess what? It has a really hard time figuring out what that surface is. So the things it's best at are objects that are naturally diffuse themselves, taken in a room that has a very simple background. It can handle some background, but if you were a store, or if you were using this for an app where you're actually devoting some time to it, you would just set up a little corner of your apartment or your house and put up clean white paper there, put clean white paper on the floor. You don't exactly need a pedestal, though a pedestal would certainly help. But you can certainly create a good-enough environment for this thing, honestly, pretty easily.
James: [00:16:30] Got it. That makes — you know, I've seen those; a lot of people on eBay will just take a big white
board, you know, and it kind of slopes up, and you just place the thing there. It seems as though you could definitely do a faux version of this. So I want to know a little bit more about what you're thinking about this technology, where you're at so far, and what's next — at least for Frank Krueger — and where you think this technology will go. But first, let's take a break and thank our amazing sponsor this week, Syncfusion. Listen, you know Syncfusion; we talk about them all the time on the podcast. They give you everything that you need to build beautiful, lovely web, desktop, and mobile applications for just about every single UI framework out there. So whether you're building an application with Blazor or Xamarin or UWP, ASP.NET Core, React, Vue, Flutter, WPF, WinUI — you name it, they've got you covered. They have hundreds of just beautiful, stunning controls and charts and graphs and list views and data grids, and everything's super optimized per platform. Companies around the globe trust Syncfusion to help their applications be even better. And they do awesome things too, such as PDF and Excel and Word integration, so you can seamlessly integrate those into your application. I use Syncfusion myself — for example, inside of Island Tracker, I use all sorts of Syncfusion UI. In fact, they did a whole case study on me; go to the Syncfusion blog and check that out. And go to syncfusion.com/mergeconflict to learn about all of the wonderful controls that they have for your application. Thanks to Syncfusion for sponsoring this week's podcast.
Frank: [00:18:13] Thank you, Syncfusion. I love having reliable advertisements on our podcast.
James: [00:18:22] Yeah, we have, uh — they've been with us—
Frank: [00:18:24] —or sponsorship. Just 'cause we recorded a Patreon previously; that's why I was searching for the word there.
James: [00:18:30] They've been with us for a long, long time.
So we appreciate Syncfusion, and all of you listeners.
Frank: [00:18:35] Fantastic. So, you know, a lot of things did get sherlocked with this API coming out. Do you know the very first place I ever did photogrammetry?
James: [00:18:47] I do — oh, no. Um, you made a bus app one time.
Frank: [00:18:51] Oh, that would've been great. Oh, if I had — I over-obsessed on that map way too much. I'm into mapping, everyone; I don't know if you know that about me. I love maps, and I spent maybe two years of my life not making money, writing mapping software. Don't ever do that. But that would have been great, James — oh, you just gave me a bunch of ideas. But no, it's actually related to that: I got into drone photogrammetry. So you take a drone, you point its camera down, and you have it fly in a grid over an area. Then you get a million pictures back home, you click a button in a magical piece of software, and out comes, like, a Google Maps or an Apple Maps 3D map. It's really, you know, a giant model of the area that you recorded — but it's a 3D map also. And I got really into stuff with that. Have you heard of it? You must've seen some of that. It's so cool.
James: [00:19:57] I, you know, I have seen and heard of people flying drones over fields — like farm fields — and mapping them. That's the use case I'm aware of, rather. Is it similar? Is it different? Is it kind of like that, or no?
Frank: [00:20:17] A hundred percent, it's that. It is that field. So it's funny, because when I did it, there was not much software out there to do this — and this is where I'm getting back to the sherlocking part — but there was one big piece of software out there, and they had a good free version. So I used their free version for a little bit, and it was interesting to see how much that software was set up for the agricultural industry.
Specifically, these drones were looking for, uh, insect infestations — you know, when the insects are on the plants — and for disease infections, both external and internal. So they're basically just looking for the crops to not be green. I think that's actually kind of funny. The software was pretty advanced on the "use some drones to fly around and map all this stuff" side, but the way that you detected whether a crop was healthy or not was that you gave it these different levels of green. And I thought that was a little bit odd, because we could have a neural network figure that out a lot better these days than having someone calibrate to a green. But most definitely, that's where these custom maps have been used the most. They're coming up in other places, though — like Hollywood is doing photo scans of cities so that they can do CGI replacements. I would love it for a bus app. I've always wanted to make one for Seattle, and I don't want to actually model all of Seattle, so it'd be great if I could just fly my drone over it and get a model of Seattle. Fun things like that. So, that software was very expensive and I never wanted to pay for it, but there was an open-source version of it that was really hard to compile and get running — and this is years ago, so maybe it's improved. It was just a little too hard to use, and the quality wasn't that great in the end compared to the commercial version. So after this Apple announcement, of course I dug through all the documentation that does not exist for this API, and in one of their presentations — or maybe in the comments for one of the files — they said something like: this can also stitch together drone photos. And I was like, I don't believe you. And I still don't fully believe them. So that is one of the last things I have to find out, whether it supports that. But if this can also do drone photos, then—
the game's over. You got sherlocked, and I feel bad for all those companies, but they did have a good run, and I'm so excited to have this technology.
James: [00:22:56] Yeah, that seems super neat, because it would really unlock a lot of potential — even more commercial use cases for drones, in a massive way. You know, one thing that I started to notice when we were looking for houses a while ago is you would see people do drone shots or a drone video of a house. But if you think of houses — think of Redfin or Zillow — imagine mapping an entire house in three dimensions. You could get the idea of the scale of it, the rotation, what every little nook and cranny and angle gives you. And you could even do that from the top down — the height, you know, different dimensions, the depth as things go down. This may never happen, but I could imagine it. And imagine you then map the inside of the house and you place items inside of the house. So when you go to buy a house — because what you could then do, oh my gosh, this is amazing, Redfin, get on this — what you do is you map all the items in your house, like here's the couch, the TV, and you place your things. Instead of placing generic things, you place your things, and you can be like, here's what the house would look like. And then you'd have a whole program that would automate this thing, and you just drop it down. This is amazing. Of course, this would be a lot of work for one person to go through — but as this technology gets faster, you could just take a photo of something and it maps it. And imagine the drone footage of the house combined with the 3D aspect of it, and then you're 3D mapping the plot that you're on and the trees. And imagine then you could say, okay, well, where does the sun hit?
And then I can see, like, what if I placed a solar panel here — does the tree get in the way? You know, because those photos that they take on Redfin or Zillow or things like that — the photographers do them at all weird angles and add weird filters; it's not real. I want the real thing. And I think that'd be kind of cool: give me the three-dimensional house, top down, and the plot, and let me do a bunch of stuff. That'd be neat.
Frank: [00:25:01] Yeah, it's really interesting, because they're both completely valid techniques for accomplishing the same thing. So, like those virtual 3D tours of houses — I kind of love them, because you're on this weird roller-coaster path going through a house that was maybe, or maybe not, cleaned up before the photographer came through. I kind of love all that stuff. But I think we can all agree there are times where, like you said, they took it at a weird angle, they just don't have the angle that you want, or you just want to freely move around. Or maybe you want aliens to attack, and you want to get the feeling for what it's like to defend that house if aliens are attacking — that kind of stuff — or just getting different perspectives of the yard outside. What they're doing is kind of similar to the old Photosynth technique, where you just layer a bunch of photos onto the screen. This obviously isn't nearly as advanced as that — you're just on that little walk through the house. But those are the two kinds of competing technologies: do you do Photosynth, where you layer a million photos and create the impression of 3D, or do you actually go through the terrible effort — which is honestly probably more effort than it's even worth sometimes — to create a 3D model made up of triangles that can be rendered on a GPU? It's just how computers have developed; that is our way of presenting the 3D world.
Whereas layering images on top of each other is not a very hardware-efficient way to do it, and it's not a very memory-efficient way to do it, so we don't like it. Plus, in the end, all those photo-stacking techniques — they can't let you 3D print the house. You know, I've been meaning to do this forever: do some little drone passes over my parents' house and 3D print them a cute little house, just as a gift. This isn't earth-changing or anything like that; I just love personalized gifts and things like that. So I've always wanted to give those kinds of things but never really had the software for it. I'm really hoping that this works out so I can start giving everyone these terribly 3D-printed gifts.
James: [00:27:21] You know, the 3D printing part — I know you mentioned it earlier, but that is something that is fascinating, because you could buy something, or buy a part, and then you could map it, and you don't have to recreate it in a 3D piece of software. Ideally it's to scale, and it's pixel-perfect if it has the depth map, and all of a sudden you can easily print replacement parts — or replacement anything — and not have to trial-and-error everything with people building things manually. I think the house thing is really cool, actually, and there are all different sorts of use cases for that. But I also think you could do this a lot with manufactured homes, where there's a big plot of land, you want to put a house down, you want to do this thing. And, you know, I've been on a lot of websites too — we've been looking at sheds. A shed builder, okay?
Frank: [00:28:17] Shedbuilder.com. I love it when our episodes turn into app pitches.
James: [00:28:22] So, shedbuilder.com, okay. Right now, today, there are all these really terrible low-poly-count sheds.
But imagine they built the sheds, they give you the sheds, and then you've got a 3D shed, and then you could add to it. I'm just saying, someone buy that domain name, and that'd be great.
Frank: [00:28:39] So I'm not quite into sheds yet at this point in my life, but I'm still kind of in love with my own car. So today I was going to do this — I didn't do it, but I wanted to report on this podcast that I had done it. I just want to get a 3D scan of my car. Like, who doesn't want a 3D scan of their car? Then I can 3D print a little version of it and play with it on the ground. I don't know — silly stuff. It's all silly stuff, but it'll be fun. It'll be fun to have my little car. Yeah. Now, there are more downsides. You mentioned my shoe did not have a bottom to it. It turns out 3D printers hate that.
James: [00:29:20] Oh, yes. Yes.
Frank: [00:29:22] You cannot print that model. Apple's algorithm was definitely tuned for the media world, where this is a game asset: if it's missing a little piece, yeah, rotate it so the player can't see that piece — you know, it's all movie tricks and mirrors and pixie dust and that kind of stuff. But for 3D printing, you need a real, solid model. So for my software, I took Apple's demo and put a UI on top of it — because everything should have a UI — and I was like, this is pretty good, but I was still frustrated that I couldn't 3D print it. So then I looked into algorithms for turning arbitrary meshes into solid objects, so that even if a mesh is missing triangles, it'll still be 3D printable. And that was... great. It took me forever to find an algorithm out there — it turns out this is not a solved problem. It's funny, because I feel like 3D graphics has been around forever, you know, since at least the seventies, but we still have a lot of fundamentally unsolved problems. And one of those is: how do you guarantee that a mesh is solid?
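The check Frank is circling here can at least be stated simply: in a closed ("watertight") triangle mesh, every edge must be shared by exactly two triangles. A minimal Python sketch of that test — real repair pipelines do far more (orientation, self-intersections, hole filling), so treat this as the necessary condition, not the whole story:

```python
from collections import Counter

def is_watertight(triangles):
    """Closed-mesh check: every edge is shared by exactly two triangles.

    `triangles` is a list of (i, j, k) vertex-index triples. An edge is
    stored with sorted endpoints so (i, j) and (j, i) count as the same
    edge regardless of winding order.
    """
    edges = Counter()
    for i, j, k in triangles:
        for a, b in ((i, j), (j, k), (k, i)):
            edges[(min(a, b), max(a, b))] += 1
    return all(count == 2 for count in edges.values())

# A tetrahedron is closed; drop one face and you get an open "shoe with
# no sole" -- some edges now belong to only one triangle.
tetra = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
assert is_watertight(tetra)
assert not is_watertight(tetra[:-1])
```

A slicer runs something like this before printing, which is why the bottomless shoe was rejected outright instead of printing with a hole.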
There are a million algorithms to do it, with varying levels of quality, but I was able to apply one of them, so that even if the Apple algorithm does not output a perfectly watertight model, as they say in the 3D printing world, you can still run it through another algorithm and get it 3D printable. That would have to be the technique for pretty much anything you're going to 3D print, because as far as I can tell, Apple's algorithm does not even attempt to make sure that things are solid.
James: [00:31:16] Got it. That makes sense. Yeah.
Frank: [00:31:18] Hmm. But in the documentation, they also say that you can flip the object over — the algorithm should be sophisticated enough that you take your object, flip it, and start taking pictures again. I did that, and my shoe redistributed itself through the ether, and out came an amorphous blob of 3D geometry that looked absolutely nothing like my shoe. So I need to watch some more WWDC videos, read more about how the algorithm works, and figure out what was going on, because I could not get the sole attached to my shoe. Likewise, I tried to scan my Onewheel and couldn't get the bottom of it scanned. And that's an especially hard one, because it has a black bottom with a perfectly diffuse black wheel attached to it, which does not reflect light very much. But yeah — like we were saying earlier, if a good scan was important to you, you'd just need to go buy one of those white foam-board things, or any nice diffuse surface to take your photos upon.
James: [00:32:30] Yeah, that makes sense. I think it's really cool tech. I'm also — you know, they only added it to, I guess — did they only add it to iPad and Mac? Is that how that works?
Frank: [00:32:42] No — only Mac, only Mac. And I kind of see where they're coming from. It's a heavy algorithm. It doesn't use all the cores, for some reason.
I was expecting it to use all my CPU, but if you give it something like 30 pictures, it takes my Mac two to three minutes to process them. And I'm almost wondering if it's more of a memory — more of a RAM — limitation that's the reason it didn't make it to iOS, rather than a processor thing, because we know they put an M1 in an iPad, so we know it has roughly the same power as a Mac. It could do it. But there must've been a reason: either they didn't want people burning out iOS batteries on it, or most iOS devices just didn't have enough RAM for it. I'm curious which one is the reason.
James: [00:33:43] Yeah. And then they didn't really release an app on the iPhone or iPad to—
Frank: [00:33:50] —do it, right. There are so many confusing levels to this, and that's why I kind of felt like I wanted to write an app and release it as soon as possible, because I have no idea what's on Apple's mind over there. So, this API is part of RealityKit. And also part of RealityKit is this beautiful piece of software called Reality Composer, which is not a 3D modeler — it's more like a really baby game engine. It doesn't really have the — gosh, it does have physics, though. What do you call this thing? You have a bunch of objects and you can put them together in a scene; it's a scene builder. So they have all the software, but for some reason they chose not to build a UI for this API. And I don't know if that's because they didn't have time for it, or because they made the executive decision that they already have too many random utilities out there and don't want to get into maintaining this one, or they don't want to support it or something like that, and so they didn't want to release a UI. But I'm just guessing. I was amazed that they didn't have a UI for it, and that's why I had to write my own. Yeah.
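The no-UI, API-first shape Frank describes does lend itself to scripting. As a hedged sketch of the batch workflow a store might run — assuming you've compiled something like Apple's command-line sample; the binary name, flags, and folder layout here are placeholders, not Apple's actual interface:

```python
import subprocess
from pathlib import Path

def build_command(tool, input_dir, output_model, detail="medium"):
    # Placeholder CLI shape: <tool> <image-folder> <output> --detail <level>
    return [str(tool), str(input_dir), str(output_model), "--detail", detail]

def process_all(tool, capture_root, output_root, run=subprocess.run):
    """Reconstruct one model per capture folder; returns the output paths.

    Each subfolder of `capture_root` is assumed to hold one object's
    photo set. `run` is injectable so the loop can be tested without
    the real reconstruction tool installed.
    """
    outputs = []
    for folder in sorted(Path(capture_root).iterdir()):
        if folder.is_dir():
            out = Path(output_root) / (folder.name + ".usdz")
            run(build_command(tool, folder, out), check=True)
            outputs.append(out)
    return outputs
```

Point it at a directory of capture folders overnight and you get a model per product, which is exactly the "backend tool, not frontend tool" case discussed next.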
James: [00:35:08] And they showed some... like, they showed an app that kind of did it, but it wasn't their app. It was like a third-party app. And so it's like they thought about it, kinda, or wanted to. Frank: [00:35:20] Yeah. I mean, in some ways it's great. We have an API instead of an app, you know; that's good from a developer standpoint. That means if you are a store, you can do it on your server. You can do it in the background. You can do it in batch mode. You could write a script that, you know, runs this on a thousand files. I want a UI so that I can play around with it and experiment with all of its different options and things like that. But if you're a business, you have a million reasons why you would want a backend tool, not a frontend tool, to do all this kind of stuff. Yeah. So one part I'm excited about; the other part I'm just confused about. And yeah, during one of the WWDC videos, they actually showed their internal UI that they were using. I'm like, okay, so you had that. So just no one wanted to put enough polish on it to release it, or no one wanted to support it? Hashtag question mark. I don't know. Hm. James: [00:36:19] That's weird. Frank: [00:36:21] You're the PM. You answer that question. James: [00:36:24] There wasn't enough time, Frank. There wasn't enough time. Schedule, schedule. Yeah, probably schedule, I would assume. And then it comes out in the next version, or they drop it somewhere else, or they just wanted to open it up for innovation. They didn't want to, you know, Sherlock everybody, I guess. But it is odd. It seems like the two go together seamlessly. But maybe they figured, hey, you have a camera, that's good enough, you'll figure it out, you're smart. Frank: [00:36:55] Yeah.
I mean, I actually appreciate that the capture part, the photography part, is separate from the modeling part, because now you can go use any camera you want. You can go use your fancy DSLR and things like that to take really high quality photography. James: [00:37:13] Yeah. Maybe that's why they thought about it as, like, they wanted to make sure people know that it's completely separate, and so separate that they don't even give you an app to do it. So, like, here you go. What else do you want to talk about with this stuff? Frank: [00:37:28] That's it. That's all I got. I'm just excited because this has been such a little niche thing, and now it's in the OS, you know? It's there. We can rely on it for, what, at least 10 years we'll have photogrammetry services available to us. James: [00:37:45] Let's talk about one more thing: the new experimental .NET REPL. Have you seen this, Frank? Frank: [00:37:50] No. Is there a new one? We had a C# REPL, I thought. James: [00:37:53] Uh, we do. This is the new dotnet repl, built on .NET Interactive, which powers Try .NET and .NET notebooks. This is a project that I just stumbled upon from Jon Sequeira, who also works on that team. And it's a C# and F# REPL. And I think it's powered by Spectre.Console. Frank: [00:38:16] Oh, okay. So yeah, it does have F#. I was holding back there, because I was like, I'm not going to support F#, so I don't care, but no, they do support F#. Bravo, bravo. Yeah, it's command line, it's rendered in ASCII, it's got lots of colors. It has magic keys to do lots of kind of fun things. That's neat. No, I didn't know about this one. How did this come across your radar? James: [00:38:43] Uh, Hanselman told me. I think he wrote a blog post about it. So he told me about it today, and then I was like, oh, let me just go find Jon's.
Uh, Jon's repo; I'll put it in the show notes. There's a few cool things about this. One is that you can switch between C# and F#, like, in real time. And then there's a bunch of these magic commands. So you can do things like bring in NuGet packages. There's sharing variables between sub-kernels, or something like that; a bunch of these commands that you can do. There's a SQL kernel, it looks like. There's a bunch of things. I'm looking at it right here, and it says, you know, here's a bunch of the commands. And yeah, you can do multiline, just by holding Shift+Enter to do multiple lines before executing the code. And you can also, like, execute and run a .NET notebook as a script. So if you have an .ipynb or a .dib file, you can run that and load it up. So you can almost use it as a scripting tool, too, which is quite cool. Frank: [00:39:50] That's actually really nice, because I actually really love notebooks. And so if you have a Jupyter notebook, there is a command you can run at the command line. In the old software, the two operations were called tangle and weave. One of them would give you source code out... tangle, ah, that's what it was. Tangle gave you source code out; weave gave you nice documentation out. Anyway, long story short, you would have to run a command, it would generate a file, you'd have to compile that file, then run the file. So this sounds much nicer than going through that three-step process. James: [00:40:25] Great. Oh yeah. And it's just a dotnet global tool. Like, I literally just installed it right now. Frank: [00:40:31] I love dotnet global tools, so... okay, great. So, you know, I'm going to start, like, writing them and charging for them. I just have to find a way to charge people for a dotnet global tool, because NuGet won't give me a store.
So I'll have to build a store to sell my dotnet global tools. Yeah. James: [00:40:48] Yeah, you got to do that. Yes. I'm doing this as we speak: you just do dotnet tool install, dotnet-repl, and then you do dotnet repl, and then you set the default kernel to C#, I think. And then I spelled kernel wrong... ker-nel... there we go. And then you can just start typing stuff. Frank: [00:41:10] Fancy dancy. I mean, they even imported ImageSharp. That's pretty great. They did. James: [00:41:16] Yeah, that's pretty cool. And then, like, an ANSI console renderer thing. I don't know, it seems pretty cool. So anyways, just wanted to throw that out there. I figured you hadn't seen it yet. Frank: [00:41:28] I love it. I love it. Anything that brings me back to my QBasic days. James: [00:41:36] Hmm. Frank: [00:41:40] That's... uh, yeah, you did a much better job pronouncing his last name than I would have, Jon. James: [00:41:46] Yeah, Jon. Yeah. Sequeira. Frank: [00:41:50] Let's go with that. James: [00:41:52] Yeah. I'm going to put it in the show notes. I speak with Jon all the time; I've spoken with Jon for years, because I've been using the Try .NET stuff for so long. Anyways, I'll put this in the show notes. Everyone should check it out. Give it a look. Yeah, dotnet repl, give it a look. All right, cool. Thanks, everyone, for tuning in. That's going to do it for this week's podcast. Until next time, I'm James Montemagno. Frank: [00:42:11] And I'm Frank Krueger. Thanks, friends.