James: 00:00 Now Frank, before we start the podcast this week, let's thank our amazing sponsor, Instabug. Now if you don't know about Instabug, it's really important, because I'm just wrapping up development of an application and I want to get it tested. I want to get feedback as part of my beta process. So I started sending it to testers who discover bugs, and you know the process, Frank: emails back and forth, or maybe they just don't send you anything. That's where Instabug steps in. It's an SDK that provides your beta app with a super intuitive bug reporting and feedback solution that helps you reproduce errors and iterate much faster. Now what's cool here is it's not just for beta testing apps. Once you go live, you can continue to gather insights about your users and measure their satisfaction. You can send them surveys right in your application, so you can rev over and over again and make a better application. You can send things like questions, star ratings, multiple choice, anything. Instabug is there for you, and it only takes minutes to integrate. All you have to do is go to instabug.com/merge to show support for Merge Conflict. And as a special bonus for listeners, you will get 20% off any plan when you use offer code mergeconflict2019, and additionally you can get a free 14-day trial. Go to instabug.com/merge, and thanks to Instabug for sponsoring this episode of Merge Conflict. Frank: 01:33 In the early days of MonoTouch and Xamarin.iOS, the biggest question I always got was: what about performance? How fast can C# possibly be on the iPhone? Did you ever get that question, James? James: 01:45 Oh, when don't I get that question? Most people think of C# as a JITed language. They think about garbage collection and memory, and then GC hiccups over and over again.
And there's just the raw fact that the pure APIs are not in the native language, that you're not in a C language. So I get this question all the time. Frank: 02:10 Yeah, and I got it a lot too, and I never really knew how to answer it. The answer I usually gave people was: it's fast enough, as in you're not going to notice the performance of the language, you're going to notice the performance of your algorithms, or the UI itself updating, that kind of stuff. The network is usually the bottleneck in most apps. And so I was always a little dismissive of the question, to be honest. But in the back of my head, I always knew that performance was pretty good. Maybe not C-level, everything-running-at-breakneck-speed fast, but I never really had too much trouble with the garbage collector, and when I did have performance bugs, they were usually my fault. How did you answer the question? James: 02:54 Well, you know how I always explained Xamarin. When I was choosing the platform, it was more me just trying it out, using the APIs, testing against pure native Java versus C# versus Objective-C. I went through those trials because, you know, seven years ago I was picking a framework. So I had spun up Cordova apps, Appcelerator apps, and then Xamarin apps and native-language apps, and I was comparing their performance. Now, it wasn't building out a full application; this was more of a hello world, click a button and let me see the performance, and it was pretty spot on. I couldn't really tell anything apart. But nowadays, Frank, I tell the whole story of: well, Xamarin applications are native and they run natively, and if you look at iOS, everything goes through a full AOT compilation.
James: 03:48 We take your C# code, compile it down into IL code, compile it down one more time into bytecode or bitcode, one of the two, and then send it through an LLVM compiler and optimizer, and bingo bango. And what's really cool now is it's not just iOS; there's also an AOT, or ahead-of-time, compiler for Android, which is kind of bananas, to get that performance. Now, all of that said, Frank, all of that said, I've probably said it before on this podcast: I say the words, I understand most of it, but I don't really understand all of it. Frank: 04:23 Well, I mean, honestly, I don't know if it's bitcode or bytecode off the top of my head either, because the general term is bytecode; for that reason I think they probably went with bitcode. Yeah, .bc files. Everyone ignores them. Technically, if you look at your build steps when you're compiling a Xamarin app, they're there, but everyone ignores them. You don't want to see that stuff. It's deep down, deep down, and that's why you don't want to know. But I have noticed that the Mono AOT is getting a lot of love these days, as you said. It's available on Android now, and the Blazor people have picked it up and they're starting to compile their Blazor WebAssembly apps with it. The Uno Platform, which is the UWP-on-WebAssembly stuff, they are compiling with it. And I was at a conference recently where they started showing some performance numbers, and they showed Blazor running interpreted in the browser, and it ran all right. Frank: 05:19 You know, that worked, everything was fine. But then they ran it on the desktop on .NET Core and it was so much faster, so fast. You're like, ooh, okay, I guess the interpreter's not lightning fast. But then they compiled it with, I think it's preview or experimental, AOT support, where they're using the Mono AOT, and it ran faster than .NET Core, and there was a hush over the room. This is all available tech.
This is all open source stuff. Mono has had AOT forever, but people don't know about it, I think, which is kind of the problem here. And the Mono AOT, as you described, is pretty amazing, mostly because it uses what Miguel de Icaza calls the magical LLVM optimizer. James, tell me everything you know about LLVM. James: 06:14 LLVM, well, I know that we had to do a lot of work because we had our own LLVM type of stuff, and then at some point we had to use the one that Apple is using, or something. It is a compiler infrastructure project that is a collection of modular and reusable compiler and toolchain technologies, used to develop compiler front ends and back ends. That's what I know about it, because I just googled it and that's what the Wikipedia article says. That's beautiful. That's really lovely. What I do know about it, though, Frank, is that when I am building my iOS application, it's not doing LLVM compilation, because I'm going to a simulator. Even for a device, I don't think it does it, but when you go through the full packaging compilation, this is the part that takes a long time. That's what I know. And that's supposed to be the thing that makes your app fast. Frank: 07:09 Oh yes. Just to give the high-level picture here, you read a great Wikipedia entry there, but the easiest way to think of LLVM is that it's a super optimizing compiler and back end. So it optimizes, it can take code and make it fast, and it can emit code for ARM and x86 and all that. It has configurable back ends and configurable front ends; super nice technology. So one thing: usually, I don't think LLVM is on by default. If you do an AOT Xamarin app today, I think File > New Project still has it off by default, mainly because most people just don't need the ridiculous optimizations that LLVM does, and as you said, it adds a great deal of time to the build. Now, why does it add a great deal of time?
Because it's a super optimizing compiler. It is digging through your code, looking for any little thing it can do. I've had a lot of experience with it lately and it's a pretty amazing technology. Anywhere it can find a shortcut, it will; any little trick you can think of, it'll apply it. Back in the day people used to say, you know, someday we'll have optimizing compilers, and it's pretty amazing that for free you can go download pretty much a state-of-the-art optimizing compiler, LLVM. James: 08:33 And I'm looking at the Wikipedia article, and our good friend, friend of the show, Chris Lattner was one of the original authors behind it, which people may not know, but he worked at Apple. And he did Swift as well, I'm pretty sure. Is that correct? Frank: 08:46 Yeah, yeah. The amazing thing about LLVM is it's cleanly written. I don't know if you've ever looked at compiler source code, but it can sometimes be very difficult to follow; we've had whole episodes about it. And I think it was Chris Lattner's idea in school. He said, look, I'm tired of these poorly architected compilers, it's really hard to write optimizers for them. He's a performance junkie, obviously, and he got approval from his teacher, I think did like a PhD on it. You know, I don't know if he has a PhD, but he obviously did a lot of work on it and wrote it. So LLVM is this optimizer, but you need a programming language. So there's also Clang, a C compiler, C-L-A-N-G, and that's what Xcode uses to produce all its code. So all iOS code is compiled with Clang, and Clang is an LLVM subproject. Frank: 09:46 Or however you want to think about it; it doesn't work without LLVM. So those are the big ones, but there are other languages. Like you said, Swift, Fortran I guess, you know, any front-end language. You could make any language output LLVM code, shove that through LLVM, and have it optimize and output it.
It's just a nice architecture. Yeah, I'm looking at the Wikipedia article and it does say C#, and it has a bunch of little articles linked to it: Common Lisp, Crystal, CUDA, D, Delphi, Dylan, Fortran, Graphical G, Haskell, Java bytecode, Julia, Kotlin, Lua, Objective-C, the OpenGL Shading Language, Pony, Python, R, Ruby, Rust, Scala, Swift, Xojo, ActionScript. All the things, all the things. There's a lot more, but that's actually, surprisingly, a lot of languages supporting LLVM. Well, what it is, is that everyone saw how powerful it is, and we all said, okay, let's make our languages target that. Frank: 10:46 And you know, throughout this whole episode I keep saying C#, but what we should be saying implicitly is .NET, any .NET code. So this includes F#, this includes... is that all we've got? Oh, VB, don't forget about VB. Yeah, so this can totally optimize VB code also. So my question then is, sorry to interrupt: what .NET projects have taken part in this LLVM and AOT scenario, or are about to, right? Because the only thing that I really knew about for the longest time was the iOS stuff, and we had to do that because iOS applications are ahead-of-time compiled, and I thought that was kind of a natural progression. So are there other things that use it? I know you mentioned the WASM stuff with Blazor that's going to support it, and then Android, but are there other things that are going to take advantage of this? Frank: 11:50 Well, people can absolutely take advantage of it for their own apps and scripts. In the data center, power is king: the less CPU you use, the less physical power you require, and the more you can run. So anytime you're running a .NET app, it always benefits you to optimize the heck out of it, if it doesn't affect the logic or anything. Absolutely.
Otherwise you're literally just burning electricity, and we don't do that in this modern era. Got it. But as you said, the big projects I think we've already named, or at least the ones that I know of, but this is definitely a technology that anyone can use. It's not a big deal. And a lot of that... this exists also in the Microsoft world as, what do they call it, .NET Native. .NET Native, yeah. The thing is, Xamarin, Mono, all those folks had to solve so many problems, because customers of Xamarin.iOS kept demanding that every feature of .NET work on this constrained AOT environment. And so it's the years and years of bug fixing and progress on the Mono side, perfecting this AOT, that got us to the point where it's kind of universally applicable to everything, because it is powerful enough to run pretty much any app. James: 13:18 Ah, interesting. So basically what you're saying is I should have this on by default no matter what, and then I should go tell the team that's maintaining the templates to make sure that this is on by default as well. Frank: 13:31 Yeah, well, it only matters in the end app, you know, the final executable. This doesn't apply to libraries or anything, but there are things that libraries have to take into account in AOT scenarios. We've talked about these for years, and the biggest one is Reflection.Emit; any library that required that just did not work under AOT. But now we've gotten news that the Mono people have been working on the interpreter, so we'll be able to run more and more things under AOT, and hopefully someday Reflection.Emit. So, exciting times. AOT has really advanced. But going back to everything, the real sweet spot is when you run that AOT with LLVM.
And I did some recent experiments along those lines, and Miguel posted some numbers on Twitter, and I thought we could talk about those a bit. James: 14:20 Yeah, he wrote a tweet. It says: the amazing @praeclarum, that's you, wrote an LLVM, this is where it gets real technical, real nerdy, an LLVM-to-IL, an intermediate language, compiler, so you can stop shipping native libraries. And later he measured the performance hit of C to LLVM to IL to LLVM to native: zero performance loss. And he said that this proves what everyone already knew, that LLVM is pure magic. So what did you do? I'm confused. Frank: 14:53 That was a hard tweet. I could barely parse it myself. Yeah, I had to read this one like three times; I did the work and I still couldn't quite follow it. Long story short, everyone knows I'm very lazy, and that laziness comes in many forms, in that I'm willing to put in ridiculously hard amounts of work in order to avoid work in the future. Do you remember a few episodes ago we were talking about native libraries and P/Invoke and all that kind of stuff? Yeah. And you had said something along the lines of: jeez, Frank, I was expecting you to have a more clever solution. Yeah, it was a throwaway phrase for you, but I kind of took that to heart. I'm like, why don't I have a more clever solution for any of this? And so I thought I'd spend three days on a little tool that would compile C code into .NET code. So if you have a library written in C, run it through Frank's magic tool and out pops a .NET assembly. Shut up. And what does that buy you? That buys you cross-platform: anything that can run .NET can now run it. So basically any C code you have, .NET can run it. James: 16:11 You're ridiculous. It didn't take three days. It did not take three days. At three days, I decided: this is not going to take three days. Frank: 16:23 And here was the problem, James.
I knew that if I worked very, very, very hard on it, I could probably, maybe, possibly get it to work, but I had no idea what kind of performance it would have, and so the entire time I was working on it I couldn't decide if it was worth the time. Because I knew the amount of work it was going to be, eventually, and then there were still doubts about whether the final product would even be worth the work. It was one of those things where you wouldn't know if it was worth it until it was working. That's the worst. James: 16:58 Yeah. You're like, oh, am I going to literally spend a year and a half of my life, and then it's like building an app and then nobody wants it, you know? Or am I doing a year and a half of work when I could have just done the thing that would have taken me a day and gone from there? Frank: 17:13 Right. It's that whole thing of, yeah, this is really tedious work, in this case compiling native libraries and bundling them all up and writing the P/Invokes. Yeah, it's annoying work, but literally it's a week of work, tops. It's a bounded amount of time, you know what needs to happen, all that. Whereas this thing: unbounded amount of time and questionable benefits. James: 17:37 So did you finish it then? Or what happened? I'm so curious, because Miguel said that you finished something. What happened? Frank: 17:50 Yes, I did. So, three days turned into a month, but at the end of the month I had a working thing, and I was specifically focusing on a matrix library for iCircuit. At the core of iCircuit, it has to solve a linear algebra problem, and to do that it has to do matrix operations very, very quickly. So it was a library specifically designed to do matrix math very fast.
So it had lots of pointers, lots of math, lots of opportunities for an optimizing compiler to optimize the heck out of it, because it was already written to be fast and compilers could just make it faster. Got it, super cool library. Yeah. So I was able to get that library to compile into .NET, directly from LLVM IR, that's LLVM intermediate representation. That's what they call their bytecode. Bitcode? We don't know what it's called; there are just so many names for it, that's the problem. Okay, we'll just call it the LLVM stuff. And the tool converts LLVM stuff into .NET stuff, IL, and you just add it as a reference in your app. You get IntelliSense, you get code completion, all the functions are there. No P/Invokes, none of that stuff. It just runs. James: 19:11 Wow. And it turns it into, what, like a .NET Standard library, or just like... Oh man. Oh, I emit .NET Standard. Yeah, keeping up with the times. Frank: 19:22 So it's a full .NET Standard library, and it has zero dependencies, which is a little tricky, because C programs still have library dependencies, as in they have to allocate memory. So you have malloc, calloc, free, all that. You have all the string functions, you have I/O functions, you have socket functions, threading functions; the list just keeps going on and on. C programs, we always joke about how cross-platform they are. We don't joke, we compliment C on how cross-platform it is. But the joke is that C programs make native system calls all the time, and you have to deal with those. So it's not just a matter of converting the logic of the application; it's also implementing the runtime.
James: 20:09 Oh, I do have one question here, because this is now taking in the C code and then emitting a .NET Standard library. What I'm used to is, let's say someone gives me an .so file, they give me the header files, they give me .so files, and then I have to write the P/Invoke stuff. I'm guessing you need the actual C source code, correct? That's my first question. Frank: 20:40 Yes. In general, I wrote it to be that way. You give it C code, actually C, C++, Swift, Fortran, you know, whatever. But it can also take bitcode, bytecode, whatever we're calling the LLVM stuff today. A big chunk of my work on this was just reading the LLVM bitcode format. It's complicated, it's long; you can imagine, it's a description of an entire virtual machine. So a huge chunk of this project was just reading their input. So my program doesn't read C code. All it does is run Clang, that C compiler I mentioned earlier. It runs Clang, Clang takes the C code and outputs bitcode, I take the bitcode and I output a .NET Standard library. James: 21:28 Wow. Okay, so next question. What I'm used to here, by the way, because we just published brand new docs on using C++ in your Xamarin apps, and I just did a Xamarin Show on it, is: okay, so you write the P/Invoke declarations that you need, and then you need to embed the .so files or the native iOS libs. Do you still have to do any of that? Or are you just saying no, literally, it's going through the pure raw conversion and, magically, Frank's amazing and it just works? Frank: 22:04 Pure IL, zero native references; the only references it has are to .NET Standard. Nothing embedded. That's kind of what's so cool about it. You can disassemble it, you know, run it through any of the disassemblers, and that's how I debugged it.
I would have the C code on my left and I would have the decompiled C# code on my right. It was going through a million layers of conversion between them, but it was a simple visual comparison: is the decompiled output the same as the input? Fun. It was a fun project, honestly. So what does this mean? This means if you have any library written in C, C's the easiest one, let's think about libjpeg, libpng, libav, you know, there's tons of old C code that we all put out there, OpenCV, all that kind of stuff, that can just be shoved into .NET. I mean, turned into a .NET library. Wow. James: 23:05 That's impressive. Are you going to sell this, or what is your plan here? Frank: 23:13 I've gotten that question a lot. At first I was going to open source it, but it has taken a lot of effort, and to make it actually globally useful... I'm taking little shortcuts here, as in I haven't implemented sockets. There are things that just don't work, and I figured the amount of effort it would take for it to be a viable product is a lot. So it's a real question of: do I want to open source it or sell it? I've never sold a library or tool like this before, so I'm not super comfortable doing it, but I just might, because, you know, sometimes you want to get paid for all the work that you do. James: 23:52 It's true, you do want to get paid. But it sounds like, and I don't know if we necessarily talked about it, or maybe you did a little bit, you also then analyzed that outputted .NET Standard library against the original one, to say: is it actually up to snuff compared with just the old P/Invoking and the .so file and doing all that stuff? Or even with the raw C/C++ code? Like, how did you actually measure that, right?
Because now that you've output the .NET Standard library and you've done these magical Frank shenanigans to prove that yes, it's all amazing, and everyone should be LLVM-ing everything under the sun, and AOT and everything, and everyone should pay Frank a lot of money to do this, you know, what was your comparison, I guess? Frank: 24:41 Yeah, so this is the crux of it. This is: was any of this worth the effort? Because what I'm essentially doing is trading developer productivity, like my time, for CPU time across the world. So it's a tradeoff, right? Yeah, think about it in those terms: what's my laziness worth to the world? How much wasted energy am I adding to the world? So I did lots of performance tests. The moment I got it working, I just started performance testing the heck out of it. And we have basically three runtimes for .NET that kind of matter: we have the Mono JIT, we have .NET Core's RyuJIT, cool name, and then we have the Mono AOT with LLVM. Those are kind of our big technologies for running .NET code. So I have different performance numbers for each. Which should we go with, James, the winner or the loser first? James: 25:40 Uh, I'll start with the bottom. The loser. Hit me. Frank: 25:43 The loser. All right: the Mono JIT. Sorry, Mono. The Mono JIT is really amazing because the JIT itself is fast; it creates compiled native code very quickly. The problem is that the code it generates is not very fast. So the C code, we'll just say that's going to be our baseline here. How fast can that run? Clang, optimization level -O3, you know, full speed, let's test how fast this code can go. And when you compare that against the Mono JIT, Mono is two to three times slower. So an operation that took one second now takes two or three seconds. Wow. Yeah, not great.
So you know, you just put in a month of work, James, pulling your hair out, reading compiler books, reading computer science books, because this stuff's hard, and then you see that number and you're just like, oh god, I think I have to go back to native libraries, because I can't trade my developer productivity for a two-to-three-x slowdown. Frank: 26:47 That's just terrible. Yeah. Oh, and I should say I ran all of this with BenchmarkDotNet. Shout out to BenchmarkDotNet, super cool little piece of software. Super slow to run, but it gives you pretty accurate numbers, which is nice. I'm going to take a look at that and put it in the show notes as an open source .NET library for benchmarking. It actually is open source. Very cool. And the cool thing is, if you run it on a machine that has Mono and .NET Core installed, it'll compare the two runtimes, so you can see how it runs on each one. Super fun. That's cool. Yeah. So then I ran it on .NET Core. Want to take a guess how much .NET Core slowed down? I want to say negligible; maybe it adds 0.25%, like a quarter of a percent. Well, yeah, I guess. Oh, you're too generous. Frank: 27:42 This is versus Clang-optimized code. Maybe like just 50% slower? Actually, you're pretty close: 60%. Now, that's pretty good, because we already know Mono is 300%. Yeah, so that's pretty good; that percentage is a big deal. And honestly, this code itself is so much faster than the code I was currently using that, at that point, it was a usable product to me. That 60% over the C, that's fine, because it was already literally 20 times faster than the code I was already using. Yeah, so that 60% does not matter. So that is great news. That means the .NET Core JIT is only 60% slower than Clang with LLVM fully optimized. I mean, that's really impressive. That's good, RyuJIT. Yeah, I was going to say, now remember everyone, that's for a JIT. That's pretty impressive for a JIT.
And they're always working on RyuJIT, too. Frank: 28:43 Now, I should say there's a huge caveat here: a JIT is only as good as the code that it's given. And although I read a lot of computer science textbooks and everything, I am not emitting the most perfect code ever. I could do optimizations in my own compiler that would relieve a lot of the burden on the JIT and enable it to do a better job. Got it. Someone just had a nice blog entry where they hand-converted some C code to C#, so not using a tool like I did, but just line by line, typing it all in, which is actually something I was thinking about doing for this library too, until I saw how many lines of code it was, and then I gave up. So that's a valid technique, and I think he found that there was only something like a 15 to 20% performance difference. Wow. Frank: 29:34 So that's pretty good. Yeah, but that's handwritten, you know, not through a tool. So I kind of think of it like my tool has a 30% overhead; unfortunately, I could do better. Okay. All of that to lead up to the final point: let's look at running on Xamarin.iOS on an iPad, you know, the scenario that I actually really, really care about: AOT plus LLVM. And you know what? The performance difference was measurable, but you could not tell the difference between the C code and the .NET code, which is just amazing. Wow, that's really impressive. So that means that our Xamarin.iOS apps, if we wrote them in the style of a C app, which, you know, we don't, we use IoC and we use reflection and all that, but if we were to constrain ourselves to a C-like language, we would be running at the same performance as Xcode output, anything Swift, basically anything that goes through the LLVM chain. And so, to the question that we used to get in the very beginning, what about performance? I can now, kind of with a straight face, say there's zero performance overhead. It's not entirely true: we have a garbage collector.
It's technically doing things in the background, and a few other things. And like I said, it all depends on your style. But in a C style, there is zero performance overhead. James: 30:55 Yeah. Bit for bit, byte for byte, they are one-to-one. Frank: 31:00 Basically, yeah. In other words, we programmers add all the inefficiencies on top of that, again, for our own convenience. We use reflection instead of code generators, for our convenience. We do so many things for our convenience. We use a garbage collector because none of us wants to deal with remembering who allocated that memory, and therefore we take a little bit of a performance hit. Yeah. Ah, but it's such a relief for someone like me, who does think a lot about performance, to just have hard numbers in front of me. James: 31:36 Yeah. No, I think, I mean, it comes down to... for years, okay, for years, people have asked me, and I've seen all of these metrics and performance analyses of all the different cross-platform languages and frameworks that are out there, and they're like this and that, and everyone's running their own tests. And people would always ask me about performance, and I'd go: you know, performance is really hard to measure. What are you measuring? Are you measuring date-time conversions? Are you iterating over things? Is your code one-to-one between the two? And I think what yours says is: hey, listen, if you write to the raw C APIs and write the same C# code on iOS, those are going to run the same. Frank: 32:25 At the end of the day, yeah, that's it. It created a clean baseline. I mean, there are still, technically, the inefficiencies of my compiler. Yeah. But the cool thing was, LLVM is so intelligent it burned through those inefficiencies. I know for a fact I'm doing some things wrong, but somehow it's just like: I'm magical,
I'm just going to ignore all your mistakes and do what I think is right. Yeah. God, I love good technology. I love good technology that works. Yeah, exactly. So, was that too technical? How was that? James: 32:58 I think it's perfect. I think I now understand a little bit more of exactly what you did, a little bit more of what it means when I say the words. It only took us 10 years to get to this point where we can say it with a straight face; I think we've been preaching it for 10 years, but now we actually have that hard data behind it. And it only took Frank a whole month to come up with his little technology to prove it. Are you going to be writing a blog post about this, or how are you going to tell people about it, besides the podcast, which everybody listens to, obviously? Frank: 33:33 Obviously. Yeah, I'll definitely do a blog post. I've been meaning to work on it; Miguel just got ahead of me and I'm just like, ah, whatever, he wants to talk about it. And whether I decide to charge for it or not, I'll definitely have a free version that people can at least play around with, with small libraries, because I think it is good technology, and if you have a little bit of code that you think would do better if it were written in C, then just throw it in. Yeah, if I charge for it, I'll probably just charge for the big uses. James: 34:01 Very cool. Or, yeah, some sort of enterprise thing. You know, Json.NET forever was free, and then they had the... it is weird selling libraries, and a bunch of other things, you know? Frank: 34:12 Yeah, it's awkward, but sometimes you just want a little bit of recognition. James: 34:17 Yeah, no, I hear ya. You know, I write a lot of libraries, and people have bought me coffees, which is amazing.
I appreciate everyone who has ever truly supported me in anything that I did. And I never really thought I had a library worth putting a paywall in front of, one that was going to help people; it feels good to give to the community. But in this instance you can see the use case: I'm a huge oil conglomerate, I want to run all of my C++ code in .NET, I need to rev it, and I don't have years to build the conversion layer, go through SWIG or the other conversion tools, and all of that. It's a lot of headache, right? So something like this makes a lot of sense for that enterprise-supported scenario. That's amazing, Frank. Amazing.

Frank: 35:15 I also want to get that Swift support working, because then maybe we could consume some Swift libraries on iOS. That'd be nice.

James: 35:23 That would be pretty cool. It would just be core Swift logic, though, not iOS logic, I guess.

Frank: 35:31 It could be either, because in the native world all you say is: there's an external function out there, with a name on it, whatever it actually is. I get to choose. I'm in control.

James: 35:38 Does that mean you can bring UIKit to the world?

Frank: Oh, let's not go there. There are way too many legalities with that. Moving on.

James: All right, well, I will put a link to all the LLVM goodness in the show notes, along with everything else Frank talked about. I'm mind-blown. Anything else you want to talk about on this topic?

Frank: No, that's funny. Thanks for letting me do yet another one of the "Frank's libraries slash tools" editions of Merge Conflict.

James: I mean, that's our show at the end of the day. Let's be honest: Frank is way better at life than me, and at development.
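Frank's description of declaring "an external function out there, with a name on it" is, in C#, the P/Invoke mechanism: you declare an `extern` method, attach the native symbol name, and the runtime binds it when the call is made. A minimal sketch; the `libc`/`getpid` pairing is an assumption for a Unix-like host (the library name varies by platform: for example `libc.so.6` on many Linux distributions, `libSystem` on Apple platforms), and on iOS you would point `DllImport` at the system library or your own compiled Swift/C library:

```csharp
using System;
using System.Runtime.InteropServices;

class Native
{
    // "There's an external function out there, a name on it":
    // DllImport declares the symbol; the runtime resolves and binds it
    // at call time. "libc" is an assumption for a Unix-like host and
    // may need to be adjusted per platform.
    [DllImport("libc", EntryPoint = "getpid")]
    public static extern int GetPid();

    static void Main()
    {
        Console.WriteLine($"native getpid() returned {GetPid()}");
    }
}
```

`EntryPoint` is what makes "I get to choose" literal: the managed method can have any name you like, as long as the native symbol it binds to matches.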
So it is your show, my good friend, and I appreciate all of the hard work that you do, not only in the community but on these crazy challenges.

Frank: Apparently. I feel like I was the real reason that this came about.

James: 36:29 You're welcome. It's unfortunate that I listen to you so closely; I really should learn to ignore you better. But that's why we're best friends: I listen to you, you listen to me, and people also tend to listen to us, and I appreciate all of them. You can find our podcast at mergeconflict.fm, and you can follow us on Twitter at @mergeconflictfm, @praeclarum, and @jamesmontemagno. One thing we did want to talk about before we get out of here, talking about .NET: our good friend Jon Galloway of the .NET Foundation has been tweeting a lot about these elections.

Frank: The .NET Foundation, yeah. You know about the .NET Foundation?

James: Well, I've always been a little bit behind; I've never fully understood it. I think we've mentioned that on the show before, but it's a foundation that helps, what would you say, finance slash support very important projects in the .NET community.

James: 37:25 And they're having an election this year, because up till now it's been kind of a Microsoft-controlled board, but they're opening it up to the community, and we have people running for offices.

Frank: Running for offices?

James: I think so, yeah. They support a bunch of projects, user groups, and meetups; they're a nonprofit. They support a whole bunch of active projects, some 462 repositories on GitHub. So there are elections, at election.dotnetfoundation.org, and there's a bunch of candidates, like 50 of them, running to be on the board, and there's a bunch of seats. You do have to be a member of the .NET Foundation to take part. I am.

Frank: I'm not; I should probably look into that. That's not that hard either.
But it looks like the qualification is that you have to have contributed to one of the, what did you say, 500 different libraries out there.

James: 38:19 Chances are, if you're active at all in open source, you probably did; it's just a matter of looking through the repositories.

Frank: Yeah, I'm pretty sure I have.

James: If you've done anything in the Xamarin ecosystem, that's probably in there too. So they'll do this every year, and the board members serve for a year, and they help lead the future of where the foundation is going to go. They each have a little page; I'll put links in the show notes, not only to the .NET Foundation but also to the election. The elections run until the 28th, and since this episode is out on the 25th, we wanted to let everyone know, if you're listening to the very end.

Frank: Very cool. Community duty. I think it's really nice that the community is getting involved in running this thing. It's exciting times.

James: 39:04 I think so too. Head over to election.dotnetfoundation.org to learn more, and of course head over to mergeconflict.fm to subscribe to the show. Share it with a friend; tell everyone how you flipped on LLVM in your app and now everything's amazing, and you're like, whoa, so cool, thanks Frank and James. And of course, if you want to become a Patreon supporter, you can go to our Patreon support page, the little tab up there. And if you just want to come chat with us, head over to our Discord; there's a link on mergeconflict.fm. Thanks again to Instabug for sponsoring this week's pod, and that's going to do it for this week's Merge Conflict.

James & Frank: 39:41 I'm James Montemagno, and I'm Frank Krueger. Thanks for listening. Peace.