Paul: Hi there and welcome to PodRocket. I'm your host, Paul, and today we're joined by Obinna Ekwuno. Obinna is a developer advocate. He speaks around the world about developing on the Edge and all sorts of interesting things that pertain to the way we're building applications at a large scale today. Welcome to the podcast, Obinna.

Obinna Ekwuno: Hey, thanks for having me.

Paul: Yeah, we're excited for this one because we're going to be getting into some more nitty-gritty of the Edge and some of the implications it has on how we build applications. I think a lot of people talk about the Edge. It's a popular topic. There are a lot of products that have to do with the Edge. You have D1 from Cloudflare — I was doing some Cloudflare research — you have Workers from Cloudflare, then you have Lambda and Vercel and all these things. But what are some of the design decisions that impact whether this is useful? Is this a decision that we should actually be thinking about seriously? Really looking forward to getting into all this. You're an Edge person, right? You like talking about the Edge? You gave a talk recently.

Obinna Ekwuno: Yeah, yeah. I gave a talk about using data and how it's handled in this Edge paradigm, and my own thoughts on some stuff that I've seen. I did that at Jamstack Conf literally two days ago.

Paul: At Jamstack Conf, were people interested in the nitty-gritty of what you were talking about? Because I feel like if you're a naysayer of Edge products, you can quickly receive a lot of hate.

Obinna Ekwuno: No, no, not really. I think the thing about the Jamstack space, or even the technology space at this time, is that what we're seeing is the growth of new technology and how people are building stuff around these new technologies. For example, Edge computing didn't just come out today. It's been around for a while, and there have been a lot of research papers. As I was putting this talk together, I saw some IBM research papers on 5G, the Edge, and what instantaneous data sourcing would look like. In the Jamstack space, we rely a lot on serverless technology, and now there's this new move to Edge functions, which promise some computation and storage benefits and lower latency. There's not a lot of pushback — more confusion about what this thing is, which is where I like to start, because when you say "the Edge," everyone acts like there's some ominous music playing in the background, da da da da, which is why I like to start by explaining what the Edge is.

Paul: Yeah. Yeah. That's the next thing I was going to get into. I feel like we explain what the Edge is all the time on this podcast, and I really like the way that you approach the question. Obinna, please take us away. What's the Edge?

Obinna Ekwuno: We call it the Edge, but it's actually coined from Edge computation, or Edge computing, which essentially means that data storage and computation are brought closer to the place where they're needed. So for example, if you were building on the web previously, you would deploy applications to different regions, which is where we started: regional deployments of AWS Lambda, Netlify Functions, and the usual serverless functions.

Paul: Like your us-east-1 or your us-west. Is that what you mean by regional?

Obinna Ekwuno: Yeah, yeah, yeah. Regional. Yeah.
All of those different locations. But now the Edge says, what if we can bring this information a little bit closer to the user, so that as you're developing, you're able to deploy your application once and it will be replicated across this content delivery network that's handled by someone else? Which is essentially the same idea as serverless: deploy your functions and the functionality that you need to this space, and then let someone else handle the scale for you. In this instance, the Edge is cool because you have this information living closer to your users, so that's one of the major advantages.

Paul: Yeah. One question I have is, when you say I'm bringing my information closer to users, we're talking about geographical closeness here, because we're talking about [inaudible]. So we're talking about geographical closeness. And then second off, what is information? If my website is my developer homepage, which is just HTML, my information is a big HTML string that can be put anywhere. But if we're talking about running a bigger application, I don't even know what that means. What is information, and what does it mean to be close?

Obinna Ekwuno: Yeah. The most general use case for it, right, like you mentioned, is runtimes like Cloudflare Workers or Deno Deploy having these servers around users and stuff. The thing that I like to point out is that, as with every other technology, just because this has come out doesn't mean that you should change how you're building stuff. It's usually a case-by-case situation. For example, there's this tweet by Jim... I suck at pronouncing last names, but...

Paul: Nielsen.

Obinna Ekwuno: Nielsen, thank you. There's this tweet by Jim Nielsen that I saw, and I asked him, "Yo, can I reference this in my talk?" because it literally says sometimes it might just be faster to send less over the network than to increase geographical proximity. Sometimes just putting your application on the Edge doesn't make it faster. It does bring you closer to the user; however, sometimes you just need to reduce the amount of stuff that you're shipping.

Paul: It makes the delivery faster.

Obinna Ekwuno: Exactly. If you don't need to deploy stuff on the Edge, don't do it. Put your static files and all of those things somewhere that makes them easier to ship over your existing networks, and your users will be able to get that information faster. The Edge is not a silver bullet. If it's important content coming from a single origin, then just render it straight as HTML; that's probably the way to make it faster. You don't need to do everything on the Edge. There's also this very cool article by Chris Coyier, who is the CSS-Tricks guy. He literally calls it "Once more unto the Edge, dear friends." It's a bit poetic, but I like the...

Paul: It is poetic, though. The way he writes is poetic.

Obinna Ekwuno: Yeah. It's "once more unto the Edge, my dear friends."

Paul: I think, on what you're describing about making your application faster by putting it on the Edge, the best analogy that I've heard to date is: say you're trying to render a video. You're a video maker as well, Obinna — you're a weekend filmmaker, right? That's one of the things that you said.
Okay, so you're trying to render something on your MacBook Pro and it sounds like a jet engine, and you're like, "Dang, I just need a faster computer to run this app faster." So you order from Best Buy, and they're like, "No worries, we're going to ship it with expedited delivery," and then they send you the same laptop and you're like, "Wow, that's great." The thing's still slow. I don't know what to tell you.

Obinna Ekwuno: I have this analogy that I'm working on — I've heard it in different places. A colleague of mine, Sunil Pai, had this analogy of the Edge as this big squishy ball with a lot of nodes, and the information doesn't live on one of these nodes, like points on a map; it lives everywhere at the same time. But it's a squishy ball, so it could be anywhere. It's a big ball, a smaller one... I'm doing stuff with my hands. You can't see this.

Paul: You're just making a ball. For all you listeners, he's creating a snowball; just picture that.

Obinna Ekwuno: I'm creating some anime, some Naruto [inaudible]. But the point is, you have these different points, so your example now would be: when you order something from Best Buy, they would bring it from the closest possible store to your house. For example, I'm in London and I order something on Amazon. Amazon wouldn't ship me something from Singapore if they had it in their store in Greenwich, close to where I live. The point is just bringing the content close.

Paul: On the topic of JavaScript: "ship less JavaScript" is the mantra of the day and age right now, and it's great that the problem is being tackled and that the types of solutions coming out are really compelling. What is one of the ways that you see this problem being tackled that you think is most interesting, that maybe you've used or read about?

Obinna Ekwuno: It's funny that you say this, because I literally just had a call with a friend and we were talking about these new frameworks and how their selling point is more, "you don't have to ship all this JavaScript." You can reduce the amount that you're shipping, or be more conscious about your use case and what works best for you. One of the frameworks that really excites me these days is Astro, because its whole selling point is that you don't need to ship this much JavaScript, and you can also play with other frameworks that you like within Astro. I think the idea of shipping less JavaScript to the browser will always remain. I used to have this Homer Simpson evolution diagram at the beginning of most of my talks about the Edge or the Jamstack, because I feel like we're finding new ways to do the same things. Again, referencing Chris Coyier, he had this article that said if you put HTML on a page and render it for different people, that's the Jamstack. It's like 1993 all over again. Basically, we're doing the same things but finding better ways, and we'll keep finding those better ways of reducing the amount of JavaScript we're shipping, or trying to make it more performant. But my thoughts on performance are a whole different conversation. That's like optimizing: when should you optimize? Why are you optimizing?

Paul: Why you should optimize — that's a big question. Everybody wants to optimize, but it's like, what does that mean? Who's your user in the end? You need to optimize for them. That's a whole other conversation, I agree.

Obinna Ekwuno: Yeah. Also, we still ask ourselves, who's your user in this space as well?
Everything you're doing is mostly to improve user experience. So in the case of the Edge and the way we deploy applications now — Deno has this cool framework called Fresh, which I think is fully Edge compliant, and you also have Cloudflare Pages, which you can use to deploy actual applications on the Edge — it's usually concerned with who your user is and what you want your users to gain from it. Anything we're doing is always centered around improving user experience, which is something that really excites me now, because before you would have products where it's like, this product doesn't work, this one works for me. Now the problem we're facing is more about what makes users feel awesome. I can use three different things to do the same thing, but what's my preference? Companies are no longer [inaudible]. It's no longer about who has more resources and stuff. It's more about who has the better developer experience, which is amazing to see.

Paul: It's amazing to see and be a part of that as a developer, because you're like, "You're making it special for me. That's so nice."

Obinna Ekwuno: Yeah, you thought about me. That's nice.

Paul: What do you think is one of the largest misconceptions about the Edge? And I don't want to pigeonhole us into the silver-bullet thinking of "I just use the Edge and it makes my app faster." Are there some situations you can think of where the Edge was thought about or engineered in maybe a suboptimal way, or a common design pattern where people don't use the Edge correctly?

Obinna Ekwuno: Yeah. I can't really place an example right now, but I do know that there's a misconception that now that we're on the Edge, we don't need regions anymore. The truth is that one doesn't cancel the other. In my opinion, these things work hand in hand in some way, because at the end of the day, the most obvious difference between what we call the Edge and what we call regions is that regions are separate: they exist on their own, get deployed to, and share information when needed. The Edge, however, is all interconnected, and the way the runtimes work is a bit different. It doesn't mean the end of the cloud or any of that, which is great.

Paul: The regions aren't going away. They work with the Edge.

Obinna Ekwuno: Yeah. They work with the Edge; they are the Edge. It's like, "Luke, I am your father." The Star Wars... yeah.

Paul: Really quickly, for any listeners: if you want to learn more about how to ship less JavaScript to the browser — and Obinna, have you heard of Qwik?

Obinna Ekwuno: Yeah.

Paul: Yeah. That's a really cool one, and we just filmed an episode with the creator of Qwik, so learn about it. It has really amazing ways. What they did is they quite literally figured out, if this website is going to run, what is every single atomic action that's going to happen within the runtime, and then they can stop, start, and pause it, and selectively load islands as you need them. That's a really compelling tool. Go check out that episode. We have other episodes on low-JavaScript approaches and island architecture. It's all part of this dialogue about how we create a better user experience.

Obinna Ekwuno: I should definitely check out that episode. I have heard of Qwik, or read a starter guide for it, but I haven't really used it to build anything. They talked about it at Jamstack Conf as well, I think — the CEO of Builder.io was there. Qwik is a different...

Paul: Yeah, that guy. Yeah, yeah, exactly.
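To make the "deploy once, run close to the user" idea from this part of the conversation concrete, here's a minimal sketch of a Cloudflare Worker-style edge function. The handler shape follows the Workers module syntax; the response fields are illustrative placeholders rather than anything built on the show, and the `request.cf` metadata (data center, country) is what Cloudflare populates at whichever location answers the request.

```typescript
// A rough sketch of an edge function that reports where it ran.
// The same code is deployed once and executed at the data center nearest the visitor.
export default {
  async fetch(request: Request): Promise<Response> {
    // `cf` is only present when the code actually runs on Cloudflare's network.
    const cf = (request as Request & { cf?: { colo?: string; country?: string } }).cf ?? {};

    return new Response(
      JSON.stringify({
        message: "Hello from the edge",
        servedFromDataCenter: cf.colo ?? "unknown", // e.g. "LHR" if answered near London
        visitorCountry: cf.country ?? "unknown",
      }),
      { headers: { "content-type": "application/json" } },
    );
  },
};
```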
Paul: What about data storage? Because when we started these episodes, one point I kind of made was: you bring your data closer to the users. If it's my personal HTML page, my data is just an HTML string and that can go anywhere. If it's an application with a database, that's different. Maybe if I build a Next.js app, okay, there's some server side, there's some client side, but it can be transpiled, it can kind of be pushed to the Edge. That makes sense to me. But what about bigger applications? There are databases, there are caches. How do we think about the Edge there? Are there Edge databases? How do you wrap your head around this big dark pool? This is the bottom of the iceberg, in my head.

Obinna Ekwuno: Yeah, this is something that also kept me up at night the first few times I heard about the Edge. Cool, we have our client deployed on some Edge server. Cool, we have our functions. Some APIs we can do that with, depending on the runtime you use, and all of that is nice. However, what about the database? Where's the information coming from? Applications and functions can live on the Edge — that's dope — but it's not good enough. Just imagine a simple application where you're sending a request to some API that's probably living on the Edge, and the clients are living on the Edge as well, but you have your database deployed to a region. That function is still going to make that round trip to get the information back to you.

Paul: Every time.

Obinna Ekwuno: Yeah. The truth is, if you can't bring the data source as close as possible to the Edge node — whichever Edge node you're being called from — then the advantage is lost. If my data isn't cached somewhere, or if the user isn't close to where the request is served, then it doesn't help. One of the solutions people have used in the past, and obviously something we still use, is caching. We're like, "Okay, cool, if I have a region somewhere, I can cache some information in an Edge cache and then use that to feed the user the information as soon as they need it," which is fine. However, it creates a lot of issues. What do you cache? Where does it even end? There are different Edge stores — Workers KV, for example, which is a key-value store you can use for caching. There's also the D1 database, like you mentioned; I feel like we should talk about that exclusively. The point is, people have found ways to fix the issues on a use-case-by-use-case basis, but the issue is how some databases were built and what they were built for — a bit different from how we're using them today.

Paul: One thing you mentioned was D1. This is another Cloudflare product, a distributed database on the Edge — I don't want to say distributed, it's an Edge database, SQL-based — is that correct?

Obinna Ekwuno: Yeah. D1 is Cloudflare's Edge database that's built on SQLite.

Paul: SQLite.

Obinna Ekwuno: Yeah, SQLite.

Paul: When we say this database is on the Edge — if I run a Cloudflare Worker there and I use a D1 hookup, there's probably going to be a database co-located, maybe even in that same data center.

Obinna Ekwuno: Yeah. The thing about building on D1 is that because it's built on Cloudflare's network, what happens is that you have a master database with read-only replicas on the Edge network, and then you can have information sourced from those places. However, when you make a new write, the request goes to where the data is, and it's going to have to replicate across.
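Before the D1 discussion continues, here's a minimal sketch of the edge-caching workaround Obinna describes: a Worker that checks Workers KV before making the slow round trip to a regional origin. The `CACHE` binding and the origin URL are hypothetical placeholders; the get/put calls are the KV runtime API, with the binding types assumed to come from @cloudflare/workers-types.

```typescript
// A minimal cache-aside sketch: serve from the edge when possible, fall back to the region.
interface Env {
  CACHE: KVNamespace; // hypothetical KV namespace binding
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const key = new URL(request.url).pathname;

    // 1. Look for a copy already stored at the edge.
    const cached = await env.CACHE.get(key);
    if (cached !== null) {
      return new Response(cached, { headers: { "x-cache": "HIT" } });
    }

    // 2. Cache miss: make the long round trip to the regional origin.
    const originResponse = await fetch(`https://origin.example.com${key}`);
    const body = await originResponse.text();

    // 3. Store it at the edge for the next visitor, with a short TTL so it can't go stale forever.
    await env.CACHE.put(key, body, { expirationTtl: 60 });

    return new Response(body, { headers: { "x-cache": "MISS" } });
  },
};
```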
Obinna Ekwuno: You have one master that's replicated across, so you can do fast read requests, and when there's a new update, it replicates across as well. From what I know so far, it's in alpha now, so people can actually test it and use it to build stuff; there are a lot of articles and demos and stuff that you can build from. I actually know the engineers building these things, and if I do say so myself, they're some of the most brilliant people I've ever met. I see a future here, because Cloudflare has done a thing in the past called Relational Database Connectors, and it was pretty interesting what they did. The thing is that, with databases, we've established that databases are needy in some sense: they require a long-lived TCP connection. However, TCP is a transport-layer protocol, and when you have the database expecting a TCP connection while the Workers runtime is built on HTTP, you have that clash. What Cloudflare did was really interesting, because they have this solution called Cloudflare Tunnel, which is essentially powered by this thing called cloudflared. What I used to use it for is, if I had a local testing environment or I was building something locally, I could create a secure HTTPS tunnel and send it to you, and you would have an actual HTTPS version of that application to test on your machine. It creates secure connections. What they did was put that tunnel between a Worker and a database, which creates a network that allows your database to be spoken to over TCP and your Worker to be spoken to over HTTPS, and then both of them can just have a conversation. It's kind of like having a translator, which was really cool. This was before D1. That still exists, and it was a WebSocket-based fix to have existing databases talk to Workers as is. However, that wasn't an Edge solution; it was just to help Workers speak the language of databases. Now, with D1, what they're doing is building on that Edge network with SQLite — and I don't know how much people know about SQLite, but it was one of the first serverless databases, serverless before this serverless paradigm — and that helps move the needle a bit forward towards having an actual Edge database.

Paul: Do you think we're going to see databases like this becoming the de facto 80 percent use case product that people reach for when they're trying to create an app?

Obinna Ekwuno: Yeah, for sure. From the beginning, we've been trying to make the world faster in every way: by shipping less JavaScript, by improving usability, by everything we're doing. Essentially, when you refactor some code that you wrote four months ago to do less and be more optimal, it's to improve usability in some sense. You can see the shift in how other companies are thinking about databases and how their current infrastructure plays in. For example, you can see support for databases in Edge functions. One of the ones that really excites me is PlanetScale. When you hear the name PlanetScale, you're like, these people are going to be everywhere, and they introduced this thing called the PlanetScale serverless driver for JavaScript, which is essentially an HTTP-based API that supports their global routing infrastructure. This new infrastructure enables global routing, similar to how CDNs work, and reduces latency.
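Stepping back to D1 for a moment: here's a minimal sketch of what querying it from a Worker looks like, to make the "co-located database" idea concrete. The `DB` binding and the `posts` table are made-up examples; the prepare/bind/all pattern is D1's documented client API, but the whole snippet should be read as illustrative.

```typescript
// A minimal sketch of reading from D1 inside a Worker.
interface Env {
  DB: D1Database; // hypothetical D1 binding name
}

export default {
  async fetch(_request: Request, env: Env): Promise<Response> {
    // Reads like this can be answered by a nearby copy of the data;
    // writes go back to the primary and then replicate outward, as described above.
    const { results } = await env.DB
      .prepare("SELECT id, title FROM posts WHERE published = ? LIMIT 10")
      .bind(1)
      .all();

    return Response.json(results);
  },
};
```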
Obinna Ekwuno: It's essentially something you can install into your existing PlanetScale client that allows you to choose between regions based on where these functions are being called from. They added this to support things like Cloudflare Workers, Vercel Edge Functions, Netlify Edge Functions — everyone has an Edge function now, which is cool — and databases are seeing that people want this to improve. I even heard of a new one at Jamstack Conf, which is Convex, I think. There's also Zapa, for example. A lot of databases are trying to adapt their infrastructure, because until this came out, in [inaudible] environments you'd have to have that TCP socket protocol. However, this new driver supports HTTP, and that supports the Fetch API.

Paul: You could pretty much use it in any Edge product, I guess, if you wanted to, because it speaks application-level HTTP.

Obinna Ekwuno: Yeah. I might be in a bubble, but the two major drivers of the Edge that I've seen are obviously Cloudflare and Deno. You can have more — I haven't looked into it, but CockroachDB as well, I think. There are also some databases that were positioned or built in an Edge-first way — Fauna or MongoDB, for example, or Prisma — set up in a way that allows users to query data over this globally distributed network. There's a lot of support for it now, but most of the people I see driving this are Cloudflare and Deno, because Netlify Edge Functions run on Deno. Another thing that's really exciting is just seeing how open companies are about the technologies they use to make these things possible.

Paul: Yeah, it's a new shift, for sure.

Obinna Ekwuno: Supabase as well. How could I forget about Supabase?

Paul: How could you forget Supabase?

Obinna Ekwuno: I mean, how, man? Yeah, they're also doing some really amazing work, and they also have Edge Functions powered by Deno, so that's really cool.

Paul: That's a really new thing from them too, just in the past few months.

Obinna Ekwuno: Yeah. The Supabase team is always shipping; they're always doing something to improve the developer experience, which is something that makes everyone excited, because again, like we said, they're thinking of me. These are developers building for developers, so I'm like, "Shucks, thinking of me. Thank you."

Paul: To close out our conversation about what the future of the Edge is going to be: if people want to follow you and hear more from you, are you on Twitter or Medium or anything? Do you write or blog anywhere?

Obinna Ekwuno: Yeah, I'm on Twitter at ObinnaSpeaks, and I also have a blog. I need to write more stuff, to be honest. I usually share more in conversations or when I'm making a talk, but I need to go back to blogging more, because I used to write technical articles a lot on LogRocket, which is really cool.

Paul: No way.

Obinna Ekwuno: Yeah. I have 20 articles on LogRocket, so I need to go back to writing some more.

Paul: Check out the LogRocket blog then if you want to see Obinna's work.

Obinna Ekwuno: Yeah, yeah, yeah. Check it out — not just my work; there are a lot of amazing authors on there. Really cool. Check me out on Twitter. I have a website, ObinnaSpeaks.dev.

Paul: Obinna is O-B-I-N-N-A. Two Ns.

Obinna Ekwuno: Yeah. O-B-I-N-N-A. Two Ns, and then Speaks.

Paul: Awesome. Thank you for your time, Obinna, and for musing on the concepts of the Edge with us today.

Obinna Ekwuno: Thank you. Thank you so much for having me here.
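As a closing illustration of the fetch-based database access discussed above, here's a minimal sketch using the @planetscale/database serverless driver from an edge function. The credentials, environment bindings, and `products` table are placeholders; the point is that queries travel over HTTP rather than a raw TCP socket, which is what lets this style of driver run in Workers, Vercel Edge Functions, Netlify Edge Functions, and similar runtimes.

```typescript
// A closing sketch of querying PlanetScale over HTTP from an edge runtime.
import { connect } from "@planetscale/database";

interface Env {
  DATABASE_HOST: string;     // placeholder connection details
  DATABASE_USERNAME: string;
  DATABASE_PASSWORD: string;
}

export default {
  async fetch(_request: Request, env: Env): Promise<Response> {
    const conn = connect({
      host: env.DATABASE_HOST,
      username: env.DATABASE_USERNAME,
      password: env.DATABASE_PASSWORD,
    });

    // Queries are sent with fetch() under the hood rather than a long-lived TCP connection.
    const { rows } = await conn.execute("SELECT id, name FROM products LIMIT 10");

    return Response.json(rows);
  },
};
```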