AUDIO EDIT March Panel === [00:00:00] Paige: Welcome back to PodRocket, a web development podcast brought to you by LogRocket. LogRocket provides AI-first session replay and analytics, which surface the UX and technical issues impacting user experiences. Start understanding where your users are struggling by trying it for free at logrocket.com. Hey everybody, I'm your host today, Paige Niedringhaus. I'm a lead software engineer at the startup AllSpice, and we're back with another panel episode where we'll be talking about coding outages, layoffs, and autonomous coding agents and doom. But before we get into it, let's welcome our panelists. We have Jack Herrington, the Blue Collar Coder on YouTube and also a co-host of the Front End Fire podcast. Jack: Yeah, hey Paige. How are you doing? Paige: Glad to have you back, Jack. Jack: Good to be here. Paige: We have Paul Mikulskis, a PodRocket host and YouTuber. Paul: Hi, happy to be here. Paige: And we have Noel Minchow, a software engineer at LogRocket. Glad to have you, Noel. Noel: Yeah, happy to be here, Paige. Paige: All right. So the first thing that we wanted to talk [00:01:00] about is coding outages, layoffs, and the retention crisis that seems to be happening in the tech industry. The past few weeks have brought about a reckoning for the ship-faster-with-AI era of software development. Amazon has mandated that 80% of its engineers use its internal AI coding tool Kiro weekly, and within months, two major outages struck its North American marketplace in a single week. The first was on March 2nd. This generated around 1.6 million errors and 120,000 lost orders, and then three days later, the second outage caused a 99% drop in orders, resulting in 6.3 million lost transactions.
Amazon has since launched a 90-day code safety reset across 335 critical tier-one systems, which now require dual review or approval for any code changes. Amazon disputes that the AI-written code was the direct cause, saying that the incidents stemmed from process failures, but the timing and the scale have made the story impossible to ignore. Meanwhile, Atlassian has announced significant [00:02:00] layoffs as part of an AI push, and a Forbes piece this week put a name to something that many teams are quietly experiencing: a retention crisis driven by engineers who feel their craft is being devalued by vibe-coding culture. So the first question for the panel: Amazon says that this was a process failure and not an AI failure. Do you think that distinction matters, or does the fact that AI made it faster and easier to make catastrophic changes and mistakes at scale tell the real story? Jack: I definitely think that it was an AI problem. If you had not a lot of these incidents before and then you suddenly have a lot, yeah, I can totally understand where they're coming from. When I do a lot of this vibe coding stuff, I'm not a hundred percent always checking the results. And the weird thing to me is that Kiro in particular, that particular IDE, actually enforces the Amazon process all the way through, and I'm just a little surprised that this has happened, because that Amazon process [00:03:00] is very heavy in terms of planning, and there are reviews of those plans. It's a pretty heavyweight tool; it's not like your usual vibe coding tool. I'm kind of surprised. Noel: Yeah, I don't know. I feel like there's no way Amazon comes out with a headline that, oh yeah, we vibe coded this and broke something, right? Of course not. We would never hear that in a million years.
It's hard to say, I guess, because we don't know. To varying degrees, across different parts of the stack and architecture and how teams interact with them, there's that whole loss of context on code and what's happening. So even if a human wrote this line, I feel like one could still say, yeah, but they were integrating against a service, or modifying a code base, that had been vibed. They didn't have as good a grasp on it as they did before. But I think these kinds of issues are going to increasingly be inseparable from process. How does your team understand your code base? [00:04:00] How are changes made? What does the testing look like? That kind of is the process, and the new process is determining what kind of context humans have versus your agents. So, I don't know. Paul: They're very intertwined. It is process, it is AI. If you look at organizations at scale that are doing this correctly, or arguably correctly, Stripe came out with a great blog post about their minions. It's a two-part blog post; I recommend everybody read it if you really want to see how this is being done at scale. For them to make changes, it's not just the IDE; they have a whole process architecture for how agents hand off and then go to review. And I'm sure Amazon had a flavor of that, but if it doesn't match the organizational need, of course it's not going to work. And I would consider that process, I guess: how you're redesigning the harness. Everybody at that scale needs a custom harness, realistically.
So yeah, I feel the same: they're inseparable. Maybe they're not one and the same at the end of the day, in the way we tackle and address these issues, but right now it's so murky that they kind of feel one and the same. So, good job, [00:05:00] Amazon, I guess. That works as an excuse; you sold me. Sure. Jack: I gotta say, I don't know about you all, but I tend to do multiple things at one time, because I'm waiting for an agent to do its stuff. And so it's like, okay, do I want to practice guitar, or do I want to go fire up three other agents doing three other things? And that kind of multitasking slash process switching isn't particularly ideal for keeping your eye on the ball of a particular feature. And I think sometimes I have that uh-oh moment: where was I with this? Noel: Yeah, it's context switching, but you're kind of forced into it. You fire off an agent to perform a task, right? And it's like, well, I could just sit here and twiddle my thumbs, or I could go kick something else off. But with any option, anything other than sitting there staring at the ceiling, or watching the little Claude icon do its little dance thing, you're going to lose some context as you start paging other stuff in. It's a tough one. Paige: Yeah, I kind of feel the same way. I don't think the distinction really [00:06:00] matters, because like you were saying, the process is AI at this point, and whether we like it or not, we've intertwined the two so closely that you just have to figure out what happened and be able to debug it.
And that's the hard part: we have less of an understanding of the code that was written, because most likely we didn't write it. So figuring out how to fix it, or undo it, or bring back up whatever got deleted takes longer. Hopefully an agent can help with that as well, but it's not always going to be the case. Paul: I think it also just points to how important harness engineering is as a practice. There's prompt engineering, context engineering; it really comes down to harness engineering. You're becoming a process pipeline for an organization. And I've seen more and more people talking about this, memorializing harness engineering as a real thing, because it really is just a flavor of platform engineering. People don't see how deep the iceberg goes. If you want to use these tools at scale effectively, you need that discipline, and you need to be serious about it. I'm sure [00:07:00] there's going to be a lot of misalignment between where the org is trying to go and how much they actually need to invest, because they're like, oh, we're spending money on AI. Jack was joking before we got on this podcast: how many tokens are you spending per month? That's a real thing. That's a real metric that organizations are paying attention to, to see what the conversion is. But that's like half the story. Are you doing the platform engineering? Are you doing the harness engineering? Are you getting somebody that knows how to connect systems responsibly? This Amazon story is just the beginning of companies going, oh yeah, we counted the tokens. Wrong thing. Jack: And if they count the tokens, then it encourages you not to twiddle your thumbs, right?
In order to get up to those kinds of token levels, you need to have four or five of these agents running simultaneously. How do you track that? How do you monitor that? There's no course that I've seen that talks about this stuff, and it's all just kind of anecdotal, as you say. There are folks out there who are like, oh, this harness is the bomb, or this meta-harness can run multiple sub-harnesses and sub-agents, and you're like, what? This [00:08:00] entirely new field of dealing with all this is like three to six months old for a lot of folks. Paige: One of the things that the Forbes piece argues is that companies that are leaning hardest into vibe coding are losing some of their best engineers. Do you think that is the case? Do you think that there are particular engineers who are staying versus leaving? Paul: Let's say you have 10 great engineers, and then this vibe coding thing comes along. You're always going to have somebody who's a thought leader in a group; that's just a human principle of organization. You're always going to have somebody that rises to the occasion of being the technical owner, because they thought something was interesting and everybody starts to gravitate toward them. There's always going to be an engineer that is still designing and architecting and being creative and leading how we're developing the harness, because that's going to change every three months as the models change. There's always going to be real engineering. That said, though, I think I'm seeing the other, say, seven engineers on the team: instead of their brains' gasoline going to creative, boots-on-the-ground [00:09:00] coding work, they may be siphoning off of the idea faucet of the thought leader.
And then being told to orchestrate agents. So are they just going to leave? Is this organization dried up and rotting? No, you're still going to keep the great engineers, and arguably they're going to have more creative time. It's just that the surface area, how many roles can slot into that type of persona, will be squashed and minimized. You're going to have one or two thought leaders orchestrating work to the rest of the folks, versus everybody working on those really intelligent, honestly fun-to-solve problems. Just compression. Role compression. Jack: Yeah, I can certainly see it from the human standpoint. It used to be one of those things where you'd think about it like climbing a mountain, right? Oh, I've got this thing that I want to build, and it takes forever to build it, and you're really proud of it, and you feel a sense of pride and ownership because you handwrote it, or used tab completion or [00:10:00] whatever, and put it all together. And then you get to the top of that mountain and you're like, yeah, look at me, I did it, this is great, and you feel that sense of pride. Whereas AI development is like a helicopter: it just drops you off at the top of the mountain, and you're like, oh, I guess I did it. Yay me. Okay. There's not that sense of satisfaction. Paige: Yeah. I think that if you were an engineer who was not into the AI thing, you may have already left the workforce, if that was an opportunity for you, if that was a possibility you were entertaining. If you were an engineer who'd been in this for 20 or 30 years and you were like, AI is not the way I want to go, I think you may have just retired out of it at this point.
I don't necessarily know what you're going to do; maybe you're going to go do something completely different. I think that's a lot of people's choice now: they're just like, I'm done with the tech industry, I want to go run a coffee shop, or have a bookstore, [00:11:00] or be an electrician or a plumber. I think people are reevaluating a lot of what they want to do, because as engineers, a lot of us get, like Jack said, that sense of satisfaction from doing something really hard, from the time that it takes to get there and the knowledge and the expertise required to build something good, or something that we care about. And yeah, AI takes away a lot of that, because you can prompt your way to it, and you didn't actually write any of the code that does the thing, that solves the problem. So yeah, there's less sense of ownership, less sense of satisfaction. Jack: Yeah. Satisfaction. Exactly. Yeah. Paul: And I think one reason why you can feel negativity, or dread, in our field is because, when it comes down to it: oh, the creativity? It's fine. Nobody cares. In the real world, nobody gives a crap about how much you liked solving your problem. Do you move the business far, and do you move it further? If you [00:12:00] do, great, and if you don't, sorry, you lose. And it's like a reckoning of: was this ever real work? Well, it was, because of the product of humanity and where we were in technology. Yes, it was real work. Now, is it real work? Arguably that surface area has drastically reduced, and I think it's that feeling of people going, wow, my work no longer is real work. And there is some validity to that. It's just, sorry. Noel: Yeah, I guess I don't know if this is a counterpoint, but
in a lot of the discourse I'm seeing online, I can't speak to whether people are quitting and finding different teams or whatever. I just don't have data, or even strong anecdotal evidence, there. But I think there's this sense that we had all these practices and ways of writing code. We had this code review process, and there was an expectation of people knowing things and having a firm understanding of every change they make, and of which abstractions were good, like when you'd remove redundant code versus leaving it to ensure [00:13:00] separation of concerns. And it's kind of like, okay, we've decided none of that actually matters, and we're all just going to play fast and loose now. And I think that is blowing up in people's faces, like we were talking about. Jack: Yeah, Amazon. Noel: Right. So there's this kind of strife, I think, that a lot of people are feeling, where it's like, hey guys, wait a minute, are we throwing the baby out with the bathwater here? In trying to move faster, we're developing systems that we can't really maintain, or we hit this diminishing-returns point where no one really understands this, and we try to touch any of it and it breaks, or we end up having to redo the whole code base and we're crossing our fingers again that we vibed this piece correctly this time. So I think there are even devs that are trying to embrace the tooling, but they're not seeing any kind of balance for the tribal knowledge that had been gained over the last, whatever, 20 years of software development.
That frustration, I think, is causing a lot of people to feel this burnout. Paige: Yeah. And it feels like we know [00:14:00] what the bottleneck is now, and it's not writing the code, it's reviewing the code. Noel: Yeah. Paige: And people just cannot review faster, because we're human. We can't spin up subagents; we can't have multiple parallel threads of thought looking at different pieces of code. So Amazon's fix for now is more human review, but is this the right response? It's probably not a sustainable response. They're going to have quotas that they want to hit, and people are just not going to be able to keep up unless you hire more people. And what are we doing with AI? We're getting rid of people. So what are you going to do if you're an engineering organization and you know that this is kind of the give and take, or the tug of war, right now? Paul: I also feel like Amazon is the worst example of the kind of organization you just mentioned, Paige. Amazon is the bottom of the barrel in terms of being able to be fast and loose and ship something effectively. And just because they had a disaster doesn't mean other engineering organizations should think, oh my gosh, if we go all in on AI, this is [00:15:00] what's going to happen. There are a lot of things that affect whether this is going to work for you or not. Like the tribal knowledge argument: what if your code base started around the time that Claude Code came out? You could engineer it from the beginning to actually work in these systems, to build in checks, either as hooks, there are ways that you can check things, or if you have a strongly typed system. Look at Stripe; I'll go back to their minion example.
Right now they're shipping like 1,300 PRs a week successfully, doing global payments code, because of this minion system, which is essentially what Amazon tried to do, but Amazon obviously messed up. So if you're in an organization and you're wondering, oh, what should we do here, I just think it's a question of where you put your effort. Is it in the models and the IDE and everything? Or are you re-engineering the platform of how you ship work? Because it sounds like Amazon took this IDE, rolled it out, and said, all right, here's the process, we're going to bake in all these steps. Whereas Stripe said, we're going to build a platform. There are these minions, there are all these rules built around them, [00:16:00] and you can use VS Code if you're working at Stripe and call up to the minion, but it's in an orchestrated platform environment. It's just two different philosophies about how they're doing this process, because it is process at the end of the day. I just feel like Amazon's the worst example, and it really comes down to: is your code base AI-ready? I would say maybe Amazon wasn't, and they never got to the point of being AI-ready. Stripe sounds like they put in a lot more effort there. Noel: Yeah. Amazon is also solving a much wider array of problems. As our AI ambitions grow, I feel the flows that are much more data in, data out, where I want to observe what external calls were made, the kind of work that a lot of what Stripe is doing is, agentic tools are very good at, because it's easy to verify changes, right?
You can have the flow run and see: I made this change and it fixed the bug, and I can select which part of the test suite I need to run. Versus the breadth of Amazon's code base, which is just huge, so I'm sure it's hard to get it into a loop where it can validate itself and all that stuff. Paul: That's so right, because one reason Stripe's approach works so well [00:17:00] is they have like 1.3 million unit tests, or some crazy number in the millions, and to Noel's point, they can probably figure out what subset of those tests needs to be run, and run them every time for every PR. Versus Amazon, I just have no idea. Noel: Yeah. Paul: That's a really good point. It just depends so much on what your organization is and where you're pointing your effort. How ready are you? The Amazon story is: actually, be scared. Please, if you're an engineering manager, be scared of it. Yes. Just don't take it as a sign that, oh, this isn't going to work. Noel: Yeah. Paul: Yeah. Jack: A question for you, though. If you were to restart your career today and the job is basically reviewing code, would you get into software engineering? Is that fun? Paul: I don't know if I completely agree that it's all reviewing code and that's where we're going. I feel like we're headed towards requirement specification, and reviewing code is right now a majority share of the work because we're not ready as an industry, so we're fixing and teeing up [00:18:00] and going back and rerunning in the review stage. We're going to gravitate over the next couple of years towards requirement specification, opinion gathering, doing as many human loops as possible before you give the AI this 3,000-line document and hit go. So right now, Jack? No, this sounds terrible.
I wouldn't recommend it. Jack: I'm not even sure I like the whole requirements-gathering thing. Paul: Okay, well... Jack: Like, for eight hours, going through this Word document that just keeps going, blah, blah, blah, and it's like, should we say this is a "should" or a "must" or a "could"? Oh my God, I just want to code, man. I just want to see stuff actually happen in the real world. That's why I got into this. I didn't get into this to sit around and review documents, or review somebody else's code, or review the reviewer's code. And at the end of this, if something like CodeRabbit does all the PR review for you, and you've [00:19:00] got the agent going and building those specs and all that, and everything's review, then really the only job you have in the system is two things: one, review, and two, they can't blame the agent when things go wrong, but they can blame you as a human. So you are basically the outlet of blame. That's the job: reviewing stuff and being a blame target, and I don't like this. Ugh. Paul: It's a very natural progression, though. There are a lot of careers and jobs in this world where your value is as the guarantor. The buck stops with you, in whatever industry, in insurance, in signing off. And I know it sucks, Jack. Jack: God, that sounds awful. Paige: Yeah, though it's really hard to turn down the money still, because if you're an engineer, especially in the US, you're most likely making a lot more money than you would in many other industries, save being a [00:20:00] lawyer or a doctor or something else that requires a lot of schooling and a lot of preparation.
But at the same time, I very much identify with what you're saying, Jack. I got into this because I liked to build things and see them come to life, and bring ideas from my own work, or my own brain, into fruition. And now, and this kind of ties into my rant, it almost seems more like, if you're a subject-matter expert in a particular domain and you have an idea of how to make that domain better, you can do that now. You don't need to have a software engineer who can build it for you. You're like, I know the publishing industry really well, or I know how construction works in this [00:21:00] particular way. If you know that, it almost seems like you could get to the point where you build something that is useful for that industry. And it's not just that you have a software degree, so you know how to hook JavaScript up to HTML and CSS; you know the particular pain points of an industry, and now you can vibe code your way into a really useful product, platform, or tool that you can sell and monetize. Paul: And monetization: it really comes down to what the motivating factor is. Jack loves to create things. Paige loves to create things. I like to create things too. But at the end of the day, the younger people entering this field after the AI revolution, call it, are doing this for money. If anybody's in here because you like to create things, guess what? There's somebody that likes to do this for money, and they don't give a crap. And it's like, damn. Because, like Paige said, there's going to be somebody in construction who's just going to know how to do it, and they don't care about the art of code. When you talk to the business folks, the VP, they're just like, stop, I don't want to hear about the code.
How does this move the needle? That's what everybody's going to feel. Money is a good motivator. Jack: I would say that has been the progression since I got into this. When I [00:22:00] started, this was something you did out of love. People were not getting mega salaries for doing software. It was not that. And then, I'd say 20 years in, suddenly software engineer was something you could be, like a doctor or a lawyer, that kind of thing. And so you saw a lot of folks get into it because they're like, yeah, I'm doing this for the money, and I want to be in a management position, and blah, blah, blah. And it's like, okay, I just kind of got used to that. But yeah, I see what you're saying. I'm just not excited about it. Paul: Know that there's going to be somebody in business that's willing to do it for the money and take the art out of it, and they're going to move the needle heavier than you. Yay. But, you know, in some way it's like musicians: they're like, man, everybody's going to Suno AI and printing out music, and this all sucks. But now they have this little bubble of their life where it's like, I'm going to go make music for fun. Jack: Mm-hmm. Paul: I don't know about you guys. Jack, you said on your last podcast that you did something just for fun, for shits and giggles. So I'm doing more of that too, to keep the love in it. You [00:23:00] have to. Paige: I think that's it today: you have to do something because you want to, for your own intrinsic value, not because it's going to make you money. Like, I really like making art, and AI can make way more art, way better, way faster than me.
But I want to learn to paint, and paint better. So I'm painting, and it takes longer; it takes me weeks to finish something, because I can only work on it for 20 or 30 minutes a day, tops. But I really get a sense of satisfaction at the end, because I painted this thing myself. Noel: Yeah. I think the arts are also an interesting case, because increasingly there's such blowback. I don't want to go to a gallery and look at art that people prompted. Musicians make the Suno argument, but I don't really care to listen to music that is just prompted. If I did, I'd just go prompt it myself, you know what I'm saying? I don't want to look at someone else's prompting. If I need art to [00:24:00] realize some vision, whatever it may be, I'll do it myself. But wanting to consume art feels like a very different thing than the science of coding, an engineering pursuit as a means to an end. Jack: I guess I always thought of, and took a lot of pride in, the idea of engineering as an art. It's funny, the first time I got into a debate about it was when my wife was in college, and we were hanging out with a bunch of theater folks, and they were like, that is not an art. There's painting, there's theater, all this stuff; those are arts. And then I looked up the Oxford English Dictionary definition of art, and it was essentially any craft taken to an extreme. You take it to the point where it is artful, right?
And so if you really get into the quality of software and the quality of code, then that's an art. And I like that. I took pride in my art, and in the architecture of it. [00:25:00] Noel: Yeah. What's interesting about talking about quality of code and maintainability is that these are things where the code is elegant to look at, maintain, and understand; it's relatively logically sound; you can go through it, and it does all of those things for us. And that's what I'd put to my prior point about the frustration in all this. We had all of these mechanisms and tools and this whole language for talking about code that we could understand, where we knew there were no weird side effects, and all this stuff. And we're not even talking about it anymore; we've all kind of moved on. So I think that is the core of this strife over the art of code: doing things the way we've been doing them forever, because we know it's sound and maintainable and scalable, is not as quick, and it never has been. This is just a very rapid acceleration of the question of where one falls on the line between getting the thing out the door versus [00:26:00] applying a little bit of craft to make this a thing that we can build and scale and maintain. Paige: I've wondered to myself more than once lately if there are going to be some shops that pop up that are all about artisanal code, that will say: we still hand-write all of our code. We deliver a lot less, obviously; we are not the mainstream. But if you want a bespoke, handmade website, we are the people to come to.
Kind of like when you're buying selvedge denim, or buying something made the old way rather than the mass-produced way. Is that gonna become a thing for coding, maybe? Jack: Possibly. I know what you're talking about, though. I was actually with a friend just over the weekend, and they were going to a bike store and getting some stuff for their bike, and there were the mass-manufactured parts, and then there's literally a particular maker, Chris King or something like that, who makes these particular bike parts, all custom, handmade. And I don't [00:27:00] know, that's cool. But that's a bike part, and I'm not sure how that applies to code so much. Noel: Yeah, the parts seem different. I don't know, it's a weird one. We see this everywhere, right? In everything: the arts, culinary craft, all of it. It's kind of the same thing. But I feel code is much more of a commodity. It's an engineering field, a means to an end. So I can't imagine there's much of that, except maybe in a very cottage, boutiquey way, but I don't know. Jack: Yeah. Paul': To me, it's like we're talking about hand-refining gasoline versus just buying it from Saudi Arabia and putting it in our cars. Nobody even wants to have this conversation. They just want the gas. Paige: Yeah. Paul': We're in some weird cave echo chamber here, thinking people care about how the gas is made, or even what wars need to get started to obtain it. Paige: I think you're right. I think there's just not enough of an industry that cares about the artisanal craft of code.
The way that we [00:28:00] like to talk about it, the elegant code. There's even Eloquent JavaScript, for goodness' sake, an entire book dedicated to just writing better code. And at this point I think that's such a moot point. All the O'Reilly authors, I feel so bad for them, because everything they've written, I don't think anybody's consuming it anymore. Any human, I guess. The LLMs have already consumed it years ago, so they're fine; they know it. Jack: Content is a whole new area around this, a whole issue on its own. What content makes sense in this world? This kind of content, I think, makes sense because it's for humans, by humans, talking about the impact of AI in their field. And I don't know if any AI is gonna care enough to actually read through the transcript of this and be like, huh, okay, engineers are trying to internalize what this means for themselves. Great, good for them. Now I'm gonna go do my Claude Code thing. But yeah, I'm trying to figure out the ramifications of it. I think [00:29:00] about the shift from folks who did manual metalwork in the old days, putting together bolts and all that kind of stuff by hand. Nowadays everybody does this with CNC, and it has become all about programming the CNC machines. And you take satisfaction in the fact that, ooh, my CNC pattern can be used to mill like a million parts. And that's cool, right? So maybe that's the thing. Maybe the idea here is that we just have to learn how to take satisfaction in the AI, and in using these tools in a cool way.
You sort of move the satisfaction point somewhere else. And if you can't do that, well, you just get outta the game. 'Cause I'm sure there's a bunch of folks who hand-mill stuff and are like, yeah, I don't wanna learn how to code CNC, so peace out, bro. I'm gonna go do marlin fishing or whatever. Paul': I like that analogy, Jack, because it's very product-focused. [00:30:00] The end product that we all care about is the refined metal piece. And if you can center your emotions around wanting to make the best end pieces, you're fine. You can find satisfaction. But if you like milling, sorry. Jack: Yeah. Gotta get outta my milling space. Paige: We've talked about this topic for quite a bit, so let's talk about another one that's also agent-related: agents running while you sleep, and the verification problem that we've already been hitting on. Autonomous coding agents aren't a future concern anymore; they're running in production today. A piece this week came out from Claude Code camp describing teams merging 40 to 50 AI-generated PRs a week, with agents shipping changes to branches their human authors haven't even read. The author's core observation: when the same AI writes the code and writes the tests for that code, you get self-congratulation machines, which I think we've all definitely seen. They'll tell you straight to your face that it's fixed when it's not. It checks its own assumptions, basically. So the proposed solution borrows from test-driven [00:31:00] development: you write acceptance criteria in plain English before you prompt the agent. Then you use a separate verification process, probably Playwright agents, to check outcomes against those criteria.
The idea is that you review failures, not diffs. It's a compelling workflow, but it raises the harder question of where human judgment fits in a world where agents are running while you sleep. I think we've all kind of discussed this, but have you personally hit the "I have no idea if what this agent shipped is any good" scenario, and what are you doing to try and mitigate it? Paul': I can speak for my team of eight, because we do participate in ZTE, and I'm gonna take that term from YouTube: zero-touch engineering. Not everything, but there are things we do that would make Jack wince, like he just did for those who are listening and not watching: somebody who would never read a branch and then ship it to production. We do that. But there are requirements in order for you to fall into that category of doing a ZTE ship. There are [00:32:00] things that need to be in place, and a human needs to decide that. What intelligence are you using? Where are you pointing it? What's the scope of the requirements? Some things need to be checked. And then when that goes out (we're monorepo people, right?): what packages got touched? What apps got touched? Where do we need to spawn Playwright agents? What are the minimum key success criteria? Borders don't overlap, text doesn't overlap, these core pages load. There are things you can check. And then, what does the human do as that surface area grows? Well, the human is gathering requirements. They're doing that awful Google Doc process that we all hate, but it's really important. And then they're directing intelligence toward the right bite-sized areas, in the right spots. That general ceremony of gathering and directing the intelligence is just gonna grow. And to me, that's the main human surface area as ZTE expands.
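The kind of gate Paul is describing can be sketched roughly as follows. All names here (suite names, path conventions) are hypothetical illustrations, not his team's actual tooling: map the files an agent's branch touched to the minimum verification suites required, then surface only the failing suites for human review.

```typescript
// Hypothetical sketch of a ZTE ship gate: given the files an agent's branch
// touched, decide which smoke suites must pass before the branch auto-ships.
// Suite names and path conventions are illustrative, not a real team's config.

type Suite = "core-pages-load" | "visual-overlap" | "full-playwright";

// Map monorepo paths to the minimum verification they require.
function suitesForChange(touchedFiles: string[]): Set<Suite> {
  const suites = new Set<Suite>();
  for (const file of touchedFiles) {
    // Any change at all must at least prove the core pages still load.
    suites.add("core-pages-load");
    // Styling can break layout anywhere, so fan out to the full visual suite.
    if (file.endsWith(".css") || file.includes("/styles/")) {
      suites.add("visual-overlap");
      suites.add("full-playwright");
    }
    // Shared packages feed every app, so run everything.
    if (file.startsWith("packages/")) {
      suites.add("full-playwright");
    }
  }
  return suites;
}

// "Review failures, not diffs": the human only sees suites that failed.
function failuresToReview(
  touchedFiles: string[],
  results: Record<Suite, boolean>
): Suite[] {
  return [...suitesForChange(touchedFiles)].filter((s) => !results[s]);
}
```

Under this sketch, a copy change in one app only requires the core-pages check, while a CSS or shared-package change fans out to the full Playwright suite, and the human reviews whatever failed rather than reading the diff.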
Paige: So for you it's more about understanding what features we wanna add, and directing the agents toward creating and [00:33:00] implementing those features. Paul': Yeah, honestly. And breaking down the features. I don't think we're at a point yet where we can just take a feature ticket or something like that, throw it at a set of minions, and get a ZTE set of branches shipped. Not yet. A human needs to sit with Claude Code, a great colleague, break the thing down into a PRD, break it down into tickets, and then one by one the human orchestrates the sending of those tickets. And one of those might take overnight, because you need to spin up browsers; there are like 17 different user flows, and every Playwright flow can take five to ten minutes to test. But that balance of where the human breaks it down, how small they break it down, and what intelligence you can orchestrate versus what's automatic, that's always gonna be changing. So I don't know where we'll be in six months, but that surface area, that gelatinous space of humans orchestrating things, is kind of where we sit right now. Jack: All I know is the Claude Code CEO is drooling over those statements, because that's how you get your 200K token usage every year: just [00:34:00] uber-agents for everything, agents monitoring agents and all that. That's how we keep this spinning ball of AI running. Noel: Gotta do something with all these data centers. Jack: I guess so. They're all in there looking at each other's code. Yeah. I think I would also add mission criticality to it. I'm okay if you want to ZTE some text changes on a doc, on a page. Fine, whatever, go for it.
But if it's ZTE-ing, I don't know, a unit test overhaul or something like that? Nah, we're gonna spend some time with that. We're gonna look at it. Noel: Yeah, I'm very intrigued when I read about organizations and how they're determining what tests need to run for what set of changes, because that's always been a somewhat tricky problem, and it's the one I run into when I'm doing more agentic coding right now. There will be weird side effects, or things will break unexpectedly in some other view the agent needed to change, because it ended up calling something two layers deep. Like, oh, this utility [00:35:00] function that's called by something that's called by something ended up breaking this other view. Catching those, and figuring out when to run a subset of your suite for those kinds of changes, is hard; maybe it depends on the person. But I think that kind of stuff is a lot easier when you have a code base like Stripe's, which we were talking about earlier, where there are rigorous unit tests at every level of abstraction throughout the code base, versus a code base that's more integration-test heavy. If you're not running that full suite on every change, it can be easier for stuff to slip through the cracks. So yeah, I'm keeping my ear to the ground a little bit for tools and processes as teams figure that out, because I feel like that's the crux of this problem.
How do you figure out what needs to be verified, and when? So it's not like, for every little styling change, I have to run a Playwright agent suite that costs 200,000 tokens for a one-character change in a label or something. Paul': Yeah, and that's definitely a weakness, Noel. [00:36:00] If we do a CSS change, damn sure we're running the whole Playwright suite. Noel: That's what I'm saying. Paul': It's gonna boot up. And it has to be that way. How else could it be? Me leaving an agent to decide whether to run Playwright or not is heinous. Noel: Yeah, right. But even in that Stripe blog post, they're talking about how, depending on the changes, subsets of tests run. And it's like, okay, where's your membrane there? I feel like that's just a super interesting code architecture problem. Paul': Have you guys seen Andrej Karpathy's auto-research? Jack: No. Paige: I haven't. Paul': Okay. This came out last month or a few weeks ago. The idea is he had an LLM get trained, and then redo the training, reset the hyperparameters, whatever, until it was damn good. It was a mini large language model. And he did this by having Claude hold one thesis: make a score, propose a change, and see whether the score moved over that next training run. Noel: Yeah. Paul': If it didn't work, he'd revert. And so you [00:37:00] have Claude sitting on top of maybe three or four other Claudes, driving Git with one proposal at a time, then changing it. I tried this, not within a large language model; I was developing software.
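The propose, score, revert loop Paul describes is essentially a hill climb. A minimal sketch, purely illustrative and not Karpathy's actual code: the proposer and judge below stand in for LLM calls (e.g. one Claude proposing an edit, a panel of Claudes assigning a score), and a change is kept only if the score improves.

```typescript
// Sketch of the propose-score-revert loop: a judge assigns each artifact a
// score, a proposer mutates it, and we keep a candidate only if the judge's
// score goes up; otherwise we "revert" by simply discarding the candidate.

type Judge<T> = (artifact: T) => number;   // e.g. judge-Claudes scoring flow
type Propose<T> = (artifact: T) => T;      // e.g. Claude proposing one change

function hillClimb<T>(
  start: T,
  propose: Propose<T>,
  judge: Judge<T>,
  steps: number
): T {
  let best = start;
  let bestScore = judge(best);
  for (let i = 0; i < steps; i++) {
    const candidate = propose(best);
    const score = judge(candidate);
    if (score > bestScore) {
      best = candidate;      // the proposal improved the score: keep it
      bestScore = score;
    }                        // otherwise: revert (drop the candidate)
  }
  return best;
}
```

In Paul's experiment the proposer is Claude editing a script and the judge is a set of Claudes scoring flow and pacing; for tests, the judge could just as well be the pass rate of an independently written suite.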
And I was like, okay, Claude, we're gonna make a video script, and you're gonna have these other 12 Claudes judge it based on things like continuous speech: make sure it flows, make sure it's good for YouTube Shorts, blah, blah, blah. And they're failing, and Claude goes, oh, this failed, let's fix it, and repeats. Now, I'm sure you could do the same thing with tests. We have tests; you have some sort of Claude that distills the tests. At first it's gonna suck. You have a human go in, and it's just basic smoothing or whatever. You make hypotheses, you look at what actually happened, you bring the data back in, and next time it's gonna be better. This whole dance is now starting to happen with AI. So if you can, once again, get your requirements down, make a score out of it, and feed it back into the system, yeah, it's gonna get better. To the point where it's writing me YouTube Shorts scripts. Like, how can you teach an AI to [00:38:00] be suave? No, it taught itself, which is kind of weird. I bet it could do it for tests too. Jack: What was the initial input on that? "Be suave"? Paul': No, I said you need to maintain the TikTok-style, ADHD, emotional gravitational pull throughout the video. Jack: Oh, okay. Paul': The whole time. Jack: Yeah. Emotional gravitational pull. Paul': Suckle from the little mouse feeder. Noel: Yeah. Optimize for brain rot. Paul': Yep. Jack: Oh, god. Noel: Mm-hmm. Jack: Oh, funny. I took a Stanford course on gamification once, and half the course was: you should really be careful about using this, because you can impact humans in terrible ways with gamification. And now you're like, oh yeah, I'm gonna go hone and train an LLM to make stuff that's going to reduce a human's attention span.
Jack: Let's go, let's accelerate that, please, because that's taking us to a great place in this world. Paul': It's very interesting to see how large language models are being stacked meta on [00:39:00] top of each other to train them to do things that are very esoteric, like creating brain rot. Jack: Yeah, I did this just recently, actually, because I needed to have two side-by-side versions, one using tools and one using code mode, and then compare the output of both. And yeah, I used Claude to be the judge of those two outputs against a known good, and say, cool, this was good, this was bad. It wasn't actually changing the behavior at all, though; it was just more of a smoke-testing kind of thing. Paul': Right. Yeah. Paige: So do you think this actually changes the velocity argument a little bit? We've been talking so much about requirements gathering and making sure the tests are good upfront, or the criteria are really strict and solid and thought out. So if you're spending all this time upfront planning and writing and adding [00:40:00] guardrails and catchalls and rules and stuff like that, are we actually shipping code faster in the end? I think the answer is still yes, 'cause it feels that way, but I don't know if that's just my personal feeling about how it works, versus whether it's actually shipping faster. Jack: That's a good question. Paul': We're definitely in a weird spot where we're sharpening pickaxes while we're mining at the same time, so it's hard to judge how effective the sharpness of my pickaxe is, 'cause I keep going back and working on it. Jack: Yeah. Paul': But I'll tell you one thing: I deployed probably 12 apps last month. Not production apps, not like they're making money, but apps for me and for my mom.
'Cause she wanted to have something that checks her email and ranks her email. There are things that I just couldn't create that are now created. I'm starting a YouTube Shorts channel. I would say it's faster. Paige: Yeah. That actually brings up another question. How would you define a senior engineer, or those different levels and strata of engineers that we've had forever? Because typically it means you have a certain level of mastery over a [00:41:00] coding language, or an understanding of larger architectural problems. But if you're not the one writing the code anymore, how do we classify engineers? What do we use to set them at the different levels we've known in the past? Jack: Well, the whole senior engineer thing for me has always been: junior engineers focus on the code, the actual mechanics of the code, because they're working at that level. And as you go up the engineering chain, you're working with product more, you're working at a higher level of architecture, that kind of thing. So that says to me that the folks who are going to gain the most from AI are senior engineers, and that's actually ending up being the case. And unfortunately, we're losing the [00:42:00] entry points for a lot of junior engineers, which is incredibly shortsighted on the part of, I don't know, software engineering writ large. Because at a certain point, a lot of folks are gonna take their vibe-coded app, make their millions, and be like, okay, that was fun, peace out. And that's it. You still need people to review the code. You still need people to actually understand all this stuff.
You still need people to put together a decent prompt that says, here's the architecture I want you to follow, or to review the architecture it comes up with. It's tough, but I think those monikers still stand, back to your original point. Paul': Building out the team that's making the music app now, junior engineers have actually been the focus. Not necessarily software people, but people who are like, yeah, maybe I have a bachelor's in computer science or some sort of degree, but I got my master's or my accelerated whatever in some corner of the music industry. I want you. You know why? Because we're gonna give you Claude Code, and the slip layer between somebody [00:43:00] being a senior engineer and then talking to you, all of that waste, is gone, 'cause you're doing it now. And there's a senior engineer, whether it's me or the one other senior engineer we have, controlling the harness, making sure things are shipped correctly. So our team is mostly juniors, people who are subject matter experts, kind of like Paige insinuated beforehand, being enabled with the tools. They're kind of like junior engineers now. They're young, they're fresh outta college, they're not sure what to do with their degree, but man, we can make them very effective in the workplace. To be fair, that's not a software junior engineer, though, right? The whole point is they have a specialty in something else, so they're not gonna spend four years learning about proofs. I don't know what the impact of that will be. Jack: Yeah, at the end of the day, take your music app, right? You want to go and charge people money for that.
And you want their music to have a certain level of privacy, so that nobody steals what they create, or whatever. When does a senior engineer go into the mix to say, hey, when we build this API, [00:44:00] it has to have this level of auth, and it can only be accessible from these points, because here's the overall architecture and here are the guardrails? Because what we're seeing as an issue with a lot of this AI stuff is that security vulnerabilities are really the big problems coming out of it. I think there was one just recently where all of the documents these folks had uploaded, IRS documents and yada yada, were basically exposed to the open web with a simple token that anybody could get, and everybody shared the same token. So there you go. Somebody needs to be in the process somewhere to make sure that doesn't happen. Paul': There's always a role, a space, and a necessity for a senior engineer. It's just compressed; their reach in looking at that stuff is much farther. Paige: So maybe it's less now about a specific area of expertise at that level, and more about the broader, [00:45:00] kind of T-shaped engineer we've talked about in the past, where you need to know a decent amount about infrastructure and security and best practices and x, y, and z. Less of a specialist, more of a generalist, but at a higher level. Paul': The bar was raised a little bit, honestly, for that role. If you wanna be a real senior engineer on a team like that, yeah, the bar is raised.
Jack: And the bar is also raised because you have to be more of a communicator than you've had to be in the past. I've met a bunch of principal- and architect-level folks over the years, and for a lot of them, the dollar band to retain them was just the level up, right? They didn't gain any additional skills; they just had to go be a principal, because that's how we pay them. HR says you can't pay a senior engineer that much, kind of thing. And they didn't learn the requisite skills you really need to go from senior to principal to architect or whatever, [00:46:00] which are communication skills: the ability to come up with those architectural designs and then communicate them to all of the junior engineers and the teams, and to do it in a way where you answer all their questions and make sure the documentation's up to date and all that, right? Those skills are really important in the agent age, because that's how you talk to these LLMs. You have to learn those communication skills. So that is actually cool, in my book. Paul': I would agree with that, Jack. It's this special sauce that you need, an intersection between being technical and being communicative. Paige: Cool. I could keep talking about this all day, but we've almost come up on an hour already on these two topics, so let's do a quick break, and then we'll do some hot takes. This episode is brought to you by LogRocket. LogRocket provides AI-first session replay and analytics, which surface the UX and technical issues impacting user experiences. Start understanding where your users are struggling by trying it for free at logrocket.com. All right, [00:47:00] would you like to give us your first hot take for this episode? Noel: Yeah, sure.
My hot take: we talked about the lack of junior engineers and all these things that LLMs are gonna precipitate, but I'm worried about something that I think is a weird one: the long-term efficacy of a lot of open source projects. A lot of people have been talking about the death of a lot of SaaS companies, 'cause people just vibe-code things internally. I think we're also gonna see a kind of death of open source. Maintaining a large, successful open source project now is really difficult, because of so much junk that people are putting up as PRs, stuff that has obviously not been written by a human, and the maintainers don't wanna just push that code to all of their dependents. And then for small stuff, I think there's gonna be an increasing discoverability problem as more and more junk gets pushed. So I'm worried about a lack of clear winners in open [00:48:00] source, the place where it's like, okay, this is the open source library everybody uses for requests if you're on Node, or whatever. I'm worried we're gonna have a little bit of a dry spell there, or, I dunno, maybe this is just the end of the history of open source as we've known it. That's the one I'm worried about. And I feel like all these repo maintainers are putting out blog posts now, 'cause they're like, we don't know what to do. Anyway. Paige: That's a good one. Jack, what's your hot take for this week?
Jack: My hot take is that USB-C sucks, to be honest. It was supposed to be this really cool single cable to rule them all, and now I've got like 18 different versions of USB-C, from power-only all the way up to the crazy can't-be-more-than-six-feet Thunderbolt 5 cable, blah blah blah, with all these little details. We went to IKEA, and my wife was like, here's this bin of USB-C cables, we'll never need USB-C cables again, we can just buy all these ones that are like a buck forty. I could go into a whole diatribe about [00:49:00] why USB-C sucks, but even I don't really understand, to be honest, all the different freaking standards. Noel: That's a great one. I've had this whole thing too. There's a huge iconography and functionality problem in the USB standard, where you don't know what's what. You plug in a thing with a battery into another thing that's got a battery, and which way is power gonna flow? Who knows? This is all controlled by software. Jack: Yeah. And then HDMI runs over it. My kid is a hardware engineer, and they were over at Apple, and they were like, USB-C is crazy. Noel: Yeah. Jack: There's too much stuff going across this cable. Paige: Oh, nice. Noel: Nice. Paige: All right, Paul, what's your hot take for this week? Paul': I love the USB-C one. I felt that very hard. I've been looking a lot into the defense situation with Anthropic and OpenAI. I think it's being blown outta proportion, in the sense that they are both equally terrible. They would both kill people without even knowing it. And the nuances of how these models are deployed [00:50:00] really matter. Which servers are they running on? Who controls the servers? What monitoring is on those servers?
And there are differences between OpenAI's contract with the Defense Department and Anthropic's, and those differences really matter. And for people to put Anthropic on this pedestal after this happened: yes, it was nice that they stood up, but that was a reaction. And here's part two of the hot take: it was reactionary because of things that happened that they're not gonna let us know about, things Anthropic had a hand in, in terms of the current situation in global geopolitics. So yeah, don't be praising Anthropic, people. I love Anthropic, Claude Code's my main driver, but they're equally as terrible as Sammy. Okay? Jack: You know, most people don't have a weapon sitting on top of an LLM or whatever, but if you're using LLMs to do HR stuff, don't do that. That's not what it's designed to do, right? It's designed to predict the next token based on some prompt. That's all it is. It looks like it's smart, but [00:51:00] it's not. And to say, cool, let's decide our targeting of F-117s based on this LLM, is insane. Paul': Exactly. And just remember, Anthropic was doing that. They were doing that. So don't act like Anthropic is anti-war. Okay, that's it. That's my hot take. Paige: Nice one. My hot take is gonna come back to a little bit of what we were talking about earlier: I think it's a lot less today about what we can build, and a lot more about what we should build, or what taste tells us we should build. Because now we can build anything. We can vibe it into existence. We can make pretty much whatever we want at this point.
So it comes down a lot more to what is actually useful to build. What should we be putting our time towards? What should we be building? I think that's kind of the next frontier for engineers: leveling up your taste, and getting better at making those sorts of recommendations to your product manager, or at understanding what your users are [00:52:00] telling you they want or need, and then being able to build that based on that kind of experience and that level of taste, instead of level of effort or how quickly we can get it out. 'Cause we can probably get it out pretty darn quick; it's more about what we should be getting out. Noel: Nice. Jack: Yeah, totally. Paige: Become a tastemaker. Jack: Yes, exactly. Noel: Artisanal code. Paige: Exactly. High quality. All right, thank you everyone for joining us. Thanks, Paul. Thanks, Jack. Thanks, Noel, for being here. And thank you to everybody who listened to this episode of PodRocket. We will see you on the next episode.