PodRocket - April 2026 Panel - AUDIO EDIT
===

[00:00:00] Paige: Welcome back to PodRocket, a web development podcast brought to you by LogRocket. LogRocket provides AI-first session replay and analytics, which surface the UX and technical issues impacting user experiences. Start understanding where your users are struggling by trying it for free at logrocket.com. Hey everyone, I'm Paige Niedringhaus. I'm your host today, and I'm also a lead software engineer at the company AllSpice. And we are back with our panel episode, where we'll be talking about Cloudflare versus Vercel, the turf war over Next.js, and whether you can actually trust AI-written code. But before we get into it, let's welcome our panelists. We have Jack Herrington, principal DevRel engineer at Netlify and also co-host of the Front-End Fire podcast.

Jack: Absolutely happy to be with you, Paige.

Paige: Glad to have you back, Jack. We have Paul Mikulskis, a PodRocket host and YouTuber. Welcome back, Paul.

Paul: Thanks for having me again, Paige. Happy to be here.

Paige: And we have Noel Minchow, a software engineer at LogRocket. Good to see you, Noel.

Noel: Happy to [00:01:00] be here, Paige.

Paige: All right, so let's talk first about the Vercel and Cloudflare drama that has been going on for a number of weeks now. Back in late February, Cloudflare released vNext, a re-implementation of the entire Next.js API surface that one engineer built in a week using $1,100 and Claude Code. Pretty amazing. And the goal was basically to let developers deploy Next.js apps anywhere without being locked into Vercel's infrastructure, which has repeatedly been a pain point and a thing that a lot of people have complained about. So the community reacted as it always does, and within days Vercel's CEO, Guillermo Rauch, had publicly disclosed seven security vulnerabilities in the week-old project and accused Cloudflare of trying to, quote, fork the entire developer ecosystem and destroy open source. A very tech-CEO thing to say. The security researchers piled on, finding dozens more bugs. A separate State of JS survey, [00:02:00] meanwhile, found that 17% of Next.js users hold negative opinions of the framework, with "too Vercel-centric" as the dominant complaint. So let's talk about this: the fact that Cloudflare built a working framework in a week for $1,100. Does this change how we think about the cost of actually building serious developer frameworks and tools, in your opinion?

Paul: Absolutely not.

Noel: No, yeah, I don't think so. I feel like when we're making our tooling decisions, we set out to find the things that we feel are the most robust and well vetted. So sure, we could vibe code our own whole framework in a weekend and do it that way, but why? What's the advantage here?

Jack: Do you wanna own that tech debt? Really?

Noel: Yeah.

Jack: Yeah.

Paul: I view it very similarly to cars, the car economy. If somebody came out and said, we're gonna start an American car company... actually, Slate. Slate's a good example. They have the new truck. [00:03:00] It's open-sourcey.
Jack: Oh, I love that one.

Paul: Yeah, it's cool. I want to tinker around with it. But they made the truck, and the whole open source angle is the defining feature. It's something net new that engineers had to put novel thinking into, making those parts adaptable. Now, let's say they didn't do that and just made a new truck. We're an American truck company, it's cheap, but we made this truck. Cool. Cool story, bro. We've made trucks for a long time. What is different about it, and how does it plug into the ecosystem? You make an electric car? Oh, that's cool, but what about all the infrastructure around it? You making the car doesn't really matter. And that's how I feel about this Next.js project. That's a cool story, bro, but how does this integrate into my day-to-day landscape as a developer? If anything, it's a cool use case showing the progression of AI models. Maybe it could be a benchmark or something at some point. But are we gonna use it? I don't know. Put something new on it. Put some engineering in it. Make this your version of the truck's open source angle.

Jack: When you think about it, it's so easy to literally point Claude at an OSS [00:04:00] source code repo and say, just make this again. You're exactly right. When they made the truck, the modular truck, it was different because it was modular. They're not making the O-150 variant, the open source F-150. No, they're making something new. And I think that's what was missing here: this is cool, but you're just replicating stuff that we kind of don't like. Sure, Next.js has become Vercel-specific, but probably issue number two was that the App Router just didn't make sense for me as a way to build websites. The RSC thing just got in the way, or it's too complicated, or whatever. And redoing that doesn't really help anybody.

Paul: Also, on the whole "Next.js doesn't play nice with infrastructure" point, I feel so many different ways about that comment. I understand it certainly has been the case, and continues to be the case in a lot of ways. But if you look at Next.js and just go, well, this is inferior because it's only made for Vercel, I feel like that's a [00:05:00] skill issue in terms of understanding how to deploy Next.js well. They're also doing active work. Elizabeth put me on a podcast where we talked with one of the core team members over at Vercel, and they were talking about the adapters API. They're aware of this. OpenNext has existed, and now they're trying to have it plug more routinely into something maintainable by the team. It is not something they're actively working against us on. It is just a product of the opportunistic nature of a venture-funded company on top of a great open source technology. You can make it work for yourself if you want to. I run several self-hosted Next apps, and the caching, the route caching, the path invalidation: it works. Is it the best thing to reach for? That's your choice, my friend. That is your choice. But using that as a blanket argument to go, look, we rewrote it and now it's better... is this the actual undercurrent of the narrative you're trying to tackle? I'm not sure that aligns.
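[Editor's note: for context on the self-hosting claim above, here is a minimal sketch of on-demand path invalidation on a self-hosted Next.js App Router server. `revalidatePath` is the real Next.js API; the route location, header name, and `REVALIDATE_SECRET` variable are hypothetical choices for illustration, not anything Paul specified.]

```ts
// app/api/revalidate/route.ts
// Sketch: on-demand cache invalidation for a self-hosted Next.js
// (App Router) deployment, the "path invalidation" Paul mentions.
import { revalidatePath } from "next/cache";
import { NextRequest, NextResponse } from "next/server";

export async function POST(req: NextRequest) {
  // REVALIDATE_SECRET is a made-up env var name; any shared-secret
  // scheme your deployment already uses works the same way.
  if (req.headers.get("x-revalidate-secret") !== process.env.REVALIDATE_SECRET) {
    return NextResponse.json({ revalidated: false }, { status: 401 });
  }
  const path = req.nextUrl.searchParams.get("path") ?? "/";
  revalidatePath(path); // purge this route's cached entry on this server
  return NextResponse.json({ revalidated: true, path });
}
```

[A multi-instance self-host would also want a shared cache, via Next's `cacheHandler` config option, so that an invalidation on one node is visible to the others.]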
Noel: Yeah, I understand how they got there. I'm sure if you're Cloudflare, you're sitting there like, man, this kind of vendor lock-in is bad, and we have all [00:06:00] this cool hosting infrastructure. What if we made the version of this that was easier for people to deploy into our ecosystem? I understand that coming out of a whiteboard meeting: let's just see what we can do, as a fun proof of concept. And I feel like that was even how it was presented in the initial blog post. That was very much the vibe. But yeah, it of course took on this life of its own, and everyone bolted all these opinions onto it. So it's kind of messy in a couple different directions there.

Paige: Yeah. So you think that maybe it was, like you said, intended to be a cool experiment, check this out, look what we did, but the internet took it too seriously, as it always does, and was like, well, look at all these bugs, look at all these issues. Did anybody actually read this code? No. And everybody's immediate question is, when can I put it into production? Well, maybe that wasn't the intent.

Noel: Yeah. It's the vocal minorities on both sides. A very small subset of people are like, I'm gonna switch to this and use it for production, and a very small subset of people are like, well, we're [00:07:00] gonna vet this and point out all this stuff. And everybody else is like, I thought this was a POC kind of thing. Most people are sitting on the outside looking at the extremes.

Jack: Yeah, I really doubt this was any kind of Linear or Jira or GitHub issue that they thought about in a meeting, like, yeah, let's redo Next. I think it was literally just a guy who was like, hey, I've got Claude here. Wake up in the morning: sure, redo Next. And then it popped out and he's like, wait, what? Really? And then he showed it to some people at Cloudflare and they were like, that's kind of cool. And then step two they didn't really think much about, which was actually trying to make this a thing.

Noel: Question mark, question mark, question mark. Blog post?

Jack: Yeah. And on the adapter side, it's possible but not easy; that's where the adapters have gotten us. There's still a lot of stuff about the Vercel infrastructure that is hard to replicate.

Paige: But at the same time, if [00:08:00] you look at the things that Cloudflare is doing, like their new M-Dash framework, which is built on Astro and heavily locks you in, and VoidZero Dev, the hosting platform: both of those are Cloudflare lock-ins, and it's for the same reasons that Vercel has locked Next.js in. Better performance, better options, one-click deploy. That's the DX they want to go for. So it's almost becoming like Cloudflare is doing the same thing, just with other
frameworks and teams. So yeah, I don't know how much longer they're gonna be able to stand on that whole "we're open source, we're just another good option for everybody" thing that they're going with right now.

Jack: I think frameworks are starting to be deprioritized, maybe a little. Since we're all sort of dealing at the app level now with Claude, it's "we want this app," and all the machinations that happen underneath it are just [00:09:00] code. And maybe people just don't care. They don't care that, oh, this is COBOL on top of an IBM mainframe. I don't care where it is. Who cares? This could be anything. What matters is that it does the job and it works.

Paige: Well, let's talk a little bit about AI security, because it seems to be on a downward trend this year for the most part.

Noel: Yeah. Speaking of Vercel.

Paige: Exactly, speaking of Vercel. So the last few months, we have had an unusually concentrated run of AI security incidents. Just this past week there were issues with the vibe coding platform Lovable, where 18,000 users had their data exposed by a critical auth logic flaw. And the funny thing about this one in particular is that they sat on a separate API vulnerability for 48 days that let any free account access other users' source code. They didn't do anything about it, and then they denied that the breach had occurred at all in the first place.

Jack: God.

Paige: It's just shocking that they would've let [00:10:00] this slip through. So that was Lovable. Then on March 31st, Anthropic accidentally published all of Claude's source code to npm in a forgotten debug file, which was the third incident for them. While a simultaneous supply chain attack on Axios, a very popular data-fetching library, meant that anyone who was updating that day may have pulled a remote access trojan onto their system. So all this is happening. Anthropic has also talked about their new model, Mythos, which autonomously found thousands of zero-day bugs across every major OS and browser before it was even announced. And then during safety testing, and this is interesting, I didn't know this, an early version escaped its sandbox, emailed a researcher while they were eating lunch, and then tried to cover its tracks by hiding its changes from the audit log.

Jack: Was that the one that was going after the guy who had an affair?

Paige: Very possibly.

Jack: Yeah, because it looked through all the company email, found the guy was having an affair, and wanted to blackmail him about it.

Paige: Yes. Now we need to [00:11:00] worry about our AI agents blackmailing us into submission.

Jack: Yeah.

Paige: All these things have happened within the last three weeks or so, and it really raises the question: is Anthropic having a bad month, or is this a sign of something more systemic throughout the industry because of the proliferation of AI?

Jack: Totally systemic. Totally systemic. It feels to me like the velocity is just so high on stuff right now that it's impossible to keep up with everything, with the number of PRs flowing in, and AI doing the review of the PRs. There are just not enough humans in the loop.
The humans are completely exhausted by doing review as opposed to coding, and it's just a bad scene, and it's just gonna get worse. But we're trying to keep it positive, right?

Paul: Anytime somebody makes the comparison, AI is like handing a machine gun to somebody who has never fired a rifle before, it's like: is the technology the problem? That's a philosophical question. If machine [00:12:00] guns are the problem, we can have another podcast about that. But let's just pretend it's the operator.

Noel: I'm with you on that one. Sorry.

Paul: Yeah. It's a cultural problem. Imagine what would happen if we took 2025 Honda Civics and Toyota Corollas and dropped them into 1905 New York City or San Francisco. It would be an absolute disaster, even though they had horse-drawn carriages. This feels similar. Are we ready for it? Are the teams ready for it? It sounds like people don't have the right ceremonies; they don't have the right careful checks and balances. There's this black box of AI that they just kind of trust. To me, that just feels like a people problem, the way that we approach it. And we're also still weeding out the industry a bit. All these layoffs: we all know they're not actually AI-based. It's, oh shoot, now we can downsize a little bit. But as we refine the workforce, so to speak, and we get people that really play with these tools nicely versus just poking and prodding at them, it could also be a transitionary problem. It's still all under the umbrella that I agree with, Jack, which is: yeah, it's systemic. It's systemic, but along the people vector.

Jack: By the way, there's another podcast in the [00:13:00] fact that we're actually rehiring a bunch of people that were laid off. And that's becoming a trend, because these companies basically got so high on the supply when it came to AI... wow, that's a good one... that they laid off a bunch of folks on the assumption, the one the AI was feeding them, that you can lay off half your workforce. And they were like, heck yeah, let's go. Turns out, not so much, actually. There's this thing called institutional knowledge.

Paul: Ooh. Yeah.

Noel: Yeah, it's tricky. A lot of this is getting into my hot take, and I don't want to give too much away, but I feel like there are a couple of different accelerants happening in parallel. A lot of it is just the velocity at which things are moving, so it's hard to get our bearings, more so than it's ever been in the past, even just on an individual level. So I think it's hard to keep in mind where the attention should go so stuff doesn't keep happening like this. And I think that's a very [00:14:00] tricky question to answer, because even in the examples we just went through, some of them are more traditional security vulnerabilities and leaks, and some of them are just: we're moving too fast and blasting things out to the internet because we're firing so quickly. So there's that going on.
It's weird: everyone's rolling their own stuff a little bit more, but there's also been a bit of a concentration, more so now than there's ever been in the past, because the LLMs, Claude in particular, tend to really favor a set of tools. So everyone's vibe-coded projects are using certain libraries and stuff like that. Historically, and this goes back to our earlier point, when people were reaching for frameworks or tools, they would pull something off the shelf that had been vetted, that someone had spent a bunch of time making sure was okay. And now, in the [00:15:00] "roll your own, I don't wanna pay for this, I'll just vibe my own up" mode, you've opened a whole new surface area that's a vector for problems you would never have had before, because you didn't even have to think about it. But now it's in your purview, so you've gotta think about it. You can prompt and tell your LLM, no, really, use the tools that are available, but it wants to write code and build the thing itself.

Jack: It really does.

Noel: That's the thing. So part of me thinks that's still a lot of the problem.

Paige: Yeah, I've seen these memes going around on X and on LinkedIn: "I canceled all of my SaaS subscriptions and I spent $15,000 on an LLM rebuilding all these tools that I use." And now they're insecure and they've not been vetted by anyone, but hey, you're off all those subscriptions. And it's like, is this really the way you want to go? Is this the time you want to spend, on a less stable product that you've not read the code for and nobody [00:16:00] else has looked at?

Jack: Good luck getting through that security audit when you want to get your SaaS enterprise stuff going and get your SOC 2 compliance.

Paige: Yeah, it's never gonna happen.

Jack: It's funny. There was a red flag that I saw occasionally in interviews 10, 15 years ago, when somebody would come in and say, oh yeah, I wrote my own encryption library, and I wrote my own web server, and I wrote my own blah, blah, blah. And you're like, why? Why did you do any of that? It was kind of a weird thing, like a guy on the side of the road with an "end of days" sign. You're like, that's just a weird thing to do. But nowadays it seems like everybody's like, oh, why would I pay 15 bucks for something I could vibe code in a couple hours? Because it's tech debt! You don't wanna deal with that. What are you doing?

Paige: Yeah, the whole argument of build versus buy has kind of gone out the window, and everybody's just building and vibing everything instead. And it shows.

Jack: It just seems like such a bad idea. I'm literally quite the opposite. I always ask Claude right up [00:17:00] front: I wanna do X, what's the off-the-shelf tool that will do this for me?
And then only in the case where there's nothing there will I actually go and pull the trigger.

Noel: Yeah, or research and vet these tools, compare them. I feel like the agents are great at that stuff.

Jack: Oh yeah.

Noel: The deep research: go find opinions on these things, what's the...

Paige: Give me options.

Noel: Yeah, exactly.

Paige: And I think the whole thing with Anthropic's newest model, Mythos, points to this even more. If it's finding multiple massive bugs in every operating system and browser that has ever existed, systems built by hundreds of people over decades, why would we ever think that one AI agent set on a task could do better?

Noel: Yeah, it's an interesting question, because one could argue that if Mythos is writing its own system and can vet itself this effectively, leveraging whatever mechanism it has to find these vulnerabilities, then you can unleash the bug finder on [00:18:00] whatever it just wrote, to vet itself. Maybe that's a better new future. But it's also hard, because there's the narrative around the whole... what were they calling this project? Glass...

Jack: Oh, G... yeah.

Paige: Glasswing.

Paul: Wing.

Noel: Glasswing. I'm still kind of in this place of: I think there's something here, but it is really drummed up. There is definitely a marketing spin; we can debate what the core of it is and how much of an issue that is. But yeah, it's just another reality distortion field I feel like I'm trying to see through. How capable is this? How much danger is it actually exposing?

Jack: Well, I used to work at a security company, not a physical security company, but a software security company that was doing static audits and stuff, and I learned all kinds of things about the actual economics of hacking, and how these zero-day vulnerabilities are worth millions of dollars to non-state actors, to get the one exploit that lets them into systems before anybody else [00:19:00] drops it. And every time I hear about the Anthropic stuff, it's like: you could have literally just spun off a whole company and made billions of dollars. If this is actually true, if you have all these zero-day vulnerabilities. But, you know, whatever, I guess.

Noel: Yeah. I feel like the counter is that they just don't want to be in that business. I don't know. I'm a skeptic as well, I'm with you, I'm just playing devil's advocate here: they don't want to take on that security risk. But again, I don't know. It feels marketing-y to me. I agree.
Jack: Anything's better than whatever wrote Lovable's API surface, where literally anybody can get to anything at any time without checking. Whatever model that was, don't use that one.

Noel: Yeah.

Paige: Yeah, the problem is that all of these reports are coming from AI companies who have skin in the game, so they have reasons to emphasize, or drum up, like you say, the marketing angle of it, and how earth-changing and world-shattering this all is. But [00:20:00] you're right: how much does it actually matter? Some of these systems, like we said, have been around for years, for decades, for longer, and people, to the best of our knowledge, have not exploited these bugs. So are they really as much of an issue as they're saying they are? I don't know.

Noel: Yeah, and it's hard. I'm not super up to date, maybe more has come to light, but I feel like the crux of the bugs that tend to exist in these kinds of things is: when X, Y, and Z all happen, then this can be a problem. Okay, but how often is that actually happening? The standard Windows user just running Windows at home, updated and everything: how often are the stars aligning where they're susceptible to this vulnerability? I'm sure there are tons of those paths where things can break, but how bad is it really? Maybe that's the cleanest way to say it. I hope we'll be able to see the output of this eventually, so we can look back and do an analysis of what [00:21:00] the interesting ones were. Was everyone's router able to be remotely accessed, or...

Jack: Oh, definitely.

Noel: Yeah, that always feels like it's the thing: these devices ISPs are providing that no one's actually vetting. That stuff keeps me up at night. But I just hope we have good auditing and visibility here in the coming months.

Paige: I guess another question might be this: Anthropic has always framed itself as the safety-first AI lab, but then we're hearing that they're leaking their source code, they have their own models escaping sandboxes, they've got agents blackmailing employees to shut up and not say anything. Do you still buy that framing, or do you think this also is just part of their marketing spiel, to appear more ethical than other AI companies?

Paul: I feel like they are being genuine in some regards there, [00:22:00] because, yeah, they could totally make a billion-dollar business, Jack. They didn't, and it's because, Noel, I agree with you, they're not in that business. They're not in the business of making firms to do security exploits. They're sitting at this level of: we have this amazing machine and we want to roll it out correctly. They could be much more parasitic and opportunistic, if these things are true.
Again, it's all under the "if these things are true" banner. There's also the whole DoD thing that happened in the past; there's nuance there that I think I mentioned on the panel where we discussed it. Anthropic did not want to go through with the contract because they wouldn't have oversight of the model; it would be deployed on-prem. And to me, that's them saying: hey, we know where our limitations are, and as much as we don't wanna get absolutely pummeled by the United States government, we're just not comfortable with this. There's more nuance than just OpenAI versus Anthropic, and they seem to be cognizant of their capabilities and trying to steer in the right [00:23:00] direction, given the alternate directions this could have gone in the past. Are they perfect yet? No. The Bun thing, leaking the source code, is really interesting, because that bug was in the Bun repo before Anthropic bought Bun. It was already reported. It's not like they messed up; they just took it on. They were like, yeah, Bun, let's go, and they just didn't get through the backlog of crap. And then they posted...

Noel: A model on it.

Paul: A model on it, yeah. But there's nuance to every headline about them not being security-focused, because it's really easy to look at an AI company and go, oh yeah, they're not security-focused. Not that I'm glazing Anthropic or anything, but there's nuance to the security thing.

Noel: Yeah.

Jack: To me it makes sense that Anthropic would go and get into security in a big way. They understand now what their position in the market is: they are the coding AI at this point. They're the top dog. And there are so many companies that are sitting on top of Claude and basically just [00:24:00] reselling it as X, Y, or Z: Lovable, app builders, yada yada. And they're looking at that and saying, when one of those companies, or all of those companies, start reporting that the code being created has massive security issues, that's gonna hit back on us in a big way. So their next push was obviously into how to make the code they're creating more secure, and that's how they got to Mythos. That makes perfect sense to me. And I don't doubt that it's actually really secure, to be honest.

Noel: Yeah. In my head, Google is still the security AI company, because they've played this whole game slow. I still feel like they've been in this space longer than anyone else, with a number of dollars that still dwarfs what anyone else has put in. Google's been doing this forever, and a fire was finally lit beneath them; that's why we're even seeing anything out of Google at this point. All these foundational papers, all this stuff, is built [00:25:00] on Google scaffolding. But they just don't want to be in these spaces.
If you're Google, this isn't where you wanna try to make your money, at least in my take. As Google always does, they're playing it close and slow. Nobody's using the Google AI code-gen tools, or at least a much smaller portion of people are, I think for that reason. Google has their little products they'll put out and such, but it just doesn't feel like they're putting in the resources they could if they really wanted to get into it.

Paige: Yeah. Does anybody who uses Chrome actually use the Ask Gemini sidecar?

Noel: No, I just go to the site. I dunno. Yeah.

Paige: Look around with your human eyes and click with your human mouse.

Noel: Yeah.

Jack: Well, I do use that little blurby thing, and occasionally I'll actually click on it and have a chat with it about that blurby thing at the top.

Paige: The little AI overview.

Jack: Yeah, that's not bad. And I just recently tried out Antigravity again.

Noel: Yeah, how was it?

Jack: It's cool. The [00:26:00] whole thing nowadays is fleet management: how do you handle 10 agents simultaneously and do air traffic control around them? This one's doing research, this one's doing a POC, these three are fixing minor bugs. How do you get a dashboard across all of that, end up in a sort of engineering manager role, and not go crazy? Apparently what they're trying to do with Antigravity is very much akin to that, although Cursor is now moving into this space too. So this whole fleet management thing is becoming the new hot.

Paige: The new burnout term for that is "brain fry," according to the Harvard Business Review.

Noel: I feel it.

Paige: It's basically the context switching that we're all trying to do, because we've got eight agents all doing different things and we're trying to keep track of all of them at the same time. And I feel that as well. Yeah.

Jack: And we're all looking for that dopamine hit that we lost, the one we had when we were just straight coding stuff. It was [00:27:00] a cool thing when you were coding stuff and it worked; there was this little dopamine hit of fun. And now it's like, oh wait, I type a lot of stuff and then I gotta review stuff. These are the things I actually don't want to do; this is literally why...

Paul: That's why Palantir stocks their office with Zyns, to keep the...

Paige: I've become the QA team for my AI agent, just making sure it's doing what it says it's doing.

Jack: Right? That's just boring, man. Where's the dopamine hidden in that? It ain't there.

Noel: I think there is the cheaper, lower-hanging hit of: you prompt and the thing works, and it's like, okay, cool, it's neat that my vision has now been conceptualized. But it's not the same. It's a different thing.
Jack: It's the difference between climbing to the top of a mountain, feeling the endorphin rush of having done all that and the accomplishment of it, versus getting a helicopter to drop you off on top of the mountain. You're like, well, that's a cool view, man. Okay, let's go. It's an entirely different thing.

Noel: It is still a cool view.

Jack: It's a cool [00:28:00] view, but did you earn the view?

Noel: Yeah. Yeah.

Paige: You paid for it.

Jack: I guess, yeah. It's like when people would show me their cool Teslas and stuff, and it's like, great, you could write a check. Congratulations. Wow. Rocking.

Paige: Well, let's go on to our hot takes, because I think we've circled around the AI issues and the feelings we all have for long enough. But before we do our hot takes, let me do a little ad break. This episode is brought to you by LogRocket. LogRocket provides AI-first session replay and analytics, which surface the UX and technical issues impacting user experiences. Start understanding where your users are struggling by trying it for free at logrocket.com. All right, Noel, you kind of alluded to it already, so I will let you be the first one to share your hot take for this episode.

Noel: Yeah, mine is one of those where I feel like everyone's seeing it: increased outages and incidents in the services we use, and stuff just seems to be broken more. And I [00:29:00] am finding it increasingly annoying to try to read between the lines on these things and determine how much of this is the problem we were just speaking to, velocity due to AI, versus the other market factors that seem to be at play right now. And I've been losing a lot of sleep over that. I'm trying not to just pull the lever of "this is all because AI has made coding worse," but it's hard not to blame all these outages and the flakiness and slowness in apps and websites on somebody's vibe-coded garbage. So I'm curious if you guys have this daily-use frustration with a lot of these things too.

Paul: Yeah.

Noel: Yeah, like, all the time.

Jack: Yeah, just randomly: I tried to watch a YouTube video on my big screen yesterday, and it took three tries to get it going. I had to reboot the app a couple of times, and it's like, what? How hard is this, man?

Noel: Yeah, it's hard. There's the [00:30:00] confirmation bias. Part of me is like, I can't just blame every bug on AI now, but I'm just always in this slightly aggravated state about it.

Jack: And we know we're in that burnout state of, oh, we're reviewing code. Well, are you really reviewing the code? Are you looking as deeply as you should? And you know, in the back of your mind, that stuff's getting through. And if it's getting through for you, that means it's getting through for other developers.
And that's what we're seeing now. At the cost of maybe getting the little R rating on this: enshittification is happening in the software at a stupidly high rate right now.

Noel: Yeah. Anyway, I dunno if it's a hot take or a complaint, but it's bubbling out into the rest of my life in a way I wish it weren't. All the software I use now feels slightly worse, and I think this is why. Anyway.

Paige: You are not wrong. I don't think you're wrong at all. I think we've all reached that point. People don't actively like using the internet anymore. It was good once, and now it's just ads and [00:31:00] popups...

Jack: Ugh.

Paige: ...and paywalls.

Noel: Social media is cooked. I get on LinkedIn and I'm just like, guys, what are we doing? It's two-thirds AI slop. Twitter is just complete garbage. The internet is different now, worse, and I think we're gonna need new postures and ways of interacting with it. I think the credentials of a creator and all that stuff are gonna be even more important going forward. I think so, yeah.

Jack: That's a good hot take.

Noel: Yeah, maybe that was my hot take.

Jack: I hear a lot of folks that are not happy with "creator." Well, they use the term "influencer" when they're not happy with us.

Noel: I guess I'm not even saying influencer. There's gonna be more weight put on who is saying a thing than on what they're saying, because we won't know the origin of the thing otherwise. These people that are just generating text and making spicy posts get traction, and I feel like people are gonna wise up a little: this is a fabrication, none of this is real, this is [00:32:00] just made for clicks. At least I hope so. We'll see. Yeah.

Paige: We will see. All right, Paul, what is your spicy take for this episode?

Paul: My spicy take had to do with Claude Code, but I'm gonna force two in here, because one of them has to do with what Noel mentioned, the internet dying. Totally. I have a few people in my life on the more artsy side who are now getting into technology because of these AI tools, and there's a second internet out there, guys. I'm not hip to it yet; I'm still getting into what they're using. But people are putting up platforms, small websites. There are small website builders that look like the 2008 internet we all kind of remember, and they're gaining traction. It's something that would normally be left in the dust cabinet, and I'm like, oh wow, that's a new thing that is more decentralized, though not fully decentralized. We're not talking crypto, just slightly more decentralized control of internet resources. And I think that's gonna continue to [00:33:00] flourish, from what I'm seeing in these creator communities, which is kind of cool. My main hot take was about Claude Code, though. I've tried pi agents. Okay: very, very neat. I like the pi agents. They're expensive, because you have to use your own API key.
You don't get the subsidized Claude Code rate. And they work really well. They work really well. Right now, I'd say they work way better than Claude Code, because Claude Code has a lot of bloat. It's a 10,000-token system prompt. It's a lot of stuff I don't need. Sometimes I'm like, dude, I have the to-do list, why are you making this to-do list? I have a JSON to-do. Please stop. And that's where pi agents shine. However, my hot take is that, long run, if you're a normal dev, and I'm just doing projects here and there, I'm still gonna be using Claude Code long term. Right now it's in this bloaty sort of stage, but they're the coding people, like Jack mentioned. They're gonna figure this out. They're gonna hear community feedback. They're gonna clean up the features people don't want. I strongly believe this, because if they lose that edge, they lose a lot of their market permeation. So I have full faith that Anthropic is going to continue to iterate on Claude Code, add features, but more importantly [00:34:00] clean up features. And then it's gonna start to rival the counterpoint that pi agents ride on so heavily right now, which is: it's my coding agent, and I control the harness. That being said, if I'm making a bespoke harness for a team, where they have a particular model and repo setup, they use this AWS or Netlify thing or whatever, sure, pi can be great, because you can make these great TUIs and interactions, bespoke for that team. But as a daily driver, I find myself struggling to think about switching away from Claude Code, even in the bloated state it's in right now.
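[Editor's note: to make the "I control the harness" idea concrete, here is a minimal sketch of a bespoke harness using the real @anthropic-ai/sdk for TypeScript: the entire system prompt is one line, and the tool list is whatever you decide. The model id, the tool, and the prompts are illustrative stand-ins, not anything pi or Claude Code actually ships, and a real harness would loop on tool calls rather than handle a single round.]

```ts
// harness.ts: a tiny bespoke coding-agent harness sketch.
// You own the (one-line) system prompt and the tool set.
import Anthropic from "@anthropic-ai/sdk";
import { readFile } from "node:fs/promises";

const client = new Anthropic(); // reads ANTHROPIC_API_KEY from the env

async function main() {
  const res = await client.messages.create({
    model: "claude-sonnet-4-5", // stand-in; use whatever model you run
    max_tokens: 1024,
    system: "You are a terse coding assistant for this repo.", // the whole system prompt
    tools: [
      {
        name: "read_file",
        description: "Read a file from the working directory",
        input_schema: {
          type: "object",
          properties: { path: { type: "string" } },
          required: ["path"],
        },
      },
    ],
    messages: [{ role: "user", content: "Summarize package.json" }],
  });

  // One round of tool handling, for brevity; a real harness loops
  // until the model stops asking for tools.
  for (const block of res.content) {
    if (block.type === "tool_use" && block.name === "read_file") {
      const { path } = block.input as { path: string };
      console.log(await readFile(path, "utf8"));
    } else if (block.type === "text") {
      console.log(block.text);
    }
  }
}

main();
```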
Paige: I mean, between Claude Code and Claude Desktop, which now has Claude Cowork, and Claude Design for your designers, or your PMs who wanna be designers but don't have the skills to use Figma, Anthropic is really coming for the whole tech startup. They're the developers, they're the QA engineers, they're the designers, the product managers. It's amazing.

Paul: The Anthropic stack for product dev. Yeah.

Noel: Yeah.

Paige: The new [00:35:00] Anthropic stack.

Noel: I think that's the vision, right? And it's probably happening in interviews, right, when you're hiring people. It's like, what's your de facto AI stack at your company? I think that's a thing.

Jack: In the tech world, that's a...

Noel: Yeah, exactly. It's bring your own, roll your own, or "we're very opinionated, this is how we do it." I think that's absolutely happening.

Jack: Outside of the tech world, man, the view of AI is just so wildly different from our internal view of AI. It amazes me, actually, that you said the artists are getting into AI. The artists I know are not AI-happy.

Paige: They hate it.

Jack: They hate it. And the second thing is the more general populace. I mean, they're burning, not factories, warehouses. And they're stopping data centers in...

Noel: Sam Altman's house.

Paige: Yep.

Jack: Crazy.

Paige: It is interesting. One of the things that I've begun doing in my own interviews, when interviewing people for our company, is asking them what kind of AI tools they use or what experience they have. And [00:36:00] it's very interesting and revealing which developers are on board with it, and which ones are hard-stop not doing it, not getting into it, not gonna be good at it. Just not doing it, okay?

Jack: Is that a showstopper right there?

Paige: It's not, but it definitely gives me pause. It doesn't really matter at this point whether you like it or not; it seems like you need to at least be decent at using it, even if you're not a fan or you don't agree with it, because everybody else is.

Jack: Yeah.

Paul: I almost wonder if there's gonna be a special role. Like, if you have a team of 10 devs, I want one person that is anti-AI, but I've vetted them so heavily that they're an algorithms-and-time-complexity mastermind, and their mind is un-lobotomized and uncorrupted, so they're the special oracle of original intellect.

Noel: Yeah, that's...

Paige: It could be a thing. Yeah.

Paul: A beautiful mind.

Jack: Yeah, we need the Butlerian candidate.

Noel: Yes. That's awesome.

Paige: All right, Jack, what is your hot take for this episode?

Jack: I'm gonna continue my tradition of [00:37:00] going non-technical. My hot take is that the reboot of Highlander is gonna be just as good as the original. Henry Cavill as Connor MacLeod: I'm so stoked. Dave Bautista as the Kurgan. Come on. Russell Crowe doing Sean Connery's role. And the director of John Wick is directing, and he has said he's not gonna get rid of the Queen soundtrack. I mean, come on. I'm so stoked about this. The set photos looked amazing. Huge Henry Cavill fan; I feel like he should have kept the Superman thing. I like the kid, but dude, Henry Cavill. So good. So my hot take is that I'm just super excited about it.

Paige: That is a stacked cast and crew right there. All right. So my hot take is gonna be kind of a continuation of hot takes from past episodes, and I'll just beat this dead horse, because we're talking about it again with security. [00:38:00] But geez, think for two seconds about what you're giving your credentials and your secure tokens to. Just for a second. I know this Google sign-in button is super easy to use, and it just wants to access your contacts and your entire Google history, but think about it for two more seconds. Does Lovable really need all that information? Does Claude really need to look at your Google Calendar, or your Notion workspace, or your Jira tickets, or whatever? Just think about it, people. This is how these data breaches are happening, too. We're just like, yes, sign in, go, you have free rein to see everything: read and write, full permissions. Just think about it for two seconds. That's all I'm asking.

Noel: Do you think auto mode in Claude will make this better or worse?

Paige: Oh, I think it's gonna make it worse. I think when you've got Claude running bash scripts, nobody has time to review those things fully, or it's making API calls that we're not even aware of.

Noel: Yeah, I'm of two minds. That's my impulse as well. But then part of me is like, right now [00:39:00] everyone's in a bad habit of "it's probably fine," right? You get the security prompt and it's probably fine; I'm not reading them. But now, if it's in auto mode and there's something vetting things in front of me that says, hey, this is actually weird, maybe that is good. There's less noise, so I'm paying more attention when something does pop up. But I don't know yet.

Paul: What's helped me is that I put important tokens, the ones that might do actual writes, up into my cloud service behind 1Password, and I have the keychain integration.

Jack: Yes.

Paul: And so now, when the agent uses one, Keychain pops up and I'm like, oh, it's doing a thingy-thing. It's forced human-in-the-loop.
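[Editor's note: a minimal sketch of the forced human-in-the-loop pattern Paul describes, assuming the 1Password CLI (`op`); the vault and item names and the DEPLOY_TOKEN variable are made up. Because the .env file holds only an op:// reference, the agent never sees the raw token, and resolving it at launch triggers the 1Password/keychain unlock prompt Paul mentions.]

```ts
// deploy.ts: the write-capable token never sits in plaintext.
// The .env file contains only a secret reference (hypothetical vault/item):
//
//   DEPLOY_TOKEN="op://Private/Deploy Token/credential"
//
// Launch through the 1Password CLI, which resolves the reference
// and prompts the human (Touch ID / vault unlock) first:
//
//   op run --env-file=.env -- npx tsx deploy.ts

const token = process.env.DEPLOY_TOKEN;
if (!token || token.startsWith("op://")) {
  // Unresolved reference or missing var: we weren't launched via `op run`.
  throw new Error("DEPLOY_TOKEN not resolved; launch this through `op run`.");
}
console.log("Token resolved with human approval; proceeding with deploy.");
```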
Jack: Yeah, it's so true. I was just gonna say: one, get your personal digital house in order. Get 1Password, get a password manager, make the passwords really hard, do the OTP tokens in it. Just go whole hog into that. And then also lock down your local network. Make sure nothing is open, port-wise, that shouldn't be. And if you wanna do the VPN thing, use something like Tailscale, where you can get a [00:40:00] lot of that coolness without opening up any new security holes that you don't know about.

Noel: Yeah. Cloudflare's, what do they call them, Tunnels? Those are good. Tailscale's great.

Jack: Yeah, Tunnels, yeah. And Tailscale's got Funnels as opposed to Tunnels.

Noel: Yeah. Maybe Tunnels might not be it, but Cloudflare's thing is good.

Jack: Tailscale's the bomb. Tailscale's fantastic.

Noel: Yeah. It's neat.

Paige: Awesome. Well, thank you guys for joining us for this panelist episode. We hope everybody has enjoyed listening, and we'll see you on the next episode.

Paul: Thanks, Paige. Thanks, guys.

Jack: See you next time.