December Panel - AUDIO EDIT ===

Josh: [00:00:00] Welcome back to PodRocket, a web development podcast brought to you by LogRocket. LogRocket provides AI-first session replay and analytics, which surface the UX and technical issues impacting user experiences. Start understanding where your users are struggling by trying it for free at logrocket.com. I'm Josh Goldberg, and we're back with our panel episode, where we'll be talking about AI platform consolidation, open source independence, and the true cost of AI infrastructure. But before we get into it, let's introduce and welcome our panel. First off, we have Paige. Paige is a staff software engineer at Blues and co-host of the Front-End Fire podcast. Paige, how are you doing?

Paige: I am doing good. I'm glad to be here. Excited to talk about this stuff.

Josh: Yeah. Next up we've got Jack, who is the Blue Collar Coder on YouTube and also the co-host of the Front-End Fire podcast. What's up, Jack?

Jack: Hey, happy to be here. Happy to talk AI. Happy to talk about Subaru Connect, all that stuff.

Josh: Heck yes.

Jack: Oh, let's go.

Josh: And last but not least, we have Paul, PodRocket host and YouTuber. How are you doing, Paul?

Paul: [00:01:00] I say YouTuber 'cause I don't make a lot of videos, but they exist. Yeah, excited to dig in, especially some of the recent Bun stuff going on that we're gonna chat about.

Jack: Oh, yeah. Bun getting acquired. Ooh. Okay. Does it make for a new open source acquisition model?

Paige: I like it.

Jack: I think...

Josh: For context, the implications here we're talking about are all from Anthropic, the AI company, acquiring Bun, which is a JavaScript runtime. So let's start off with just first impressions.
What do you all think? What does this mean for the industry?

Paul: Smart buy, then.

Paige: Surprising, I think. Or at least I felt like the people that I know that talk about this kind of stuff, like Jack, like Paul, didn't really see it coming. But it also makes total sense once they gave a little bit of an explanation, because Bun apparently is powering Claude Code, which is Anthropic's flagship product, which has already reached, I think, a billion dollars [00:02:00] in annualized revenue. ARR, it's one of those acronyms. But anyway, it's making a lot of money, so it does really stand to reason that if they wanna continue to build on that and continue to use Bun as the driver underneath it, they would bring it in-house. That actually makes a ton of sense when you think about it.

Jack: And also it's what they're putting on top of Bun in terms of building applications, right? If you're marketing to a company where you want the HR person to use Claude to help build forms internally to go and, I don't know, sign up for the basketball court or whatever, having the ability for Claude to just go and build on top of a Bun instance, something that just works right out of the box, connects to all the databases, which Bun now does directly without any external packages. It can create executables, it can run websites, all out of one thing. That's a fantastic build target. And [00:03:00] the person who's using Claude Code would have no idea that it was JavaScript under the hood. It's just a thing that makes a basketball court reservation form, boom, done. That kind of thing.

Paul: Yeah, I also think it's risk management a little bit.
I always find it wild when a big company buys some project. It's like, okay, here's this vast Pandora's box: A, now you have to fix it, or B, they don't buy it, and maybe they fund the person, they fund the project, and they just hope the lights stay on. This really feels like a lifting of the curtain a little bit, and Anthropic's like, we want a secure platform to ship this to non-devs. Like you were saying, Jack, how are they gonna do that? They need something that they own front to back that they can bottle up and let the HR person use to sign up for the basketball court.

Jack: Right. Yeah. And Bun has been very open about it. They got the $7 million, I don't know, three years ago or so. And even at the time you were like, why [00:04:00] did they get 7 million? For what, for Bun? What is that about? And they haven't been hiding the fact that they have no idea how they're gonna monetize it. They had no service, they had nothing. And so, yeah, to your point, if Anthropic is building on top of that and they look underneath the covers and they're like, hey, you guys could fold up next month and we'd be out. So yeah, we're gonna buy you. Thanks.

Paige: Yeah.

Jack: The question is, where does it leave us, though? As Bun consumers outside of Anthropic.

Paige: Yeah, that's always the question when something that starts out as open source gets acquired. But at the same time, this to me is a good omen for
other people who are in open source, or who wanna build runtimes, or who wanna build something that doesn't necessarily have a direct path to revenue, kind of like VoidZero or some of the other runtimes that we've seen pop up in the not-too-distant past. If you can get some big company depending on you, then maybe that's the thing. Maybe you don't have to become a cloud hosting company and [00:05:00] expand from your roots of being a JavaScript runtime into that. There are other options to exit or to get acquired and keep building the thing that you're building and make it better, and have the funding and the resources and the team that you need to do it.

Jack: Yeah. That's huge.

Paul: Yeah, I really wonder what it means for us as Bun users, because as a vetter of Reddit opinion every morning as I get out of bed, it's interesting to see the spread of opinion, from everything is gonna be ruined to now this is gonna make Bun the thing. I guess we'll see it settle. But it's interesting to see how visceral the spread is, how spread it really is. Yeah, we'll see.

Jack: I think it's gonna be somewhere in the zone of, it's going to be hard for them to justify the stuff that bears no relationship to anything that they do at Anthropic.

Paul: Oh yeah.

Jack: Right? That's where the rubber starts to hit the road. Like, as an example, Meta, they don't even use RSCs on [00:06:00] Facebook, which was always kind of a knock on RSCs: React should be something you use on your own site, right? So if it's not integral to Anthropic's vision, then that's a problem. But otherwise...

Paige: Vine?
Jack: I think most stuff that Anthropic wants to do is stuff that we wanna do. So there's that. Maybe it's okay. I wanna make an executable that can put up a website about basketball courts. Sure, what the heck?

Josh: Yeah, I think RSC is an interesting example, because it's owned by one or two companies, or mostly one framework in this case, React, adding something that is actually primarily not for one of those companies. Surely having some large open source project like Bun, which is in many ways community driven and community used, kind of pressures Anthropic to invest in the community side of things. That would be a pretty big downside for them to lose.

Jack: Yeah. I'm curious, though. I'm on the TanStack team, that's part of what I do. We all use React Query, and now TanStack Start, which is getting a lot of great play because Next.js and yada yada. We don't have a way out, [00:07:00] right? We don't wanna make a service. We don't want to go and get VC money, because VCs are just gonna try to get us to make a service, right?

Paige: Yeah.

Jack: Is this another way out? Is this another way to fund TanStack more permanently? Getting some megacorp onto TanStack Start and having them basically back TanStack. I'm just curious. Not that I'm saying we're gonna do that, but would you have positive feelings about that, or would you have reservations, or both?

Paul: TanStack's an interesting one, because it's so widely used by a particular flavor of
developer. I feel like there's a tribe around TanStack, whereas Bun is like America, the melting pot of what can you do with this. With TanStack, I'm like, this is my data-fetching layer, and everybody here who's talking about it is thinking about this one thing. So the conversation is scoped, the objectives are scoped. I'm not saying they're cleanly scoped, but go onto the issue list: if you were to embed all of them, they're gonna be closer [00:08:00] together in meaning.

Jack: Yeah, for sure. Okay.

Paul: So that gives you guys some forward trajectory. I'm not sure that Bun can take the same approach to community-led development and pushing that forward. I don't lean one way or another personally, but I'm just so happy that TanStack's community is vibrant. And I think that really shows how important the community is, no matter what it is. That's the first thing I look for when pulling in any new technology: how many issues, and how fast do they churn?

Josh: Yeah. Jack, which specific big tech company are you selling TanStack to? We'd like to know.

Jack: One of the really great newsletters, I think Bytes, he's always got some good humor in there, went over the Bun acquisition, and it was like, congrats on being newly wealthy, Jarred. And then he said, and premature congrats to Tanner for getting bought out by OpenAI.

Paige: I saw that.

Jack: Yeah. Not happening, guys. There's no conversation. Bytes doesn't know anything. They're just having fun.

Josh: This is an enjoyable topic. Let's move on to one that's a little more controversial.
At around the same time that we [00:09:00] saw the Bun announcement, we also saw an announcement from Zig, the programming language. Zig is publicly quitting GitHub as a code host, not just GitHub Actions, but GitHub altogether, and they're moving to Codeberg, another code host. They're blaming Microsoft's increasingly AI-driven focus for degraded developer and user experience, as well as instability and, quote, inexcusable bugs. At the same time, Codeberg promptly got hit with a DDoS attack and went down, but is now back up.

Jack: I love it.

Josh: Yeah, perhaps it was coincidence. And now it is back up, and you can actually look at the Zig repository on GitHub, which now just points in the README to the Codeberg one. So let's talk about what Zig's departure from GitHub signals. To start: Microsoft's AI-first strategy. Is it fundamentally at odds with OSS maintainers? Is it causing problems in the product? What do we all think?

Paul: Well, the first problem is Microsoft tried to invent something new. They shouldn't; that's outside of their area. That's the first mistake. The [00:10:00] second thing is, yeah, a lot of people don't want it forced down their throats. If you're gonna invent something, you need to talk to your users. Again, they tried to invent something. There's mine. I dropped the mic.

Jack: Yeah. Right. Bang. Hard to follow that.

Paul: Just buy things and then charge way too much money for them.

Jack: Let me play the counterpoint. I've noticed no degrading in just the basic GitHub infrastructure that I've seen. And I did recently try it: GitHub Copilot was like, hey, I see you made an error here, would you like me to fix that for you? And I'm kinda like, eh, sure, whatever, go for it.
It was a pretty laborious thing, and it was very pedantic about doing it, but it got it done. I just wanted to experience it. I'm not sure I'd do it all the time, but I don't see anything wrong. It's fine.

Paul: Yeah, it works fine.

Paige: I'm kind of in the same boat as Jack. I've never experienced the degradation that Zig talked about, with GitHub Actions getting stuck or getting queued up in runners for hours [00:11:00] or forever. But at the same time, I do very much understand where they're coming from, of, I don't want AI shoved into everything and every platform, which it is at this point. I cannot open a SaaS company's platform and not have them trying to get me to use their AI copilot or helper or whatever you call it. And I ignore them almost all the time. So yes, I think that if you don't want to be part of the AI revolution, or you're just like, that's not where we're going, or that's not our core skill set, I can absolutely see how that would be really irritating to OSS maintainers who just want the basics that worked well before, and they're not getting any funding or any attention from any of the dev teams who worked on them previously, because they're all being shifted toward AI everything. But at the same time, I'd never heard of Codeberg before now, so I didn't even know that that was an option or where I would go if I decided to leave. I think GitLab, or [00:12:00] Bitbucket, are really the only options that I knew of besides GitHub for repo hosting. So yeah, I don't know.

Jack: In comparison to Bitbucket, I'll take GitHub all day.

Paul: Same. And I also haven't experienced any degradation issues whatsoever.
I definitely think we're talking about two separate things. There's the quality of the service, and whether there are issues with their service because of attention. And then there's how they are productizing this new thing they're trying to invent and build as an experience for developers. These are two separate things, and Zig is kind of lumping them into this one argument of, now they're bad. Okay, well, the service is okay. We all like the service. But it's the AI thing where it's like, guys, Microsoft, please put your foot on the brake and figure out how to correctly roll this out to people. AI works. You don't need a magic wand to make it read a GitHub Actions log and give you something useful. You guys aren't doing anything super special. The special piece is understanding your users and understanding how to deliver your product to them, and that was missed. And if that's missed in tandem with, [00:13:00] oh, well, there's this other basic thing that's broken on the repo that really should just work 'cause it's part of the Git spec, that feels really bad. And I'm sure people don't feel heard.

Jack: I'm not Microsoft, but I will come to the defense one more time. Zig is an interesting case, because they're not getting the AI PR flood that a JS library or a TS library is gonna get, right? And if Microsoft can, and I think they have tools around this, kind of de-AI-ify a lot of these PRs that are coming in for the front-end libraries that are gonna get a lot of AI PRs, I think there's value there, right?
I don't know if Codeberg or whatever has those kinds of features, but for that class of library, that would be really important.

Paige: Yeah, Zig has taken a really hard stance against AI. A friend of ours just talked about this this weekend. They apparently have put into their docs: no LLM-submitted [00:14:00] PRs, no LLM-written code, no LLM README updates. They do not want AI-generated anything near their repo in any way, shape, or form, which feels tone-deaf in this day and age. Like, how are you even gonna figure out what was written by an agent versus what was written by a human? But also, the agents can be helpful in a lot of instances. They can do documentation that we're too lazy to write up. They can do tests where we don't wanna cover every single test case or potential failure state. So I understand where they're coming from, and I think that probably other people are going to follow suit. But at the same time, if you don't use these tools and other people are, you're also potentially going to be left behind, because you're so against it in every way, shape, and form.

Jack: Yeah.

Josh: Do we think that this has implications, or is an indication that we're reaching a breaking point, where open source projects might want independence from, say, big-tech-owned [00:15:00] platforms, or ones with a lot of AI in them?

Jack: I don't think so. I think it's just a one-off. That's my take. An interesting one-off.

Paige: I mean, Codeberg, like I said, I'd never heard of it before now. I think that GitHub is such a dominant player in the market that even though it's owned by Microsoft, and even though people might not appreciate that, if you're anywhere online in development and code, you're probably gonna be on GitHub, and not being there is probably gonna be a big miss for you.
Josh: Yeah, as an open source maintainer myself, I have felt some of the bugs that Zig has reported in GitHub Actions, and personally grumbled about how, if they weren't focusing on AI, they probably would've fixed them. But it's not gotten to the point where I would in any way be able to move off. So props and shout-out to them for at least trying it. They're standing on principles, and those principles are hard to stand on these days.

Jack: Yeah, I'm just looking at the Codeberg site, and I gotta say, not putting any kind of berg tie-in into Codeberg is just a massive miss. Come on, guys. Seriously, it's a pretty lame site. Whatever.

Josh: They should definitely have some bergs. [00:16:00] Let's add a little more fuel to this conversation. Shai-Hulud came up again. So for context, Shai-Hulud is a lovely npm worm that's infected quite a few packages. It's exploiting GitHub and npm infrastructure, or at least it was, and compromised apparently over a thousand packages this time. While it's not directly tied to AI, this is another example that folks in the industry are pointing out: because GitHub and Microsoft and their own npm are investing so heavily in AI, they're perhaps drawing resources away from the teams that would be able to more directly defend against these sorts of attacks. Together, these stories are starting to raise a central question that we're on track to discuss: are we becoming too reliant on platforms whose priorities may or may not align with OSS communities? What happens when these platforms get bought, say, GitHub by Microsoft or Bun by Anthropic? So let's start talking about Shai-Hulud. Does this worm highlight any sort of systemic fragility? What are the kinds of lessons and learnings we can take away from it coming back again?
Jack: Oh yeah, for sure. [00:17:00] It shows a vulnerability in the whole trust model of npm, and they're working on that. We just ran into this at TanStack, with a whole bunch of stuff around the OIDC requirements for publishing now, which really hardened the level of stringency about where you can publish a package from. The credentials don't last very long. They're really trying to cut down on this kind of thing. There are other vectors, of course. People could just publish a bad package and try and get people on it. There's that. But yeah, I think it's more of a systemic vulnerability in npm, first and foremost.

Paige: Yeah, we ran into the same issue over at Blues, because we have a number of SDKs that we publish to npm. I support how GitHub tried to rectify the situation, which was basically forcing everyone over to the OIDC trusted publishing model, but they have pushed it out so quickly that there are many [00:18:00] use cases they cannot support, at least initially out of the box. For instance, we have both production versions of our npm SDK that go out and beta versions for the dev team to test against, and those are two separate scripts. But npm currently only has the ability to put in one script as the trusted publisher. Otherwise, you have to use personal access tokens, which are now set to expire very quickly. So I was very fortunate to be able to figure out, with the help of the GitHub community, how to use GitHub reusable workflows, which then allowed me to call the two different scripts that we have for our publishing.
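[Editor's note: the reusable-workflow pattern Paige describes might look roughly like the sketch below. File names, inputs, and package details are illustrative assumptions, not Blues' actual setup.]

```yaml
# Hypothetical sketch: one reusable publish workflow is registered with
# npm as the trusted publisher, and both the production and beta release
# workflows call it with different inputs.

# .github/workflows/publish.yml -- the single workflow npm trusts
name: publish
on:
  workflow_call:
    inputs:
      dist-tag:
        required: true
        type: string
permissions:
  id-token: write   # lets the job mint the short-lived OIDC credential
  contents: read
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 22
          registry-url: https://registry.npmjs.org
      - run: npm publish --tag ${{ inputs.dist-tag }}

# .github/workflows/release-beta.yml -- one of the two callers
# (a release-prod.yml caller would pass dist-tag: latest instead)
#
# jobs:
#   beta:
#     uses: ./.github/workflows/publish.yml
#     with:
#       dist-tag: beta
```

With trusted publishing configured, a recent npm CLI can publish from that one workflow without any long-lived token, which is what makes registering a single reusable workflow, rather than several near-duplicate ones, attractive.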
But I know that for other people that won't work, for whatever reason. And the fact that they didn't have any sort of documentation or interim workarounds you could use, so that you could be safer but also [00:19:00] not just wreck your whole publishing setup if you have a custom one, which I'm sure a lot of people do. Yeah, that really felt like a miss, and that's something that Microsoft and the GitHub team should have put more effort and more devs into if they were going to force this turnover so quickly, to help people get back into a good workflow. Because it really sucks when suddenly your stuff isn't publishing for no other reason than your token expired again.

Josh: Yeah, my create-typescript-app project, I just haven't bothered updating it. It's currently broken from all this. I'm waiting for the second wave of complaints to come in to learn what the actual correct strategy will be.

Jack: And how quickly are folks going to just go, you know what, I'm just not even gonna publish out of CI/CD anymore. I'm just gonna get a PAT and publish out of that.

Paul: That's how I do it, because I'm at the point where I have so few packages that I publicly publish. I have like three, so it's just, this is too much work. I have things to do. I'm publishing from my machine right now.

Paige: And that's going to open people up to more security vulnerabilities, [00:20:00] not less, when you're manually just YOLOing it and pushing from your machine.

Paul: You're such a vulnerability. Do you have any idea where my personal access token has been? Oh gosh.

Jack: So the other thing with Shai-Hulud is, one of the things it's doing is going and roaming around our machines looking for environment variables.
And that's a pain in the butt, right? If I'm using some SaaS services or AI keys, things like that, now what am I gonna do? They've got the 1Password stuff, but now you've got this weird flow where you gotta inject envs, and it's like, ugh, it's such a pain. I'm kind of running a little bit loose, and I've got .env files around, and if I get hit, well, I'm gonna have to go and just refresh those tokens, assuming that I know that I got hit.

Paul: Well, a little off topic, but we are in the day and age of one-stop reusable software, right? I'm sure you guys have just pumped something out. I've been a huge fan of Doppler as my secrets manager [00:21:00] for teams, because you can make these workspaces, invite people to a team, and then everything's managed in the cloud. Nothing's local. It boots up a subshell, and then there are no .env files. If I update something, all the devs get it immediately. Now I'm just like, I don't wanna pay $20 a month per dev. I can make my own Doppler. This will take me two days, and it'll be hosted on Fly.io or whatever. But the general notion of not leaving .env files around, I think, is good. I've liked it. It makes me feel a little more secure.

Jack: You'll have to teach me what the flow is, 'cause I can do it in 1Password. There are CLIs for it that'll grab it out of the vault or whatever and put it transiently in the shell.

Paul: Right. Yeah, that's kind of how Doppler works. But it's like a registry. It's like a HashiCorp Vault, but with tooling.

Jack: Yeah, it's just getting the LLM to realize that, like, hey, when you run...

Paul: That's tough. Yeah.
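[Editor's note: the "no .env files on disk" flow Jack and Paul are describing can be sketched as below. The `fetchSecrets` function is a hypothetical stand-in for whatever vault tool you actually use (1Password's CLI, a Doppler subshell, etc.); here it's hardcoded so the sketch is self-contained.]

```typescript
// Sketch: pull secrets at launch and inject them only into a child
// process's environment, so nothing sits in a .env file on disk for a
// worm like Shai-Hulud to scrape.
import { spawnSync } from "node:child_process";

function fetchSecrets(): Record<string, string> {
  // Assumption: in real use this would shell out to a secrets manager;
  // hardcoded here purely for illustration.
  return { API_KEY: "transient-demo-value" };
}

// Run the actual workload as a child process with the secrets merged
// into its environment only for the lifetime of that process.
const result = spawnSync(
  process.execPath,
  ["-e", "console.log(process.env.API_KEY)"],
  { env: { ...process.env, ...fetchSecrets() }, encoding: "utf8" },
);

console.log(result.stdout.trim());
```

The same shape is what `op run` or a Doppler subshell gives you for free: the secret exists in memory and in the child's environment, never in a file the agent (or the worm) can read.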
You really gotta put it in your CLAUDE.md and be strong about it.

Paige: Don't look.

Jack: Yeah, you really need: you don't have .env files, guy. I love doing the Iron Man thing of talking to the LLM about some plan [00:22:00] and being like, okay, I wanna build this thing and this thing and this thing, but I'm gonna go off and get some coffee now, and by the time I get back I want this done, that kind of thing. And if it messes up on the .env files, I'm like, ah, dude, it was right there. Ah.

Paige: I can't let Claude off a leash, personally. I love to plan with it and have it make a plan and make a .md file or something that it can follow, but I still won't let it just make decisions after that, 'cause it inevitably gets off track and does something wrong or goes down the wrong path. And if I'm not watching it, it'll get pretty far down the wrong path before you can course-correct.

Paul: Maybe you're not sadistic enough in how you scare it into staying on track.

Paige: Apparently not mean enough to it.

Paul: No, the trick is to...

Jack: There's a building, it's on fire, and all the people in the building are depending on you to get this done.

Paul: It is weird. Threatening human death is actually very effective. I recommend it.

Jack: What have we gotten to? I don't know.

Josh: There were some articles recently about how the more stern and critical you are with your AI, the more [00:23:00] successful the results.

Paul: Well, that's being challenged with Claude 4.5, 'cause it's a little bit more conversational. It's a little bit better at understanding natural language. And so that means you have to be even less mean. You're like, look, there's this, there's that.
Maybe it'll run on an airplane. Maybe if it messes up, people will die. I don't know. That works. But if you're like, somebody will get shot if this doesn't go well, that doesn't work anymore.

Jack: I've started doing the thing where I congratulate it. Like, ooh, you did a really good job on that section over there. I really like that. I was talking to the guy from Infinite Red, 'cause he did this as well, and he actually had a thing where he was like, I think this matters because you're basically reinforcing what you want to see. So yeah, be positive, be engaging, be supportive. Ooh, you did a good job.

Josh: Now that we've transitioned back to AI, let's talk about AI investments. One more topic before break. We've got a lot of stuff getting pushed out in the news about companies investing in AI. There's a lot of AI hype that's still very strong, but we may or may not be starting to see cracks show in the [00:24:00] economics supporting the game. There was a really interesting article and discussion recently with IBM's CEO, where he stated that there's, and we're quoting here, no way the industry's trillion-dollar AI data center spending could possibly pay off under the current infra costs. So we've got power, GPUs, cooling, real estate, networking, the whole stack. He's arguing that the current AI build-out resembles past bubbles, where a lot of capital investment outpaces actual realistic returns.
This perspective kind of resonates with a broader sentiment in some engineering circles that we're within the Gartner hype cycle, perhaps entering the trough of disillusionment. There's a lot of demand, but so are the costs, and the unit economics for inference and training are pretty challenging. Meanwhile, we're still seeing a lot, and I mean a lot, of money, staggering amounts of money, being burned by top AI players like OpenAI, Google, Anthropic, Meta, and so on. So there are a lot of questions abound about sustainability, consolidation, and how the actual cost structure can affect downstream developers. That's a lot to talk about, so let's [00:25:00] discuss. How are AI companies doing? Are they hitting a wall? Are we seeing infra costs exceeding value creation? What do we think about the short to medium term in this context?

Jack: Their costs are clearly exceeding their value creation, for sure, and they have been for a while. OpenAI or Anthropic, I think, was at like a three-to-one subsidy around some of their models, where you pay a buck to them and they're paying three bucks to actually run the thing. That's not sustainable. That's when you're in customer growth mode, looking for folks to onboard and spending more than you're gonna get from them. I don't know why necessarily they're still in that mode, but they actually reduced the costs on Opus, which is pretty interesting. Maybe that's actually based on infrastructure. I don't know. But yeah, it's scary. The frontier models are scary.

Paul: I also feel like, saying they're not breaking even, or that they're so far above costs: you totally see it. And you also get that feeling, I'm sure, if you look at the new Chinese hardware coming out, which is very impressive in the amount of computation those chips can do.
~Um, ~but at the [00:26:00] same time I get this feeling of ~like, ~if anybody asked the US government, when they told all the settlers to go claim all the land that was rightfully theirs,~ um,~ so they could get a few acres. If you looked at that investment, you would go ~like, ~hey, they're losing a lot of money. And ~it, ~it took a long time for the crops ~to, to be, ~to be pulled, ~um, and, ~and for the communities to grow. And I'm not just talking about the farmer making money to then pay taxes. I'm talking about a town being built. ~Um, ~like ~it, ~it took a hundred years for that town to get built. And every time I'm like, oh, the AI bubble, 'cause there's just a thousand videos on YouTube about this, like new ones every day, I catch myself. I'm like, wait a minute. Anthropic is kind of setting us out as settlers and figuring out how to use the land that they're giving us, and that's gonna take time to grow into a city that then has a sticky nature to it, that it's encompassing ~a, a, ~a piece of ~a, a car, uh,~ commerce. I am in no way educated enough to know if that matches up to the current spending or trajectory of how these people are thinking about it. But there's an element of that that I think gets ignored a lot ~in, ~in the discourse. ~Um, ~and it ~kind of ~brings me back to, yeah, you [00:27:00] wanna talk about dictatorships. I think this is where ~their, ~their brain is thinking a lot, like, how can we own work? I know Josh is looking stressed out over here, but that's what they want to do. They want to buy us all. Anthropic is not your friend. Anthropic does not like you, and Anthropic wants to take a portion of your earnings and capitalize privately on it. ~Um, ~I don't see how this is any different. I think it's a calculated move, even though as normal people we're like, what? Maybe that should be a foreboding warning of ~like ~what they see a little bit.
Paige: Yeah,~ that's,~ that's actually a really good point that I hadn't considered as thoroughly, but it makes sense when you think back to what tech companies have done in the past. What were social media companies set up to do? To connect us? No. They were there to data mine us and get all of our personal information. And if we're not paying enough, even if we're paying something to these AI companies, that means that the other thing that they're getting from us is tons of information and training data for their models. So ~I, ~I agree, though. I think that there's so much weird math that's [00:28:00] happening in the investment circles around AI, with Nvidia and OpenAI, and Gemini and Google and Anthropic and all of these, where they're all agreeing to invest ~tons of data, ~tons of money into data centers, but only if they start supplying this much amount of power, or they're serving this many people, or they're reaching these revenue goals, and it's all like, how does this all actually shake out? When we take away all the incentives and the pledges and the promises, where is this money actually coming from, and who's going to really supply it at the end of the day? And I think that's the thing that I always struggle with. There's all these guarantees or handshake agreements for support and new money and investment, but is it actually happening, or is it going to happen? Or is it just gonna be one of those things that they say, and there's a press release, and then we never hear about it again?~ again?~ Paul': ~Fund. ~ Jack: ~it's,~ Paul': ~Sorry, Jacka. ~ Jack: It's circular, right? ~It's, ~it's, ~you know, it's, uh,~ Nvidia that's, ~you know, ~investing in Anthropic, which is then in turn buying Nvidia hardware, and [00:29:00] it's just this cycle, everybody in that cycle. You wonder, is there new cash coming into this externally?
Paige: Mm-hmm.~ Mm-hmm.~ Jack: ~Or you know, ~or are they just creating an artificial demand that then they can say, oh ~look, ~look at all the demand? ~Well, ~you're creating the demand for all these GPUs. Paige: Yeah. Is there demand outside of the massive tech companies that are all investing so heavily in this? I'm not sure. I don't know what it actually is when you take them out of the equation. Jack: ~Right. ~And the fact that, what, 3% of all,~ uh,~ Microsoft Office users who have access to Copilot actually use it? Paige: Exactly. Paul': I also just think we're not, as a society, using AI the way it's gonna be used, and we all agree on that,~ right,~ too. It's like we're not using it how it should be used. And developing an AI system right now myself, it's like, oh, we thought people were gonna ~like, ~wanna chat and ~like, ~make artifacts. And it's like, no. ~Like ~AI is there to decide, ~like ~if this, then that. And then the person's just using an app. It's just like, oh, it did the right thing, you know? That's a micro AI, like in a little junction, and we're gonna see [00:30:00] thousands of those proliferating out. So it's like, is there gonna be GPU usage? Maybe not on ChatGPTs, but ~like, ~come on, it's gonna go everywhere. This stuff's gonna live on your devices. Sure, it can live locally in some regards. Yeah, I just, I really think ~the, ~the,~ we,~ we are going out into the frontier, and we don't even know ~what the city,~ that there's gonna be cities yet, but there're gonna be cities, ~type of ~type of thing. ~Um, ~and on the circular thing, Jack, this brings me back to the idea of, we're all in a casino. There's a house, and they're shipping all these chips around, and we're like, wow, that's a lot of chips. But at the end of the day, the casino's the one who has all the chips.
Jack: You are not Paul': getting the chips. You're not getting the chips. But in the shuffle, ~you know, ~you know, what they get is a portion of your income,~ uh,~ a portion of your ~delta, your, your, um,~ power delta over an organization. And that's what's diminishing. Like, five years ago, you could go to your employer and go, I need this much more money, ~and your, ~and your strong-arm ability was quite large. I'm sure people already feel that diminishing. It's gonna diminish more, and that's what Anthropic is capitalizing on. And it's [00:31:00] like, in the mix of them doing their chips thing,~ we're,~ we're not noticing. Jack: Yeah, I completely agree about that. And I also agree about ~the, the, ~the small models thing. That's something I've been on about for a couple years now, or since last year. Those 3B, 7B models that you can run locally in Ollama, ~you know, ~on super commodity machines or on your phone, ~are just gonna, and,~ and make those small decisions and do those little, small, very finely tuned things, that's gonna become huge. And it's interesting to me that, like,~ like, I don't really see, I see huffing, I,~ I see Hugging Face in that space. I see, ~you know, ~smaller model creators in that space. But I don't see OpenAI or Anthropic doing anything in that space to help us build out smaller models that ~are, ~are tunable. ~So that's kind of, yeah,~ I mean, Meta, I guess, with their Llama models that you can train. But they're still big. Paul': ~I mean, ~it's wild. You can pack 80% of human knowledge into a four-gigabyte binary. That blows my mind. Yeah. Jack: That's nuts. And ~it, ~it still knows, ~you know, ~the random, ~like, you know, ~who is this, ~You know, ~Julius Caesar's [00:32:00] whatever. ~Well, you know, ~I was like, okay, that's that. And you're like, wow, what the heck? That's impressive.
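Paul's four-gigabyte figure and Jack's 3B/7B sizing square with simple back-of-envelope math: a model's size on disk is roughly its parameter count times bytes per weight, which is why quantization is what makes these models laptop- and phone-sized. A rough sketch (the numbers are order-of-magnitude estimates, not exact file sizes for any particular model):

```javascript
// Back-of-envelope model sizing: parameters × bytes per weight.
// These estimates ignore tokenizer files, metadata, and per-format
// overhead, so real downloads differ somewhat.

function approxSizeGB(paramsBillions, bitsPerWeight) {
  const bytes = paramsBillions * 1e9 * (bitsPerWeight / 8);
  return bytes / 1e9; // decimal gigabytes
}

const sizes = {
  fp16_7b: approxSizeGB(7, 16), // 7B at full 16-bit weights: ~14 GB
  q4_7b: approxSizeGB(7, 4),    // 7B quantized to 4 bits:     ~3.5 GB
  q4_3b: approxSizeGB(3, 4),    // 3B quantized to 4 bits:     ~1.5 GB
};
```

The gap between roughly 14 GB at full precision and roughly 3.5 GB at 4-bit is exactly the "runs on commodity machines or on your phone" difference the panel is describing.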
Josh: So in ~kind of ~practical engineering terms, and velocity, in the way that we use these tools, what would you expect when models like Haiku and the other smaller ones get more and more investment and a little more capable over the next few years? Paul': I wanna see local models getting more attention and investment. I wanna see Wasm in my browser effectively run models, which, there's projects out there, but they can be better. ~Um, ~and when that happens, there's gonna be a decoupling of, ~like, ~everything needs to go through these big providers, everything needs to go up to the cloud. Who knows how that ~transpi~ transpires into, like, the economics of the situation we're talking about, but there's definitely gonna be a switch where people finally have that moment of, oh, I can just ~like ~use this technology, this AI thing, in my browser, disconnected, on the airplane, without my wifi upgrade. ~Um, ~yeah. Jack: Yeah. Paige: We need more infrastructure to support those smaller models, too. Because right now it's so easy to connect to the OpenAI [00:33:00] API or the Anthropic API or whatever. But we need to make it easier, super easy, to connect to those small models and have them running locally. And I think that's the piece that's still missing: ~it's still, it's a, ~it's a lift to get there, especially if you do wanna have your own private instance running on AWS or something like that, ~or, you know, ~or even just running on your own computer. Keeping it up to date and making it easy to access ~is not,~ is still ~not, ~not as much~ uh, ~a thing. That's what needs to enhance and get better, I think. Jack: And just to circle back to the whole Codeberg thing, right? If there was a vendor I would trust to go and add in, like, okay, in order to run this CI/CD job we need, ~you know, ~this 3B model deployed on my GitHub org, it would be GitHub. Like, GitHub could do that sort of thing at scale, and yeah.
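Paige's "make local models as easy as the OpenAI API" ask is closer than it might sound: Ollama, for example, exposes an OpenAI-compatible endpoint alongside its native API, so switching between a cloud provider and a local model can be mostly a base-URL swap. A hedged sketch, assuming a local Ollama server on its default port; the model name in the comment is illustrative, not prescriptive:

```javascript
// Same chat-completion request shape against a cloud or a local backend.
// The local base URL assumes Ollama's default port (11434); local
// servers generally ignore auth, while cloud ones require a key.

const BASES = {
  openai: "https://api.openai.com/v1",
  local: "http://localhost:11434/v1", // Ollama's OpenAI-compatible API
};

function chatEndpointFor(provider) {
  return `${BASES[provider]}/chat/completions`;
}

async function chat(provider, model, content, apiKey = "") {
  const res = await fetch(chatEndpointFor(provider), {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...(apiKey ? { Authorization: `Bearer ${apiKey}` } : {}),
    },
    body: JSON.stringify({ model, messages: [{ role: "user", content }] }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

// chat("local", "llama3.2:3b", "Hello!"); // needs a running Ollama instance
```

The lift Paige describes is everything around this call: pulling the model, keeping it updated, and keeping the server running, which is the part that still isn't turnkey.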
Josh: ~Well, ~that's a great transition to our more targeted hot takes section. ~Before we go to the hot takes, we're gonna take a quick break and be right back. So I'm just gonna go ahead and restart. That was a weird phrasing. I'm actually not sure how to phrase this. My apologies. Well, that's a great transition over to Hot Takes.~ Before we go onto that section, we will have a quick break and we'll be right back. ~Pause. This episode is brought to you by Log Rocket. Log Rocket provides AI first session, replay, and [00:34:00] analytics, which surface the UX Ag technical issues impacting user experiences. Start understanding where your users are struggling by trying it for fudge. Ah, I was so excited to get it right this time. ~This episode is brought to you by LogRocket. LogRocket provides AI-first session replay and analytics, which surface the UX and technical issues impacting user experiences. Start understanding where users are struggling by trying it for free@logrocket.com. Alrighty, let's talk hot takes. Each episode, we like to have every person come in with a few hot takes, things they wanna rant about, maybe about development, maybe about life. Who wants to start us off? Jack: All right, I'll jump into it, 'cause apparently my hot take ~is, is, ~is controversial. So, we're a big Subaru family. We all have Subies, trust and rely on Subies to the end of the, you know. I live in Portland, Oregon, so it's not unusual, ~like, ~but ~it, ~it is a thing. And apparently, and my wife just told me about this yesterday, all the car manufacturers now are doing these, like, subscriptions. So [00:35:00] now, in order to get~ uh, ~any kind of extra stuff ~on my, ~on ~a, ~a late-model Subie that I don't have, but ~I mean, ~imagine I bought a new one, if I wanted to get Remote Connect, which will allow me to ~change my, or~ have a digital key. ~Uh.
~A guest driver, vehicle status, last park location, lock and unlock, remote start, climate settings, blah, blah, blah, blah, blah, Tesla Supercharger access and plug-and-charge. I would need to pay a monthly fee to Subaru to have that. And it's just like, ~it was like, uh,~ really? Does everything have to be ~a, ~a monthly charge? ~I mean, ~come on. Paige: Yes. I am right there with you, Jack. ~We started talking about this before we recorded this episode, and it is,~ they are killing us with subscriptions. Everything is a subscription now. It's all monthly, and I understand, from a business perspective, it smooths out your revenue across the year. It's better than one big hit when you initially sell the product, or even a yearly renewal. But at the same time, everything wants a subscription. The New York Times wants me to sign up with a $1 subscription to read one article. Subaru wants me to sign up for a year-long subscription to [00:36:00] start it remotely from the kitchen while I'm standing there looking at it. ~I mean, it's just, yeah. Can we. ~Maybe can we think about usage models? I know the people who have proposed that for SaaS in particular are like, yeah, I like being charged when I am actually using something, not just because I own this thing. Let's think about usage-model subscriptions. Maybe that is the way of the future. I'm happy to pay for things I'm using, but if I don't look at Netflix for a month, maybe Netflix just doesn't charge me for it until I open the app again. Is that an impossible dream? Jack: That's the impossible dream, apparently. Josh: Yes, it is. Paige, what's your hot take? Paige: ~I mean, I'm kind of right there with Jack. It's like subscriptions are outta control and we need to figure out a better model. But other than that, I mean, it just seems like. There is. So I, ~I don't know. I'm just overwhelmed by everything that's happening in technology right now. I just can't keep up with it.
~You know, ~I saw earlier this week that,~ um,~ I think it was Anthropic ~made, MCP~ gave MCP its own AI foundation. So they're, ~you know, they're~ giving back to the community in that way. But it's just like, how does anybody keep up with everything that's coming out? There are so many tools, so many new AI [00:37:00] things to try, to learn, to look at. I just feel a little overwhelmed by ~the, ~the speed of change right now. And I know it won't slow down, because that's impossible, but I wish I could speed up so that I could keep up better. Jack: I just want, like, less things in my life that are clearly just trying to use me for my money. ~You know, I just, ~I just feel like every interaction I have is all like, just how can I ~just like ~take a little bit more money from you? And it's Paige: Selling stuff to you? Jack: Like, seriously. Paige: ~Mm-hmm.~ Jack: ~you know?~ Josh: ~You know, ~there are services you can pay for that will tell you which services you're Paige: I'll block that from you. Jack: ~Right. ~Just, it's infinite, ~you know? ~I just want to go through a day not realizing,~ like,~ just breathing ~in, ~in the space is costing me, ~you know, like, ~I don't know, 20 bucks a day worth of subscription services. Like, really? Paul': ~Um, ~ Josh: Paul, what's your hot take? Paul': ~uh, ~mine's much more tangential, but it's still technology related. ~Um, ~the use of gen AI in general,~ um,~ we're so, like, horse-blinded on ~like ~what using AI means and what value it brings to our lives. [00:38:00] When I hop into ChatGPT, I'm either trying to Google something or I'm trying to, like, explore some idea that I just want somebody to go back and forth on, and I'm like,~ this,~ this to me feels like the Google Sora of, ~like ~the ability ~to, ~to generate video. ~Like ~Sora is fun. It's great. Generating video is wild. We can do so much of that. And it's in a little dinky app.
So I feel like there's ways that we can start using AI and evangelizing this to regular people that's not just, hey, look at this cool thing, but something that will change their lives, and we're not doing that enough. ~Um, ~and if you're building a product, go have somebody who's not a developer use it, like, chat with it. If your mom uses Excel, show her how to download a CSV and point it to, like, your Claude Desktop. ~Like ~there's just, like, so many little things that can help people. And the one that I want to champion is ~like.~ AI's great at data. We can take tons of data and we can read it. Guess what, this is like a public call to service. You know what data is publicly available as of, like, the past [00:39:00] month, two months, three months? All of the Epstein files, and everybody should download them, if you're interested, and figure out how to put these people in jail. It is like our collective responsibility to,~ uh,~ email ~your, ~your congressional,~ uh,~ bodies, to email your representatives, like, find things. I found things in Massachusetts about people that aren't yet brought up in the news that are completely damning, and not all the files are even released yet. So this is a great way for you to learn how to use AI, whether it be in Claude Desktop, Codex, whatever, with free and publicly available information, for a mission that is very important to us as a society right now. It's, like, very important. ~Um, ~there you go. Jack: Hot. Hot take. Josh: There you go. It's a very hot take. Excellent. ~Well, ~I have one last hot take,~ uh,~ before I open back up to the room, which is completely unrelated to everything we've talked about. Oh, excellent. Oxlint just released their type-aware linting alpha, which I'm very excited about. I realize this may not be so exciting in the context of AI, but in general it's very, [00:40:00] very interesting stuff. They have ~kind of ~the Rust-based linter, oxlint.
And then underneath that they have the Go-based linter backend, not the full linter, called tsgolint. So they're able to get type-aware lint rules, like no-floating-promises, for TypeScript developers and JavaScript developers at a shockingly fast rate, like 10x the kind of normal JavaScript-speed ESLint. So, very excited about that. Yeah. Jack: It's just, it's a weird thing though that, like, that's VoidZero, and Vite, and then there was Vitest or whatever, right? And then they went Oxc, and then there's this whole, like, offshoot of Oxc stuff. And it's just weird from a branding perspective. I would just expect that they would have, like, the V-lint, ~you know, ~or something like that. ~Right?~ Like,~ why,~ why Ox? ~Like, I, ~I Paul': Their brand is like a chiseled face, ~you know, ~with, like, sharp edges for their words. They, Jack: Chad face. Josh: Chad Lint. Jack: Chad Lint. Paige: Oh my God. Josh: ~Error messages or Chad phrasing. Great. ~I do think this is very good for AI, though. ~Uh, ~a lot of folks have started to really feel the pain of AI-written code, and the more tools we can get that automatically detect [00:41:00] and fix, especially at high speed, the issues, ~that ~that's good. Jack: ~you know, I, ~I just did a big project with AI. I did a lot of vibe coding, but ~you know, if you, ~if you use it for a POC, just to kind of feel out the space, and then, as you want to go and productionalize it, ~like ~go through it and review it and, ~you know, ~nail it down, ~you can, ~you can create a really good hyper-accelerated flow for a mid-to-senior engineer who knows what they're doing. ~you know, ~It can be like, okay, ~I know, ~I know what good code looks like. I know what bad code looks like. I know the structures that I want. And ~if you, ~if you prompt a tool the right way, you can get there a lot faster. Paige: That is the key though, Jack.
You have to know what you're doing so that you can steer it away from the pitfalls that it will inevitably drift towards if you don't. Jack: Yeah, tons of asynchronous imports. Claude, like, loves async imports, man. It'll just litter those. It's like, oh, I don't wanna bring in that package just for no reason. An import right in the middle of ~like ~some [00:42:00] massive function. You're like, okay, that's great. Josh: Someone should write a lint rule. Jack: ~Yeah, ~Yeah. No, none of that. Josh: ~we got a minute left. Who's got one more hug? Take.~ Jack: ~Upgrade your React~ Paige: ~Yes.~ Jack: ~if you're on RSC, upgrade your React 'cause they, they found a big CVE that is a non-trivial CVE. It is like a big vulnerability. People can run anything on your hosts. It's not good. So yeah, watch out.~ Paige: ~And there's patches everywhere. It will not take, it should not take long. Just please do it.~ Jack: ~Up. Upgrade your ~ ~versions, you'd be fine.~ Paul': ~Or go find~ ~some out in the wild and then contact the company and be like, Hey.~ Jack: ~By the way, you have a huge vulnerability.~ Josh: ~Believe the platform for pay the bug bounty that ~ Paige: ~you don't have ~ Paul': ~I, there's been so many times I've hit a company up and I'm like, so I like dumped like this table. And they're just like, so how'd you do it? I'm like, what's the bug bounty? And then they never hit you back and you were like, well, well, I got a coffee to go pick up. Like, ~ Jack: ~Yeah. ~ ~Right. ~ Paul': ~because you didn't Great. Well, that's all the time we have. This was a lovely discussion. We talked about AI in the context of companies and investments, Bun getting acquired, Shai-Hulud, Zig moving off of GitHub, we talked about some hot takes, and we talked about, before that even, IBM's CEO's statements about the kinda long-term sustainability of the industry.~ Paige, Jack, Paul, this was an absolute pleasure. Thank you for joining me, and signing off, everyone.
This has been Pod Rocket. Josh: Cheers. Paul': Bye. Jack: Happy to. Paige: Thanks.