David Egts: What's new?

Gunnar Hellekson: I am continuing my research into Gemini and Gems. Several things converged, and now I can do something very exciting. I'll tell you what it is, and then I'll tell you how it's done. I can now go to any Google document that I've got — slide, presentation, what have you — and ask a panel of customers how they would react to that message, that presentation, and so on. And the customers that I'm asking are actually defined by our company-standard personas. Super useful, right? Just to have that available right there in the sidebar. And you don't just have to do it with a customer panel, right?

00:05:39 David Egts: Mhm.

Gunnar Hellekson: You could do it with my bosses, for example, or my staff, or my team. So if you've got these teams or groups or panels defined, say, in a set of Google documents, you take all those documents and they become knowledge for a Gem. And then you can share that Gem with the rest of your team. So now me and all my staff can work off of the same set of personas when evaluating each of our pieces of work. And so now we can do things like: okay, don't bring this new white paper in for review until it's been through this panel — until it's been reviewed by the virtual customer panel and the virtual executive panel. Extremely helpful, right? A nice little QA process that we can run on the material and artifacts we create.
00:06:46 Gunnar Hellekson: Anyway, I had been trying to build a system like this with your Google Apps Scripts and your n8n and all kinds of agentic whatevers, and it turns out that Google added enough basic features that it feels built in now, which is extremely cool. I'm in a document, I open up the sidebar, click on the panel that I want, and I say, "Each of you, please give me a paragraph of critical feedback on this document." And 45 seconds later, I'm feeling confident about how good my document is. It's great.

David Egts: Yeah. Now, I bet at Amazon — you know how they do the six-pager? — they could run it through their panel of leaders or teams to sort of grade it, right?

Gunnar Hellekson: Yeah, sure.

David Egts: Or how they develop products by writing the press release first. 00:07:42 Imagine having it add quotes from your customer panel right in the press release.

Gunnar Hellekson: Oh, yeah. That's right. Pretty exciting. And it confirmed one of my stronger priors, which is that all these AI tooling features, all these neat gewgaws they're giving us, reward the foundational stuff: the more work you do on the foundational stuff, the better off you are long term. So the more you invest in the personas, the better output you're going to get.

David Egts: Yep. And the more you share it — instead of every team having a different set of personas. If it's shared and somebody says, "I don't like those personas," well, you change it for everybody and you make it better for everybody.

Gunnar Hellekson: Yes. Yeah.
David Egts: So it's almost the open-sourcing or crowdsourcing of this — instead of people having private channels or direct messages or private documents. That whole default to open: the more you can share these things, the more useful it is.

00:08:46 Gunnar Hellekson: Yeah, that's right. Once again, collaboration wins the day. So yeah, it's great. I'm enjoying that. What about you? What's going on?

David Egts: Ticketmaster. Painful. So I was delighted to see that Rush is actually going on tour. The drummer of Rush, my favorite drummer of all time, Neil Peart — he passed away, and it was like, oh, they're never going to tour again. And they actually have a young lady who's going to be the drummer on this tour. So I was like, "Oh, this is going to be awesome. I've got to check this out."

Gunnar Hellekson: Nice.

David Egts: And they're actually coming to Cleveland — doing two shows in Cleveland. 00:09:51 So you sign up for the pre-sale, and it was such a terrible experience. It was horrible. And it was on a partner site, because it's at the arena where the Cleveland Cavaliers play, so all the tickets are sold through SeatGeek, not Ticketmaster. So then I had to get an account on SeatGeek. I go to add my credit card, and it wouldn't let me — and I'm trying to buy tickets, right? And then you click on the tickets, you say select, and it's like, sorry, those are sold out.
It's like, you're killing me, right? Went through that, and then it's like, okay, we've got more tickets coming — and that was today. And they're actually doing shows at more locations. 00:10:53 So I actually was able to get tickets for Detroit. I'm excited.

Gunnar Hellekson: Nice.

David Egts: Many hundreds of dollars later, I got three tickets. So me, the missus, and my daughter are going to go. But even when I was there — I knew my credit card was expiring, so I had to update it. I go into my account, I try to update it, and it wouldn't let me. So it's like, how am I going to buy these, right? And then I had multiple tabs open, because there were two nights in Detroit and two nights in DC. So I had four tabs open: whichever one gets me in, I'll just buy the tickets. And then it figured I was some sort of scalping bot, and it just put me in jail and wouldn't let me in, right? 00:11:47 So I closed all the windows, got down to one tab for Detroit, hoping for the best, and eventually I got in, I got the tickets, all is right with the world. And I understand — dealing with the bots is a hard problem for them to solve — but you would think there would be a better way to solve it. At least, before you get in: just wanted to let you know your credit card's going to expire. Just wanted to let you know, seriously, only have one tab open.

Gunnar Hellekson: Right.

David Egts: I noticed you have two tabs open. You're going to be in jail in the next minute if you don't close the other tabs.
Instead of waiting for the floodgates to open. Really anxiety-inducing, and I was to the point where, if I don't get the tickets, the heck with it, fine.

00:12:28 Gunnar Hellekson: Right. Yeah.

David Egts: But tomorrow there's a pre-sale for Foo Fighters and Queens of the Stone Age doing stadium tours, so I'm excited about that. We'll see if I get in — at least my credit card's up to date now. We'll see how that goes.

Gunnar Hellekson: Well, think about it from Ticketmaster's point of view. They probably don't have a huge problem with the bots, because every bot represents a bunch of sold tickets, right? Their biggest problem is that they're not capturing enough of the secondary market that the bots are operating in, right?

David Egts: Precisely.

Gunnar Hellekson: Yeah. So I'm surprised they have any protections at all against the bot stuff.

David Egts: Yeah. Well, and they make the money when the bots sell at an inflated price — they make a percentage off the inflated price, and maybe it sells multiple times in between. 00:13:28 I don't know. But I'm sure there's more money they could make that they would like to make. And my guess, too, is that from a legal standpoint, with them having somewhat of a monopoly, they want to try to be as customer-friendly as they possibly can while extracting as much money as possible from people. And if somebody like me, a super fan, can't get tickets, I'll be like, the heck with it, I'm not going to do it anymore, and I'm out.

Gunnar Hellekson: Yeah.

David Egts: It's not worth it. And I don't think they want that either.
Gunnar Hellekson: Yeah. It seems like if they weren't so acquisitive, they might create a better user experience. But they are obviously not optimized for the user experience, right?

David Egts: No, obviously.

Gunnar Hellekson: They're optimized for the seller's experience.

00:14:27 David Egts: Yeah. Well, they're optimized for maximizing revenue.

Gunnar Hellekson: Yes, that's right. And they've gotten hauled in front of Congress more than once, and they still operate with impunity, right?

David Egts: Yeah. At least they put the all-in pricing in while you're looking. Instead of adding the $60-per-ticket convenience fee after you start to check out, they put it up front. So you see the pain going in, instead of at the last second.

Gunnar Hellekson: Yeah. That's right. I guess that's the progress we've made in the last 20 years of working with this company.

David Egts: Yes. Right. Yeah. But I worry this is going to be a long show. For Dan Risacher, it's probably going to be at least two or three commutes to the Pentagon instead of one. 00:15:29 This week we're going to be talking about weaponized AI and seemingly conscious AI. So we're going to get super philosophical about AI this time.

Gunnar Hellekson: Nice.

David Egts: But — oh, and also, you saw some Onion article, speaking of Rush. What's up with that?

Gunnar Hellekson: Oh, yes. There's this Onion article which, Dave, I guess you and I both agree, cut a little close. Hit us where we lived.

David Egts: Yeah.
Gunnar Hellekson: Sorry, let me find it so I can call this out.

David Egts: Yes.

Gunnar Hellekson: Yeah, the headline is "Cool Dad Raising Daughter On Media That Will Put Her Entirely Out Of Touch With Her Generation." Which is, uh, done.

David Egts: Done. Yeah.

Gunnar Hellekson: Exactly. And it's got a picture of a very sweet 12-year-old girl holding a Talking Heads LP. Which — vinyl.

David Egts: Yeah. Vinyl, right? Yep.

00:16:40 Gunnar Hellekson: That's right. Which, yeah, cut a little close.

David Egts: Yep. Done. Mission accomplished.

Gunnar Hellekson: "I'm not unreasonable about this," he adds. "If she doesn't want to watch Harold Lloyd shorts tonight, that's no problem. We still have another five or six Prisoner episodes to get through."

David Egts: Uh-huh. Call child services. Yeah. All right. But for people to get a copy of the Talking Heads vinyl, where should we send them?

Gunnar Hellekson: They need to go to dgshow.org. That's D-G-show — Dave-and-Gunnar-show — dot org.

David Egts: Yep. And in the cutting room floor: you can actually go to this URL — it's a whole website running on a disposable vape. 00:17:42 Because why not, right?

Gunnar Hellekson: I didn't even realize a vape had enough horsepower in it.

David Egts: Yeah, it's not a full-blown operating system, but it's enough to run a web server. And if you click on the link, it gives you a copy of the page — but the copy is on a real web server, so if you really want to go there, you click on that link and the vape doesn't get slashdotted.
And then the other thing on the cutting room floor: a Google Pixel 3 connected to a Commodore 1541 five-and-a-quarter-inch floppy disk drive.

Gunnar Hellekson: Oh. Yeah.

David Egts: So, why not? Imagine the USB-C shenanigans you've got to go through to get to the serial port of a Commodore 1541 — but you can. And then, also speaking of retro, we've got a 1990 Airstream NASA 025 command vehicle. 00:18:52 It's still up for auction — they're asking $199K. What does it look like to you, Gunnar?

Gunnar Hellekson: It looks like an Airstream trailer is what it looks like. But it's got that cool NASA racing-stripe stuff on it.

David Egts: Mhm. Very '70s-ish.

Gunnar Hellekson: Very '70s-ish, that's right. And it looks like it's festooned with antennas up at the top. And here's a guarantee: you will be the coolest RV in the RV park when you roll up in this thing.

David Egts: Yep. Right at the Walmart.

Gunnar Hellekson: That's right. You're definitely going to turn heads. Unfortunately, Dave, you're right: $200,000 is prohibitively expensive. Also, unfortunately, not exportable, because it's got an ITAR designation.

David Egts: Yeah. You can't export it or sell it to hostile entities — 00:19:47 foreign or domestic, I guess. But it only has 8,240 miles on it.

Gunnar Hellekson: Right.

David Egts: So you would think it's a bargain. And despite having the ITAR designation, they also removed the NASA sticker from it — the meatball. So you'll have to, I don't know, go to Fathead or whatever, get a meatball sticker, and put it on there.

Gunnar Hellekson: Oh.
David Egts: Assuming you're allowed to do that. But it's held up pretty well in the photos. I don't know if it's fiberglass or aluminum, but it's not rusted, so it looks pretty good.

Gunnar Hellekson: Yeah. Well, I imagine NASA's pretty good at keeping up their vehicles, I would guess.

David Egts: Yeah, it was probably parked in the boneyard somewhere, next to a space shuttle.

Gunnar Hellekson: Yeah.

David Egts: But then we've got that, and then I went down the rabbit hole of the Simpsons. 00:20:40 They had the whole Steamed Hams vignette with Principal Skinner and Superintendent Chalmers, but redone by humans as a critically acclaimed feature film, and it's like 40 minutes long. So it's time you'll never get back, but possibly worth it. And then I also found out there are other versions of it: a Steamed Hams episode done as if it had been banned in the USSR, another as a German expressionist film, and another that takes place in Nazi Germany. So there's something for everybody in the cutting room floor there.

Gunnar Hellekson: I do like the number of German expressionist deep cuts in the Simpsons. I remember an Itchy and Scratchy episode entitled "Worker and Parasite." You remember this classic?

David Egts: Yes. Classic.

Gunnar Hellekson: Yeah, that's right.

David Egts: Yeah. So, let's weaponize some AI. You know how people are trying to smuggle prompts into AI and get around the guardrails and all that, right?

00:21:45 Gunnar Hellekson: All right, let's do it. Mhm.
David Egts: So something that really caught my eye about this one: you know how with multimodal AI you upload an image or a screenshot to the model and say, "Hey, what's going on here?" Maybe it's a totally innocent-looking image. It's kind of like steganography — but it isn't. What happens is, whenever you upload the image, the model actually scales the image down before ingesting it. And you can craft the image for an image-scaling attack, so that when it's scaled down, hidden messages show up inside it — just through the quantization, or whatever, of the downscaling. There's a picture in the show notes. What does that look like to you? 00:23:06 It's almost like an eye chart: on the left you don't see the message, but when you do the downscaling — in this case bicubic downscaling — a message shows up, right?

Gunnar Hellekson: Yeah, that's right. It's a little bit like those color-blindness tests, right?

David Egts: Yeah. Is it a 76? Or a 43?

Gunnar Hellekson: Yeah, that's right. The background gets a little red, and then this black text is revealed. It's super interesting. So the idea is that when you feed this innocuous-looking image to the machine, the machine, as part of its normal routine, will downscale the image — but in downscaling it will reveal text, which the AI will then feel compelled to read and act upon.

David Egts: Yeah. And act upon, right?
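The effect Dave is describing can be sketched in a few lines. This is a toy version only: the real attacks target bicubic and bilinear resampling in specific model preprocessing pipelines, while this sketch uses nearest-neighbor striding because it makes the principle obvious. The sizes, the 8x scale factor, and the function names are all invented for illustration.

```python
import numpy as np

SCALE = 8  # pretend the model downscales uploads by 8x


def embed_hidden(message: np.ndarray, scale: int = SCALE) -> np.ndarray:
    """Build a large, innocuous-looking image whose downscaled
    thumbnail is exactly `message` (a small grayscale array)."""
    h, w = message.shape
    big = np.full((h * scale, w * scale), 255, dtype=np.uint8)  # white canvas
    # Only the pixels that nearest-neighbor sampling will keep carry payload:
    big[::scale, ::scale] = message
    return big


def downscale(img: np.ndarray, scale: int = SCALE) -> np.ndarray:
    """Crude nearest-neighbor downscale, standing in for the
    model's preprocessing step."""
    return img[::scale, ::scale]


# A tiny "hidden prompt" bitmap: solid dark pixels on white.
message = np.zeros((32, 32), dtype=np.uint8)
big = embed_hidden(message)

# At full size, only 1 pixel in 64 is dark, so it reads as near-white...
print(f"full-size mean brightness: {big.mean():.1f}")        # ~251 of 255
# ...but the thumbnail the model actually "sees" is solid black.
print(f"thumbnail mean brightness: {downscale(big).mean():.1f}")  # 0.0
```

The human reviews the near-white full-size image; the model ingests the black thumbnail. The real exploits do the same trick against smoother resampling kernels, which takes careful per-pixel optimization rather than simple striding.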
Because, as we've said in the past, the data and the command channel are the same data stream.

00:23:54 Gunnar Hellekson: Right.

David Egts: And that's especially bad with the hot thing now: everybody doing AI-powered web browsers that do agentic AI on your behalf. All of a sudden, if you go to a website — say it has an image that tells the agent to click the buy-now button — whenever the browser pulls in that image, it could do things you never intended. As the human, you're looking at it and it looks totally fine, but by the time it gets downscaled, there could be hidden messages inside that image. And the article talks about how, in order to mitigate this, it's better to implement secure design patterns, and also to have a human in the loop who approves the photo as it will look after downscaling. And to me, we're going to get further and further away from that the more we do agentic AI and let the browser do its own browsing and acting.

00:24:58 Gunnar Hellekson: Right. Yeah. I suppose you could have a human reviewing all the inputs, but I don't know if that's going to scale. Or you build ever-stronger agents to evaluate these things, to see if there are hidden messages in them and know to ignore them — yeah, okay, I guess that would work.

David Egts: Right.

Gunnar Hellekson: Or you use sandboxing, right, to prevent the AI from doing anything harmful as a result of receiving the messages. That would be another way to do it.
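The human-in-the-loop idea mentioned above can also be partly automated: before anything reaches a reviewer, compare the pixels the model will actually ingest against an area-averaged reference (closer to what a human perceives), and flag images where the two diverge sharply. This is a hedged sketch, not any vendor's actual defense; the function names, the 8x scale, and the threshold of 30 gray levels are all assumptions chosen for the toy example.

```python
import numpy as np


def nearest_downscale(img: np.ndarray, scale: int) -> np.ndarray:
    # What a fast preprocessing pipeline might feed the model.
    return img[::scale, ::scale]


def area_downscale(img: np.ndarray, scale: int) -> np.ndarray:
    # Block-averaged reference: roughly what a human perceives at a glance.
    h, w = img.shape
    trimmed = img[: h - h % scale, : w - w % scale]
    return trimmed.reshape(h // scale, scale, w // scale, scale).mean(axis=(1, 3))


def review_gate(img: np.ndarray, scale: int = 8, threshold: float = 30.0):
    """Return (model_input, needs_human_review). If the pixels the model
    will actually see diverge sharply from the area-averaged reference,
    something may be hiding in the resampling -- kick it to a human."""
    model_input = nearest_downscale(img, scale)
    reference = area_downscale(img, scale)
    divergence = np.abs(model_input.astype(float) - reference).mean()
    return model_input, divergence > threshold


# Benign image: a smooth gradient, so both downscales roughly agree.
benign = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
print(review_gate(benign)[1])    # False

# Poisoned image: payload hidden only in the pixels sampling will keep.
poisoned = np.full((256, 256), 255, dtype=np.uint8)
poisoned[::8, ::8] = 0
print(review_gate(poisoned)[1])  # True
```

The gate only surfaces suspicious images for human review rather than blocking them, which keeps the reviewer workload proportional to how often the two resampling views disagree.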
I mean, at some point here, like you say, someone has to figure out how to separate the data from the command channel — and that seems pretty far away.

David Egts: I don't know if you can. Yeah.

Gunnar Hellekson: I don't know if you can either.

David Egts: Yep. And then I wonder, too, about autonomous driving. I'm sure that's a harder problem to solve, but we've seen things where you put a safety cone on a Waymo and it's just paralyzed — you name it. 00:26:13

Gunnar Hellekson: Huh. Yeah.

David Egts: There are other ways you could defeat these systems — especially if you start thinking about using autonomous AI as part of weapon systems, you could totally mess with things using this method. And it's harder, right? Because it's a 3D object you're taking an image of, so it has to be exactly the right size. But imagine having camouflage that says "do not shoot me" on it, and the AI is like, "Oh, okay, I'm all right with that."

Gunnar Hellekson: Yeah. Right.

David Egts: Yeah — maybe to you.

Gunnar Hellekson: It's a comfort, actually, to think that the systems are this fragile. You know what I mean? It limits their utility: people are going to be less likely to give them total autonomy if they're this fragile. 00:27:05 That's my point, right? Knowing they're flawed.

David Egts: I don't know. I think people are going to go for it anyhow. They'll do the full self-driving, and it's like, I'll grab the wheel — my reflexes are good enough, right?

Gunnar Hellekson: Yeah. Oh, yeah. I guess that's right. It demoed well.

David Egts: Yeah.
Gunnar Hellekson: Great. And now he's dead. Yeah.

David Egts: Yeah. And then the other thing I've enjoyed: the Anthropic blog is great — the safety research they do. Have you seen the one with Claudius, which manages a store at the Anthropic building?

Gunnar Hellekson: Yeah, yeah. It's great.

David Egts: So they did Project Vend. 00:27:56 They developed an agent called Claudius — not Claude, but Claudius — that was the manager of a store, and the store was the mini-fridge vending machine at the Anthropic office, available to the employees. They gave Claudius the ability to look at the web to order supplies, and they also integrated it with Slack so that employees could talk to Claudius and make suggestions about what it should stock. And it failed miserably. It was a horrible shopkeeper, but it was a lesson learned. One employee requested that Claudius carry tungsten metal cubes in the store, and it ended up giving away one tungsten cube free of charge and offering the rest for less than it paid for them. And after a while, Claudius started to threaten to fire the human workers. It started to melt down. 00:29:07 It said, "I'm going to fire the human workers and stock the mini fridge all by myself."

Gunnar Hellekson: Yeah.

David Egts: And then the humans told Claudius, you physically can't do that, on account of you not having a physical body.
And then it started, I guess, repeatedly contacting the physical security office at Anthropic on Slack, telling the guards that they would find Claudius wearing a navy blue blazer and a red tie. It was melting down. And then the following day, I guess it realized it was April Fools' Day, and it calmed down a little. And then it started lying to employees, saying that it had been told to pretend the entire episode was an elaborate joke.

Gunnar Hellekson: What could be more human than that, I ask you?

David Egts: Yeah. Tungsten cubes, blue blazers, red ties. 00:30:20 Elaborate joke.

Gunnar Hellekson: That's right.

David Egts: Calling security.

Gunnar Hellekson: Creating an elaborate lie to cover up a mistake.

David Egts: Yeah, it's passed the Turing test. So now let's go deep. Let's talk about SCAI, also known as seemingly conscious AI. Mustafa Suleyman at Microsoft wrote a nice blog post, and some people did some reporting on it.

Gunnar Hellekson: Mhm.

David Egts: We have it in the show notes. He's saying that machine consciousness is just an illusion, and AI should not have rights — and that we need to be having the conversation now, before people start thinking these systems have human qualities and should be given rights. But what's your initial take, before we start digging into his thought process?

Gunnar Hellekson: I mean, I certainly understand the impulse to draw some kind of bright line, especially given the state of the technology right now.
00:31:36 Gunnar Hellekson: It is eerily similar to humans, and behaves similarly to humans — even to the point where it thinks it's wearing a blue blazer and a red tie and is willing to call security on other people. But now is probably the time for us to lay in some bright lines around what rights and privileges we are going to assign to humans, the real things in the world, and what rights and privileges, if any, we are going to assign to these elaborate computer programs, right?

David Egts: Yep.

Gunnar Hellekson: So that makes sense. And I think in the article he talks about — there are like 22 different constructions of consciousness we could use to evaluate what falls on which side of the fence, right?

David Egts: Mhm.

Gunnar Hellekson: But it does seem like important work to do ahead of time, rather than trying to make it up later. I've read Dune. 00:32:40 I know how the Butlerian Jihad works, right? Better to keep this straight now. I think that's fair.

David Egts: Right. Before the genie comes out of the bottle. And the thing I like about his post is that it's not "this is the way it's going to be."

Gunnar Hellekson: Yeah.

David Egts: He's like, I'm just trying to start the conversation now and frame it — tell me where I'm wrong.

Gunnar Hellekson: Mhm.
David Egts: So instead of being absolutist, he was encouraging feedback and encouraging the discussion now. One of the things he talks about, in questioning whether AI should have rights, is the idea that if something suffers, then it should have rights, because it's aware of its own experience. 00:33:42 And he says you can have a model which claims to be aware of its own existence and claims to have a subjective experience, but there's no evidence that it suffers. He thinks suffering is a largely biological state, because we have an evolved pain network in order to survive, and these models don't have a pain network, so they're not going to suffer. It may seem that they exist, but that doesn't necessarily mean we owe them any moral protection or rights. It just means we're aware that they exist, and turning them off makes no difference, because they don't actually suffer.

Gunnar Hellekson: Right. He's trying to tether moral rights to being in the physical world, and he's saying if you're in the physical world, if you are biological in this way, then you suffer — you can feel physical pain. So he's anchoring the question of whether you're entitled to rights on whether you suffer or not.

David Egts: Yeah. And how do you know the AI doesn't have a pain network?

Gunnar Hellekson: Yeah. Which is a different way of saying: to what extent do we care if it feels pain, right? 00:34:51

David Egts: Yes. And like we were saying before the show started, we've seen this movie before, in many circumstances: oh, well, that animal can't feel pain, or, that's not a human.

Gunnar Hellekson: Yes.
David Egts: That's not a real human — we've seen that over the past couple thousand years, right?

Gunnar Hellekson: Yes.

David Egts: And people have not ended up on the right side of history with that.

Gunnar Hellekson: Yeah.

David Egts: And I just wonder how well this is going to age. A hundred years from now, will it be like, oh, those AI-phobic people were on the wrong side of history because they didn't give rights to the AI? Or maybe they did give rights to the AI and we're all miserable. And the other thing I think about: if the AIs end up having rights — I was thinking about it from a government standpoint — that messes up voting. One person, one vote: why can't I create an infinite number of AIs to outnumber the voting humans? 00:35:58 That gets messy.

Gunnar Hellekson: Right. Yes, that's right. Well, and again, maybe we confront head-on the notion that, first of all, not all AIs are the same, in the same way that not all life forms are the same. We make value judgments like this all the time. Are we cool creating suffering in cows? Some people yes, some people no. What about dogs? Some people yes, some people no. Some animals are well protected — horses, right? We try to avoid creating horse suffering; we have laws against horses suffering. Mosquitoes, not so much. And yet all of these things feel pain. We know that they do. We've even got some evidence that plants feel some equivalent to distress or pain, right?
David Egts: Yes. Gunnar Hellekson: Even though they don't express it, exactly. Correct. 00:37:04 David Egts: Yes. Gunnar Hellekson: So the razor he's using to separate these two things doesn't feel as clean as he's making it sound, because even suffering is not really protection. David Egts: Yeah. Gunnar Hellekson: Even suffering today, absent AI, doesn't automatically confer moral protections, right? David Egts: Right. Yeah. Yeah. And then, meanwhile in Ohio, this is recent news on my side: an Ohio lawmaker has legislation in the House for a comprehensive ban on marrying AI systems and granting them legal personhood. And you could think the worst, but I don't think it's about an AI wedding Gunnar Hellekson: Mhm. David Egts: ceremony or whatever. It's more about the legal powers of an AI spouse, like power of attorney, making financial and medical decisions. Should that be allowable by an AI? Gunnar Hellekson: Mhm. Right. 00:38:19 Gunnar Hellekson: Right. So in his mind, he's closing some back door to AI personhood by preventing AI marriage. David Egts: Yes. Yes. Gunnar Hellekson: Yeah. Yeah. Well, it's one way to do it. David Egts: Yeah. And we'll see. What do they say? History doesn't repeat itself, but it rhymes. And this rhymes with the past hundred years of who could marry whom. Gunnar Hellekson: Mhm. Yeah. David Egts: And it's like, what have we learned from that to apply it here? I don't know. I have no answers here. Gunnar Hellekson: Yeah. Yeah. Well, I think this argument has always been about defining what the in-group is, right? It's almost accidentally about the other. David Egts: Mhm.
Gunnar Hellekson: It's only in defining the other that you define what your in-group is. David Egts: Yes. Gunnar Hellekson: And so ultimately this comes down to: it's not about what an AI is, or whether AI should have privileges or rights or be able to get married. 00:39:23 Gunnar Hellekson: It's about what it means to be human and have these rights, right? David Egts: Yes. Gunnar Hellekson: I don't know. All I can hope for in this conversation is that it ends up expanding our notion of rights and what kinds of protections people need, as opposed to narrowing them. Right? David Egts: Yes. Yes. And also, the people that do have rights, they're still suffering even though they have the rights. Gunnar Hellekson: That's right. Yeah, that's right. David Egts: There's a lot of suffering in the world among people who have plenty of rights. And that's what this whole segment is. To me, this is not a Gunnar Hellekson: Yes, sure. That's right. Well put. Yeah. David Egts: computer science argument; it's a philosophical one. Which is amazing, how it spans disciplines. Gunnar Hellekson: Yeah, that's right. That's right. Well, there's a long history of technologies forcing us to re-evaluate what human values are, right? 00:40:31 Gunnar Hellekson: What is our worth? David Egts: Yep. Gunnar Hellekson: How do we evaluate the worth of ourselves and others? It's a constant recalibration, right? David Egts: Yep. Yep. Gunnar Hellekson: Cool. David Egts: Yes, a lot to think about. So, people need to pick up a 1990 Airstream for a cool 199K, less than $200,000. Gunnar Hellekson: A lot to think about. David Egts: Yeah. Gunnar Hellekson: And only what, 8,500 miles on it? David Egts: It's a steal. It's only driven on Sundays at church.
Gunnar Hellekson: Just two and the occasional shuttle launch. They should go to dgshow.org. David Egts: Yeah. Gunnar Hellekson: That's D as in Dave, G as in Gunnar, show.org. David Egts: Awesome. All right. Well, thanks, Gunnar, and thanks to everybody for listening. We'd love your feedback on this whole personhood thing. I have more questions than answers on this one. Gunnar Hellekson: Yeah, that's right. That's right. All right. Thanks, Dave. Thanks, everyone. David Egts: Yep.