Carolyn Ford: [00:00:00] Tech Transforms explores how technology is reshaping our world, particularly at the intersection of government innovation and human needs. In each episode, I talk with some of the most influential voices in technology, uncovering how they're leveraging innovation to solve complex challenges and improve the way we live. This episode is sponsored by Owl Cyber Defense, leaders in secure data transfer solutions for critical networks. Well, welcome to Tech Transforms. I'm your host, Carolyn Ford, and today we're stirring the final potion in our Chills and Thrills series. You're in for a treat, and maybe a trick or two. If you're watching us on YouTube, you're already seeing one of the tricks. Joining me today are two of my favorite AI conjurers: Dave Egts, or Mr. X, as we came up with his evil name, I believe it was last year. Dave is the Public Sector Field CTO at MuleSoft. And Laura Klebanow, founder and Chief Storyteller Officer for Show and Tell. So here's how today works. We have each brought a secret AI ingredient to our cauldron. Think of it like show and tell with a dark twist. And then we're gonna stir in what creeps us out about AI's shadowy side. So first we're gonna start with our favorite AI project, or something that we've worked on recently. So I'm gonna kick us off, you guys. Well, first of all, welcome, Laura and Dave, Mr. X. Laura Klebanow: Hello. Hey. Dave Egts: Glad to be here. Carolyn Ford: I've been very excited for this episode, and thank you for playing along. So the first thing I'm gonna do is talk us through my project. [00:02:00] I got the inspiration for this project by listening to a spot on NPR. I don't remember which show, but an AI expert came on and showed how quickly he could create music, and it was actually really good music. You would not have known that it was not, you know, top-of-the-charts legit music.
Had he not done it right there on the spot, putting the prompts in live, you wouldn't have believed it. So I was like, okay, well, I will do some theme music for our Chills and Thrills series. And listeners, you heard what I created at the beginning of this episode. So I went to suno.com, and we'll put this in the show notes, and I'm going to share my screen so you can see my process. Going to Suno, I typed in what I [00:03:00] wanted, a description of the music that I wanted. It was way too long, so I had to trim it down, so I spent time trimming it down. Suno does not help you with the description; it didn't suggest anything like I expected it to, the way some of the other tools do. So what I ended up doing was popping over to ChatGPT and saying, okay, here's the style I want, help me get the right description. It kept giving me this weird, circusy music. I wanted it to be Danny Elfman. I wanted it to nail the theme song for Beetlejuice, and for Wednesday, the TV series, and The Nightmare Before Christmas. So when I saw what it sent me back, I was just like, this is not Danny Elfman. I walked away. In fact, I walked away for a whole day. I came [00:04:00] back. You can see the final prompt that I put in. This is what I got. Dave Egts: Yep. All right, here we go. I thought that was great. Carolyn Ford: I loved that. Laura Klebanow: That was so whimsical. Carolyn Ford: Okay, that's what I wanted, right? I wanted the whimsical, and so it got there. It's still no Beetlejuice theme song, guys. Come on. Speaker 5: Right. Laura Klebanow: Well, this is really a lot better than you're giving it credit for. Carolyn Ford: Can I just say, oh my. You know what? Thank you, Laura. Because when I came back to it, I was like, okay, I can work with this. However, let me just throw one more thing in there. I went and searched for free music. Mm-hmm. And I found very similar stuff, right? Mm-hmm. So the AI is generating the same thing.
I found somebody on Suno that had used a Danny Elfman-like prompt as well, [00:05:00] and it was very similar music. Speaker 5: Yeah. Carolyn Ford: I'm gonna change my background now, my filter, so now I'm somebody new for Halloween. I'm gonna kick it over to you, Laura. Share your project with us. Laura Klebanow: You know, I felt pretty strongly that what might be fun to talk about would be something potentially highly relevant to lots of people. And for me, that's image creation. I wanted to talk about that because I also think it's where you can have a lot of fun, and it also has a lot of potential practical applications, which to me is always kind of the point. How can you be imaginative while also using it for practical applications? Not that I'm saying that using it to create a Danny Elfman-inspired theme song isn't practical by any means. I do believe it's [00:06:00] practical. There you go. So my idea was to use that practical application of image creation to show folks how imaginative you can be with it. Just to give some examples of how much you can stretch tools like Nano Banana, I had the tool imagine images of the three of us as characters in Saved by the Bell. Speaker 6: All right. Laura Klebanow: I'm excited because I think it worked out really well. Carolyn Ford: Okay, I'm excited too. Laura Klebanow: Oh my gosh. Speaker 3: Wow. Laura Klebanow: Let me know if you can see it; I'm gonna put it up. Carolyn Ford: Look at Dave. Speaker 3: Yeah. I love it. Carolyn Ford: I do too. That's so fun. Dave Egts: I know. I think I had that shirt. Carolyn Ford: I think so. Look [00:07:00] at Carolyn. Look at all of us, the hair, the clothes. I think Saved by the Bell was nineties, yes? Are we looking at nineties fashion here? Laura Klebanow: Yes. Speaker 5: Mm-hmm. Laura Klebanow: Is this the AI Club? It doesn't really look like us here.
You're the president. This one didn't really end up looking like us. Yeah, that one doesn't look like us, but I do love that it's the AI Club, and that it's "more intelligence gets saved by the bell." It's so stupid, but I think that playing, learning the tools by playing, is probably my favorite part of AI. Yeah. And so, like I said, I wanted to take a fun concept and use a very practical tool to show how creative you can be with it. I think using reference and source material that is extremely well articulated is [00:08:00] obviously a key component of success. I won't say that I went to the level of saying we need to revive Dave's shirt, you know, or that I know Carolyn used to wear kilts to school. Carolyn Ford: I'm pretty sure I have that skirt. Laura Klebanow: The amount of specificity is, I think, really key to success there. But, you know, like I said, I'm always excited about showing fun. Playing is the way to learn, and I think we can all agree about that. Carolyn Ford: I agree. What tools did you use to create it? Laura Klebanow: Gemini and Nano Banana. But then I also, because I'm an overachiever, and this is gonna be clear from my next piece, used the Career Planner gem, because I've been thinking about all of the people who [00:09:00] are trying to be competitive in the job market, especially new job entrants. And just a quick shout-out to a recent offer that I made that I wanted to mention on this podcast, which is, and I'm sure Carolyn and Dave, you'd both be willing to do this too, that for any new workforce entrants, or anyone in the workforce who's making a change or looking to have a conversation, I think between the three of us, we could definitely help.
And so I just wanted to underline that offer that I made recently on my own LinkedIn. But I was thinking about job entrants, new workforce entrants. So I used the Career Planner gem to imagine that we're at high school graduation in Saved by the Bell, the three of us, Speaker 5: Mm-hmm. Laura Klebanow: and to have it help us plan our future careers, with information about who we are today. Speaker 3: Mm-hmm. Laura Klebanow: I have to tell you, [00:10:00] first of all, it was extremely biased, because it knew who we are now. Mm-hmm. But it was very accurate in terms of how the tools and skills that we had in high school would've translated to who we became today. And in this Career Planner, they have an alternative career tool. Mm-hmm. So I asked it what the alternative careers for the three of us would be. Dave was an audio/AV engineer and sound engineer. Speaker 6: All right. Laura Klebanow: And Carolyn, you were a high-stakes negotiator. Carolyn Ford: Oh, okay. I can totally see Dave as a sound engineer, 'cause Dave, you're somewhat of, not somewhat, you're a musician. Dave Egts: Yeah, yeah. Carolyn Ford: I could see you wanting to dabble in that. Dave Egts: Yeah. Well, hi-fi action. Yeah. Carolyn Ford: Right. What about you, Laura? What was your alternate? Laura Klebanow: [00:11:00] Children's book author. Carolyn Ford: No way. I love that one. And we're gonna get to the children's book soon. Yes. So, "gem," I'm not familiar with that tool. What's the URL for it? Laura Klebanow: Oh, gems are the, um... Carolyn Ford: Oh, in Gemini? Laura Klebanow: Yes, in Gemini. Carolyn Ford: Okay. Yeah. Mm-hmm. So, yeah, I use Gemini too, and I bounced between Gemini, ChatGPT, and Suno for my project. I moved through all of them.
And I do love that offer. Absolutely; I'm happy to talk to anybody. And here we go: Dave is, uh, I don't know, I don't know their names, an officer from CHiPs, right? Or, um, Reno 911. Gotcha. Oh yeah. All right, so Dave, what was your project? You're gonna put us all to shame, I know. Dave Egts: No, no, no. I love the creativity, and, Laura, you were talking about taking the [00:12:00] tools and using them in ways they may not have been intended. Right. And that's sort of what I did with the Storybook gem. So inside of Gemini, there's a gem to do children's stories, and I'm like, okay, that's cute, you know, if you like to do children's stories and things like that. And I was like, I wonder if I could use this. How could I apply this to work? How could I do it for work purposes? So I was actually planning for a customer meeting, and in Gemini I did the deep research. I got a giant report on this particular customer, about how they apply to public sector and how MuleSoft could help. Then I took that, and we had a prep call that I had the transcript for. I put all that into the Storybook gem, and it created a storybook that put the customer as the [00:13:00] hero in the story, and, you know, how MuleSoft comes along and helps 'em and everything. It turned out really, really well. And it's a great way to get a customer's attention. And, you know, I wonder if it's the kind of thing where once everybody starts using it, you sort of get desensitized to it, right? Sort of like the first time you heard a NotebookLM audio podcast, it blew your mind, and now it's like your uncle sharing a summary with you, and you're like, oh, you know, I've heard this before. Right? Yeah.
So anyhow, for us, and I'll share my screen, I was like, okay, well, thinking about the Halloween theme and everything, I was thinking about the three of us in Scooby-Doo. So what if we did a Scooby-Doo-style story that was based on Tech Transforms? Carolyn Ford: And Dave, before we go to the Scooby-Doo story, did you open your meeting with the storybook? Did you read it? Dave Egts: For the one that I did, I [00:14:00] just kept it to myself and shared it with people internally. And then there was another one that I shared with a customer, and they responded to me right away. It blew their minds, because they were the hero of the story and their coworkers were in the story. And yeah, everybody was all ripped, even though their coworkers are, like, way older, you know; in the story they're like 20 years old, washboard stomach and everything. And I'm like, yeah, that's how they look in real life. Carolyn Ford: I love that. I mean, what a great way to get people's attention, right? And a fun way. So okay, now back to us, the three of us. The ghost. Dave Egts: Yeah. So, you know, this is maybe like a five-minute thing. I won't play the whole thing, but we could put the link in the show notes and people could listen to the story. Speaker 3: Yes. Dave Egts: But what I did for the prompt was, it was much more on the prompt side, where I [00:15:00] originally uploaded images of us, and it didn't work out well. Mm-hmm. So instead of doing the headshots, I said, you know, in the style of Scooby-Doo, where Carolyn is Daphne, and on down the line for all of us, go ahead and generate this.
Dave Egts: Generate this story, and also start off with the Mystery Machine, but change "Mystery Machine" to "Tech Transforms" on the van. Right. So here we are. And so I can play the story. Speaker 7: The Case of the Grumpy Ghost. The Tech Transforms van, a colorful vehicle that looked strangely familiar, pulled up to a spooky, creaky old house. "Well, gang," said Carolyn, the host of the popular podcast, "this looks like the perfect place to record our Halloween special on the scary side of AI." "Like, are you sure about this?" asked Dave. He clutched a box of podcast snacks to his [00:16:00] chest. "This place is giving me the heebie-jeebies." Carolyn adjusted her scarf and smiled. "Don't worry, Dave. There's no such thing as ghosts." Inside, the house was covered in a thick layer of dust. Cobwebs draped from the chandeliers like ghostly decorations. "Jinkies!" Laura exclaimed. "This place is an archaeological dig site. Let's get the equipment set up." Just as Carolyn started the show, a long, spooky moan echoed from the hallway. "Get out," it wailed. Dave yelped "Zoinks!" and leaped into Laura's arms, scattering podcast snacks everywhere. Suddenly, a dusty... Dave Egts: All right, I'll pause it there and leave it up to everybody. Leave the listener hanging. Carolyn Ford: They don't know what's gonna happen. Laura Klebanow: They gotta... Dave Egts: go to the show notes. Yeah. Laura Klebanow: Check that out in the show notes, everybody. This is rich. This is what I mean when I say I did not live up to the rich multimedia [00:17:00] delivery, but hopefully my imagination made up for it. And Carolyn, as you say, oh yeah, do as well as Dave. He's an ultra user. Carolyn Ford: Well, and honestly, Dave's constantly sending you and me things like, go check this out. And Dave had us look at this storybook. I did a storybook yesterday too.
When my kid, and now my niece, who's 10, were little, I told them this story about them going on an adventure. So I built a storybook too, and I put in quite a bit of detail, but I was kind of general about some of the things, and I was surprised at what the storybook came back with: Layla goes on her adventure, and she meets Al the alligator, and she meets animals along the way, and then they have, you [00:18:00] know, some adversity, some challenges, and each one of the animals is able to help them through different challenges. Speaker 3: Mm-hmm. Carolyn Ford: You guys, I don't know how I feel about this, because I did not tell Gemini what the challenges were or what each of the characters' special skills were, and it got more than one right. In fact, I didn't even tell it that Al the alligator's name was Al, that's how original I am, and it knew; it just named him Al. And then it knew their special talents, and I'm like, oh, I thought I was so creative and original. It pretty much nailed it. Now, I will say I wish I could print the storybook, 'cause I really wanna print it out for my niece, but you can't do that yet. Dave Egts: Yeah, and that's the thing. I think the sharing is hard. Mm-hmm. Like, saving it as a PDF. You would think there would be an upsell opportunity to have it go [00:19:00] to Google Photos, and it prints, and you can buy a print, you know? But someday, I'm sure. Carolyn Ford: Yeah. I mean, you can't even save it as a PDF. All you can do is share it electronically with other people. Right. Still cool, and still super fun. Laura Klebanow: Yeah. But it is a little frustrating that you can't do that. Yeah. 'Cause, I mean, this is one of those things where there are probably reasons, right?
Like, I always tend to think, in that type of scenario, there are always reasons why we're not able to do it, not able to just print, or print to PDF, or whatever it is, right? And I don't know what those reasons would be, but I still think it's worth registering that feedback, because I would've shared several projects long ago. Carolyn Ford: Absolutely. Yeah. And like I said, I want to print that storybook, 'cause it is a story that I've been telling for 27 years. Yeah, great. And it has, like, important meaning, [00:20:00] right? Yeah, exactly. All right, well, now that we've brewed up some cool stuff, let's talk about the more chilling side of AI. And to kick us off: if AI were a monster, which one would it be and why? Dave, let's start with you. Dave Egts: Yeah, so a lot of times you're thinking, oh, it's gonna be the Terminator or whatever, and I was thinking about this, and then I was like, oh, it's that robot from the Alien movies. And I'm like, what was that robot's name? I had to look it up. His name is David 8. I don't know if you know who that is? Carolyn Ford: I don't remember. It's been so long since I've watched Alien, I don't even remember a robot in it. Dave Egts: In the later ones there is a robot. His name is David 8, played by Michael Fassbender. Carolyn Ford: Oh, that's why I haven't seen those. They're too scary. Yeah. Dave Egts: Yeah. So, and I put a link in the show notes where [00:21:00] you can see the scene, but it was one of my favorite parts of that movie, where the human comes in, and David 8 is trying to build more aliens. Mm-hmm. And, you know, he's a robot and everything.
And then you have the eggs that open up, and then there's the facehugger that pops out of them, right? Mm-hmm. And David 8 goes over and says to the human, come on over here, take a look, something to see, to get the person to look in there and have the... oh yeah. Speaker 3: Yes. Dave Egts: So, yeah. And that's where I'm just thinking: is it that siren song of, you know, claiming to have our best interest in mind and everything? Carolyn Ford: Yeah. All right, Laura, what would your monster be? Laura Klebanow: Yeah, I mean, I'm again not gonna live up to that level of sophistication, 'cause for me it fully is Terminators and Skynet. I mean, from the moment in Terminator 2 where the [00:22:00] hand is a blade... Speaker 6: Mm-hmm. Laura Klebanow: That, to me, is the scariest thing. But I imagine it being more like, have either of you seen The Devil's Advocate? Speaker 5: Yes. Such a good one. Laura Klebanow: Yeah. Where Al Pacino is the devil, and he can find the, he's like, what's your flaw, you know, he talks about people's character flaws and how to manipulate them. And I think, over time, especially with younger generations, maybe Gen Z or younger, you read about them sharing so much with AI, and then also turning to it for advice. Mm-hmm. You know, some of the more emotional sides of AI. I think that's how it will, in a meta way, influence the micro-decisions that people make and, ultimately, their fate. Yeah. Asking for advice, like, [00:23:00] who's to say what it's giving you? It doesn't know anything, right? But it feels so... there's nothing like that dopamine hit of asking it a question and it answers so fast. Carolyn Ford: You're so smart and pretty and popular. Yeah. Dave Egts: Right.
Carolyn Ford: It, like, feeds the delusion. Yeah. Yes, absolutely. Dave Egts: That's a great one. Yeah. Carolyn Ford: For me, if it's a monster, it's always gotta be HAL. That's my original AI, you know, from 2001: A Space Odyssey. Yes. Some of the reports that I've seen, like, I saw a Gartner report for 2024 that said battling malinformation will cost enterprises more than $500 billion. Do you guys think AI is fueling that malinformation monster? Dave Egts: Yeah, it's scaling it out, right? People can use it for, you know, just misinformation, and imagine [00:24:00] how tailored it can be for an individual as well. So instead of having a tweet that you're blasting out to thousands of people, one size fits all, you can really target the misinformation for a particular demographic, right? And use the Facebook ad algorithm or whatever, based on whatever demographic, what location, what age range, all of that, and you can really hone it in. Laura Klebanow: It must be. But I think it's the reasoning piece: either you know enough about how to use it and you just don't care Speaker 6: Mm-hmm. Laura Klebanow: about the ethical implications of it, or you're not that sophisticated a user, and I don't say that in any critical way. I mean, lots of us are using it all the time for everything. We are perpetuating the misinformation that it's providing us. Carolyn Ford: What [00:25:00] fears about AI do you think are overblown? So, Dave, I'm just gonna do a plug for the Dave and Gunnar Show. You guys, if you haven't listened to the Dave and Gunnar Show, it is one of the weirdest, funniest... like, Dave and Gunnar's humor, I'm laughing out loud as I listen to that podcast. But they also bring in the weirdness about technology, and just the most bizarre things.
So on one of your recent episodes, Dave, you and Gunnar talk about spare human body parts, which reminded me of the movie The Island. Mm-hmm. And emotion-tracking smart glasses. Will you talk about both of them? Dave Egts: Yeah. So, you know, Gunnar and I have been like, I'll see weird stuff on the internet, and I'll save it, and then we talk about it in the podcast. And we've been talking a lot about organoids, brain organoids, where [00:26:00] you can get these brain cells, grow them in a Petri dish, and then all of a sudden you can plug the brain organoids into a computer and have the brain cells actually do computation. That was one part of it. Recently, on episode 272, we talked about not just organoids of individual organs, but actual bodyoids, and the ethics of, like: what if I can clone Laura, right, and build a perfect replica of Laura, but I intentionally damage the development of the clone's brain so it's not functional at all? So imagine having a room full of Lauras on slabs, waiting for Laura to need a new liver, or a kidney, or a heart, or whatever. And it was interesting, 'cause Gunnar and I were [00:27:00] talking about the ethics behind that: is that a good thing or a bad thing, in terms of having organs be available? And instead of it being a simple, easy answer, we spent a lot of time thinking about it, like, no, it's not exactly clear. And then the other part is, what about cloning animals? So instead of raising the animals, slaughtering them and everything,
I could do bodyoids of cows that are not, you know, aware or conscious, and they're just sitting there growing, and whenever you need your meat, you've got your meat, right? And is that ethical? Is it unethical? We left a lot of questions there, but it was a fun dialogue that got pretty deep. Carolyn Ford: So in The Island, they clone full people, and they're conscious, I mean, you know, so there's that. But I don't know who put [00:28:00] this article in, the one where Microsoft's AI chief says machines... Dave Egts: Yeah, I did that. Carolyn Ford: Okay. So talk about this, Dave. Dave Egts: Yeah. And this is one of the things that keeps me up at night. I start thinking about one of the things in this article, and I was like, whoa, this is getting really... I wonder how well this interview is gonna age. So this was an interview with Mustafa Suleyman, Microsoft's AI chief, saying, oh, well, we have all these AIs, but they don't have rights; they can't feel, they can't do this, they can't do that, so they don't count. Right? And I go back in time to, like, oh, you know, women can't vote, or these people can't vote, you know, it's like they're not... Carolyn Ford: Yeah. George Orwell. Animal Farm. Dave Egts: Yeah. And to me it's like, are we gonna reach a point where the AI... you know, the thing you [00:29:00] talked about is, okay, maybe they could be conscious, but they don't suffer, they don't really suffer, they can't feel pain. And is that Speaker 5: ... Dave Egts: right? And also, is that what unlocks the ability for you to have some sort of equality, right?
What happens when it reaches a point where the AIs do have rights, and then what does that infinite surplus of AIs cause? Like, okay, now they have the right to vote, but now all of a sudden I have more AIs than I do people. Speaker 3: Mm-hmm. Dave Egts: So does that change how voting gets done? Mm-hmm. Do they get an equal vote, especially if there are more AIs? So anyhow, I would read that article through the lens of the past hundred, hundred and fifty years, and the things that we regretted. [00:30:00] But, you know, am I anthropomorphizing the AI now? Maybe, I don't know, and all that. But to me, that's something that keeps me up at night from a rights standpoint: what is that gonna look like when maybe they're smarter than us, right? And does that mean we still deny them rights? Laura Klebanow: Right. So when I was reading about the rights piece that you put in our planning notes, I was thinking about, and this was triggered by that show (you can probably tell a lot of my thoughts are triggered by television shows) Years and Years. Did you see the show Years and Years? No? It's fabulous, and it's sort of like Children of Men, Speaker 6: Um, Laura Klebanow: if you like that movie with Clive Owen. But there's a character, a teenage daughter, who wants to incorporate more body mods, tech [00:31:00] mods, into her body. So at what point does she, do you, stop? At what point are you more technology than you are, you know, flesh?
The Google AI tools really work well together, so I can switch over to my Google Doc, our show notes, and it keeps my screen up, which, if I tried to do that on another platform, I'd have to start juggling things. So I will say, I use multiple tools. I use the Google tools; I'm even using Jasper, ChatGPT, Perplexity. But I do like the way Google integrates the different tools. It's nice. Mm-hmm. So let's shift gears a little bit and talk about creativity as an antidote, [00:32:00] and you kind of kicked us off this way, Laura. Can creative AI projects help educate and disarm the public about how AI works? There are a lot of people that still won't touch AI. They think it's the great devil, right? So is creativity the gateway drug for the haters? Laura Klebanow: I think that, I mean, people fear what they don't understand, right? They don't know how to, and admittedly, I think we would all admit this about ourselves, we don't know, when we get that dopamine hit, how to set guardrails for ourselves. Carolyn Ford: Yeah. Laura Klebanow: So I think it's all of those things. But what I love about it... I don't know that creativity is the antidote, but certainly it is the empowerer of some of [00:33:00] what's possible out of the availability of these tools. I mean, I think you can tell also from Dave's earlier comment about using the tools for things they weren't meant to be used for. In concept, a tool like Canvas in Gemini might have been conceptualized to write code, but I'm not gonna be doing that, one, because it's not my job, and two, because there can be malware in there, right? Yeah. But I mean, if you're someone like me, and I think a lot of people are like this, you go to museums and you look at
paintings, and you think, oh, I would just give anything to have that talent. Like, how talented must the person have been who created this? Mm-hmm. And if there's any sliver of being able to bring that home and [00:34:00] be imaginative... almost to, I don't wanna say it, I mean, I'm thinking of this in a dark way, but almost to anesthetize us. Because honestly, at this point, it's too late; whatever was gonna make this more ethical is probably out of the gate already. And so, like I said, someone left the gate open and the ethics are out the window already. So, and this is a dark perspective, but if you can use it for the power of good, which is creativity, which is being able to imagine other worlds and different dimensions, I think that's the upside. Speaker 5: Yeah. Laura Klebanow: Any artist or creator or technician worth their salt learns the tools when a new tool becomes available. Like, you're not gonna tell me that if a new type of paint came out, Michelangelo wouldn't use it. Speaker 3: Yes. Laura Klebanow: I mean, he would at least check it out. Dave Egts: Yeah. Carolyn Ford: What do you think, Dave? About it taking our jobs? [00:35:00] Dave Egts: Yeah, and I've been hearing similar things. Like, I'll talk to people at universities, where you would think they would be very, very progressive in terms of adopting and embracing AI. And Ohio State actually is doing it; they're retooling their curriculum to put AI in everything, to have everybody be AI-augmented. Whereas at other universities there are tenured professors who are fixed in their ways and are not gonna change. And that becomes a problem, because they're denying it, right?
And to me, it's like Laura said: this new paint comes out, you gotta try it out, you gotta be able to use it. Carolyn Ford: I love that analogy. Yeah. Dave Egts: Yeah. And this goes back to CGI and everything else. Think of the person who described themselves as a visual set designer, doing clay or model sets for the [00:36:00] Godzilla set, right? Those people re-skilled. There are some that clung to the old way, and that's what they're gonna do. But there are others who re-skilled themselves to learn it, taking that artistic vision of what the special effects are and computer-generating them. Mm-hmm. And I think that's something that we could do. And so, on replacing jobs, the way I've been telling people is: for any career, look at the jobs to be done in that job description, figure out which of those things AI could easily replicate, and then use AI to do that part of the job. And with the parts of the job that free up, what are the new things you can learn to make yourself more valuable? But there may come a point where the jobs to be done for that job description can be completely replaced by AI, and then all of a sudden you become the next phone operator or elevator operator, where it's like, yeah, that [00:37:00] job category is gone, right? Yeah. So you have to be thinking about the jobs to be done, what can be replaced, and at what point you actually need to consider changing job categories to be marketable. Carolyn Ford: That's right. And even in my job, you know, as a storyteller-marketer: a recent report came out.
I wanted to do a quick comparison of where my company supported this new cybersecurity report, and I threw it into an AI tool, and I'm not gonna say which one. It spit out a bunch of garbage. However, it sounded so authoritative and so right. Yeah. But were I not an expert in my field, I would have shared that garbage. And I'm like, where the f did you get this? Yeah. Yes. It was so, so off, but I only knew it was off because I'm an expert in my field. Totally. Laura Klebanow: I mean, [00:38:00] the expertise is, in a lot of ways, why I worry so much about new entrants to the workforce, because your expertise protects you from making mistakes like that. But even on that career guide Gem that I used, I said, can you imagine how the three of us would've met, you know, based on what I shared about Bayside High and we're graduating and, you know, whatever. And it said, you know, Carolyn's team just had a massive Department of Defense contract win, and they need Dave to come in to give some advice on how to structure the deployment, and they need Laura to come in to help write a communications plan about it. And I was like, wow, you're really smart. But sorry, not to take forever to get to the point: what I realized [00:39:00] in having that conversation is that there was some value to having the interaction, because it said, if you wanna be really good at your job and you're an entry-level marketing person, you could do this. And I'm like, okay. So let's just say, there are a lot of smart people out there. You're 21, 22 years old. You can read something that says, here's what a person who is great at this job would do, and have some sense of which parts to take and which to leave behind.
You know? And even that helps. I swear to you both, and again, I'm having this obsession with the new workforce entrants, I used to hate networking. It was excruciating to show up in person. If I'd had a tool where I could say, what would a person who's good at networking do in this situation? Seriously, then you can really fake it till you make it. Carolyn Ford: That's right. [00:40:00] AI allows us to become specialists, because it can do what you said, Dave: take all that stuff we don't wanna do. Mm-hmm. And do it, and then we can specialize. Laura Klebanow: I just saw something about how, as people get younger, they're more willing to have, or could envision themselves having, a romantic relationship with AI. Right. That's happening. That's a thing. Mm-hmm. How are you gonna know that's not weird if you've never experienced something else? Yeah, that's right. I mean, I just don't think it's fair. When it comes to fairness, I just worry a little. We have the benefit of wisdom, age, bad choices, good choices. I think the ethics piece of it is super important. We can't have kids asking ChatGPT if it's the right time to take their life. You know, I mean, I'm like, no. Yeah. We need to teach better, do better. But I do think it segues into [00:41:00] your point in the show planning that we did, about hate speech and the overall detriment to society that AI can have. And I would be remiss if I didn't bring up that this was completely predictable. When I worked, at the very beginning of my career, at the Holocaust Museum in DC, there was quite a lot of fear around how the internet just spreads hate speech. Well, now, you know. So I would totally be remiss in not bringing that up, 'cause I think it's a concern for sure.
And it is a mirror, right? Mm-hmm. So if what it's seeing is ugly, it's gonna reflect that back. Carolyn Ford: Yeah. Laura Klebanow: And that gives me the chills. Carolyn Ford: So I was gonna ask you both for one AI treat, what's your hope for AI, and one AI trick, what's your fear? So Laura, the hate speech. And that brings it back to what you said earlier, Dave: [00:42:00] AI not only can generate awful content, it can tailor it and target it right down to the individual, at ludicrous speed, to quote Spaceballs. So, all right. Do you wanna go with your treat and trick, Laura? I feel like you already kind of hit on it. Laura Klebanow: Oh, my treat is, um, you know, the power for every person with a computer and an internet connection to imagine a day in the life of Jean-Michel Basquiat or Andy Warhol. Just the treat of being able to live in a different, mm, a parallel universe. Live in a parallel universe, because escapism is worthwhile. Speaker 6: Mm. Laura Klebanow: I think it's totally underrated. Carolyn Ford: I love our altered states, which is what the three of us have been doing for the last hour [00:43:00] with our filters. Yes. Dave Egts: Yep. Carolyn Ford: All right. What about you, Dave? What's one treat and one trick? Dave Egts: So for me, the hope is, and Gunnar talks about this on the podcast, abundance. There's just so much more that you can do than ever before, and hopefully people are harnessing it for the powers of good. The fear is all the downsides. Again, that's something that Gunnar and I talk about on the podcast all the time: hey, this cool new technology comes out, okay, what are the downsides? And so, thinking about that for kids.
Imagine giving something like that storybook Gem to a young niece or nephew and saying, hey, try this out, and all the cool things they can come up with, right? Awesome. And then imagine them using it for bullying. [00:44:00] And so, in the same way that, you know, I wish we did more of this. I remember when I was in school, I had home ec to learn how to be an adult, right? Which I don't know if we do as much anymore in school. It's like we almost need to do that with AI: teach kids, here's a cool thing you could do with a storybook, do something cool. And then, okay, but also let's look at the harm it could do, and let's teach them what the outcomes are if somebody does something that is very harmful. Yeah. Speaker 3: And, Dave Egts: and show them that, oh my gosh, this isn't just kids joking around. This could be really bad and lead to very bad consequences. So hopefully the education follows. We gotta be educating people, and start doing that at a young age. Um, and for the older people who are set in their ways, who believe whatever is on TV, because it's on TV, it has to be true, right? Or, I saw it on the internet, it has to be [00:45:00] true, right? How do you teach them? Speaker 3: Yeah. Dave Egts: How do I identify what propaganda looks like, what falsehood looks like? So, yeah. Mm-hmm. Carolyn Ford: What both of you said dovetails perfectly into my treat, my hope. One of my favorite authors is an Irish philosopher, John O'Donohue. In one of his books, Anam Cara, which means soul friend, by the way, he talks about, it's just giving me chills thinking about it, 'cause I'm imagining myself standing
In my beautiful Ireland, and he says, find your beautiful. Hold onto that. And you know, life can be really hard, and I think AI, to your point, [00:46:00] Dave, can help get the garbage outta the way so we really can just hold on to our beautiful. I mean, look at what we're doing right now. You two are my beautiful right now. Aw. And both of you bring me so much joy. And we get to do this because of technology. We've never met face-to-face in real life, Dave, which is the weirdest thing to me, because we've been friends for years. Speaker 3: Mm-hmm. Carolyn Ford: In fact, I wouldn't be friends with you without Dave. That's right. Dave got us all together. Laura Klebanow: MasterCard. Speaker 5: Yeah. Laura Klebanow: He's an engineer of relationships too. No, I mean, I think, what do we have? It's our imaginations and our ability to share that with other people. Carolyn Ford: That's right. Laura Klebanow: That's really it. Carolyn Ford: Yeah. And my fear, I guess, is [00:47:00] exactly the same fear that you both have. Rather than hold onto our beautiful, it's easier sometimes to go down, right? And we find the dark, and AI can, man, it can take you dark. It really can. So I guess that's my fear. So, okay: if you could bring one piece of AI-powered tech into a haunted house, what would you use to survive the night? Dave Egts: Probably an EMP generator, right? Electromagnetic pulse, so I could blast whatever and make it shut off. Carolyn Ford: Yeah. Yeah. What about you, Laura? Laura Klebanow: Yeah, of course. Um, I mean, I'll take any of them, 'cause I'm probably just gonna hide in the corner and curl up and try to disappear into a different world. Let's be honest, I'm a lover, not a fighter, so I'd just like something to talk to. Carolyn Ford: I know, you guys, I would just take you guys. I don't need AI.
I would take you guys. Dave Egts: All right, there's a comic book about that, or a storybook about that, that people should check [00:48:00] out. Carolyn Ford: Yeah, that's right. It's in the show notes. All right. What's an AI urban legend you hear a lot but wish you could debunk once and for all? I'm gonna start with you this time, Laura. Laura Klebanow: Just that everyone, at least students, are gonna use it to do all their homework. That's basically saying, hey kids, you don't take your schoolwork seriously. It doesn't give them any benefit of the doubt. Carolyn Ford: Yeah. Laura Klebanow: I mean, that's not, let it go, let it go. Also, okay, so what if they used it? What if a student used ChatGPT to give them some ideas for something? Like, relax. Dave Egts: You know, to me there's the whole AI-stealing-jobs thing, like, oh, we gotta deny it, right? And I also think there's hyperbole on both ends of the spectrum: oh, it's utopia, it's dystopia. I think the reality is somewhere in the middle, where it's gonna be a little bit of both. Carolyn Ford: I agree. And the job-stealing one, [00:49:00] yeah. Jobs are gonna change, just like they always have. Look at the Industrial Revolution, right? Speaker 5: Right. Carolyn Ford: It's up to us to adapt. You know, it's one of those things, I used to be a school teacher. Mm-hmm. There's a fun fact you probably didn't know: eighth grade English. So one of my most powerful tools was: adapt. Speaker 3: Mm-hmm. Carolyn Ford: Monitor, adjust, adapt. For anyone looking to start experimenting with AI in their organizations, what's one small step they can take today, Dave?
Dave Egts: Uh, I think it's just: get started, and start using the tools. And have a conversation with it. Don't treat it like a search engine, where you ask for an answer and then you close the tab and move on. Keep having a conversation with it, a longer conversation: whoa, what can you do? Tell me more about this. Elaborate. Yeah. And I think that's [00:50:00] how you get some ideas and brainstorm. For a lot of the people that I mentor, you know, I could give them all the advice, but I'd be like, that sounds like a great homework assignment for you to brainstorm with an AI, and teach them that this is something they can do on demand. Think about, and Carolyn, you said this before, your personal board of advisors. I'm sure for the longest time it was a hundred percent human. Speaker 6: Yep. Dave Egts: But my guess is that there's some percentage now that is AI. Speaker 6: That's right. Dave Egts: And each of them may have a different persona that you have them play. Carolyn Ford: Well, and you've taught me that you can set that up in, uh, it's NotebookLM or Gemini? One of 'em. Dave Egts: Oh, NotebookLM. Yeah. Carolyn Ford: Yeah, NotebookLM. There you go. So set up your red team, set up your different personas, and have 'em red-team your ideas. Speaker 3: Yeah. Carolyn Ford: And you know, what you just said: just start. As a young writer, that was [00:51:00] always the hardest thing for me, and I learned to just word-vomit onto the page. I do the same thing with AI. I word-vomit at it and I collaborate with it, and that's how I just start. Dave Egts: Preach. It doesn't have to be perfect. Carolyn Ford: Yeah, that's right. Because you'll never get perfect. Yeah.
As my good friend Erica Pierce says, done is better than perfect, because perfect is never done. Speaker 5: Yes. Carolyn Ford: Yeah. All right, Laura. Laura Klebanow: Yeah, there's a quote from Mad Men about the Greeks having two meanings for the word utopia: eu-topos, meaning the good place, and ou-topos, meaning the place that cannot be. And I think that that is my feeling toward AI. Don't imagine it as [00:52:00] a place that can't be for you. Imagine it as the good place, the place you go to figure things out. It's just your problem-solving studio. Speaker 5: Yeah. Laura Klebanow: Um, it's the place you go to unpack complex issues and get the input of a robot on how to break those issues down into smaller bites. If you can expect it to be that, and not to solve everything, then that is gonna be the way to start. Carolyn Ford: Yeah. And not to use my boomer voice, but life is what you make it. It is what you make it. That's my grandma. You know, you make your reality. Laura Klebanow: Yes. Carolyn Ford: That's right. All right, where can our listeners find you and follow your work, Dave? Dave Egts: Well, they can find me on LinkedIn, we'll put the link in the show notes, and they can check out the podcast at dgshow.org, which we'll link in the show notes as well. So I'm posting a lot [00:53:00] of, I would say, the more mainstream observations on LinkedIn, and then the deeper cuts go into the podcast, the stuff we really dig into. So yeah, we can talk a lot about that there. Laura Klebanow: Thanks so much for having me.
And, um, folks who've found the conversation valuable, please reach out to me at laura@showtell.io, or you can find me on LinkedIn; the link will be in the show notes. I'd love to hear from anyone who found this episode compelling, and it was a huge honor to be invited. So thank you. Carolyn Ford: Thanks for being my beautiful today, you guys, and many, many days, actually. And thank you, audience, for joining us. Please smash that like button and share this episode. Tech Transforms is produced by Show and Tell. Until next time, stay curious, and keep imagining the future. Thanks for joining us on Tech Transforms. If you enjoyed this episode, please smash that like button and talk about it [00:54:00] with a friend. I'm Carolyn Ford, and this is Tech Transforms.