[00:00:00] Sierra: I feel like AI just shined a giant flashlight on it. It was way faster, but it also really forced us to have some tough conversations, because I think everybody had an idea in their mind of what this product should be, and we were like, yeah, we're gonna go validate it really quick and then we'll move on. And that really was not the case. We need to take a step back and rethink this. We're completely changing our product strategy, and we've kind of scrapped the entire original idea.

Jeff: Welcome to LaunchPod AI, the show from LogRocket where we sit down with top product and digital leaders to talk real, practical ways they're using AI on their teams to move faster and be smarter. Today we're talking with Sierra Hanh Ventrell, Director of Product Management at Apartment List. In this episode, we discuss how they analyze customer interviews with ChatGPT, even uncovering a missed insight that led to a full product strategy reset; how they built an AI Slack bot that flags issues in a new customer-facing tool before they become big problems; and how they transformed sales enablement and automated documentation using NotebookLM and Claude. So here's our episode with Sierra Hanh Ventrell.

Jeff: All right, Sierra, welcome to the show. [00:01:00] Welcome to LaunchPod AI, where we're talking about not just product, but AI as well. It's been a few months since I saw you — like three months, I think, since we had dinner. Good to see you again.

Sierra: Good to see you too, and we're matching today, which is always so fun.

Jeff: Yeah, totally. I'll be honest, part of the idea for this show came from when you and I met in Dallas at dinner. We were talking about AI and what real people are actually doing with it, and I think you finally just flat out asked the table: alright, everyone, what are you all doing for AI? Because we're doing some things, but I wanna hear what others are doing. And the conversation just took off. These dinners are always fun, and I feel like people always learn things, and AI has always been an undercurrent — but you asked this flat-out question that was so great and direct, and people were just so into it.
We had people walking over from other tables to join in, and I came home and talked to our producer, Em, and said, I just had the idea for a great version of this show. I'm so stoked to have you here, 'cause this is the basis of how we came to this idea. So thank you so much for coming on. I don't know if I ever told you that, but I'm doubly thankful.

[00:02:00] Sierra: Yeah, thanks so much. I didn't know that, and I feel so honored now. No, I think this is just really interesting — everybody's struggling with this. So when I go to those events and I'm in person, it's one of my favorite questions to ask. Maybe a little too bluntly, but it's always fun to hear.

Jeff: It was great. It was like the perfect way to ask it. People were so into it, so don't change anything — that was amazing. But yeah, I think this will be fun. We have several use cases that you all over at Apartment List are actually using in your workflows, within the product team and within product itself, using AI. You got permission from the CEO to talk pretty deeply about this stuff, which is great, 'cause not every company goes for that. Very forward-looking, all y'all over there, which I love. So we're gonna just jump right in. There's a bunch of examples we can start with, and maybe we start with this pilot AI Slack bot — I forget exactly what it was called.

Sierra: Yeah, so let me tell you a little bit about Apartment List for those who don't know, just so this will make sense. Apartment List is a rental matchmaking platform. We have both a B2C side with our renters and a B2B side with our [00:03:00] supply partners. Those are big property management companies — large companies that own tons of big apartment complexes around the country, and they're gonna have a ton of different properties. So one of the products that we launched for them over the last few years is actually an AI leasing agent: to pick up calls, book tours, answer questions, all those kinds of things. And as we're rolling this out, we're offering pilots, because adopting AI is really hard. We want them to go through this experience with us and make sure we've got everything set up correctly. So we do these pilots with our partners. And what we found was these pilots are very labor intensive: we're having to monitor a ton of different metrics, we're having to make sure setups are correct. They have three or four different systems that we integrate with, and if the integration is slightly off, or the custom configurations are slightly off, things don't look great. And a lot of times we're going into these pilots and what we call bakeoffs with competitors — we're being put up against somebody else, and they're looking at the data and saying, whoever comes out ahead is gonna win this contract. So we decided that we needed to build something to help our support team during this phase. [00:04:00] In May of this year, we did a hackathon project on what we could do to help them, approaching this problem of: how can we get ahead of some of these technical issues that may come up, and monitor these KPIs more proactively? And so we built a series of agents. It's actually an agent for every property, then a supervisor agent at the property management company level, and then a Slack synthesis agent that takes all the findings from all those agents and sends them to us on Slack. So now we can proactively be looking at:
how are they performing against our benchmark KPIs? How are they performing against the company's previous KPIs, like pre-AI? And how can we synthesize all that together and flag any issues? So now our team is getting regular Slack updates for each of these PMCs, these property management companies, and they're able to see what some of the issues are: are the notifications going through? Is there an integration that's down? Is there a flag we need to look into because some of these numbers aren't looking right? It's just allowed us to move so much faster. [00:05:00]

Jeff: Awesome. And with this kind of thing, you're getting the notifications in Slack, but it sounds like the actual agent architecture lives elsewhere — obviously not in Slack itself — but Slack is a great UI for it, because it's where you already are. I think one of the big things I've seen about AI is: don't expect people to change workflows too heavily. Even though this is internal, you'd think you could just tell people to use some other tool, but people aren't going to — inertia is real. Meeting people where they already are takes away the risk that people just won't use it internally because it's some other interface. I guess my question is, who's building that? Is that engineering? Was it originally a hackathon project by someone on the product team? What did it look like to spin that up, and how heavy or light was the initial project?

Sierra: Yeah, it was actually our amazing engineering team. It was a two-to-three-day hackathon — so, just to tell you, we had a [00:06:00] prototype in two to three days, which is amazing. And our infrastructure is set up for it: it's built off of all the data we already have in BigQuery, which was a pretty easy integration. Then we were able to just use Slack to get that information out to our customer support team. So like you said, showing up where our team already is: our analysts are already in BigQuery, all the data lives there, and we're already talking to our customer service reps in Slack. Why not make this easier? So yeah, it was a two-to-three-day trial, and now it's becoming a full offering.

Jeff: Where does the AI actually sit in it? Because obviously, at your customer's end, the agents that they're using — the ones taking the calls and helping book meetings and tours — that's AI. When it's being processed and coming in on your side, is it interpreting how performance should look using, you know, ChatGPT or Gemini or Claude or something?

Sierra: Yeah. We're talking about an AI that analyzes an AI, basically.

Jeff: We're getting meta really quick.

Sierra: Yeah, really quick. So we have all those conversations that happen with the actual AI leasing agent for partners — [00:07:00] Lea is her name — but what we're talking about, the tool that we built, is an internal agent. It's an agent set up for each property that's reviewing the AI conversations. So we actually have all those different agents internally that, like I said, are then supervised by a supervisor agent, and we're synthesizing all those findings. We're actually looking at how we're performing, and we're doing all that with AI.

Jeff: Awesome.
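For readers who want to picture the hierarchy Sierra describes — a reviewer agent per property, a supervisor agent per property management company (PMC), and a synthesis step that posts to Slack — here is a minimal sketch. All names, metrics, and the webhook URL are illustrative assumptions, not Apartment List's actual code, and plain rule checks stand in for the LLM review each agent would really run over the BigQuery data.

```python
# Illustrative sketch only -- not Apartment List's actual implementation.
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/..."  # placeholder

def review_property(property_id: str, metrics: dict) -> list[str]:
    """Property-level agent: return findings for one property.

    In the real pipeline this would be an LLM reviewing the leasing agent's
    conversations and KPIs pulled from BigQuery; simple rules stand in here.
    """
    findings = []
    if metrics["tour_booking_rate"] < metrics["benchmark_rate"]:
        findings.append(f"{property_id}: tour-booking rate below benchmark")
    if not metrics["notifications_delivered"]:
        findings.append(f"{property_id}: notifications may not be going through")
    return findings

def supervise_pmc(properties: dict[str, dict]) -> list[str]:
    """Supervisor agent: roll findings up across one PMC's properties."""
    findings = []
    for property_id, metrics in properties.items():
        findings.extend(review_property(property_id, metrics))
    return findings

def synthesize_to_slack(pmcs: dict[str, dict[str, dict]]) -> None:
    """Synthesis agent: turn per-PMC findings into one Slack digest."""
    lines = []
    for pmc_id, properties in pmcs.items():
        findings = supervise_pmc(properties)
        lines.append(f"*{pmc_id}*: " + ("; ".join(findings) or "no issues flagged"))
    requests.post(SLACK_WEBHOOK_URL, json={"text": "\n".join(lines)})
```

The design point is the layering: property-level findings roll up into a per-PMC view before anything reaches Slack, which is what keeps the channel from becoming noise.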
Jeff: What have been the biggest problems? I know the engineering team's doing it, so maybe you're not hyper-involved in building there, but what problems have you run into taking it from this conceptualized two-to-three-day hackathon project to actually rolling it out? I assume accuracy has been part of it, and then maybe data safety or something.

Sierra: Accuracy has been some of it — not so much on the data side, but more on the analytics, the comparative side: figuring out what a good threshold is and what a good benchmark is. It's different for every client. That's why we built these three stages of KPI reviews before we flag something. Because an enterprise partner with a hundred thousand apartments all over the country is very different from an SMB with 12 units in their [00:08:00] little building. You have to treat those really differently — they're gonna have really different benchmarks. And then somebody who's already on AI, for example with a competitor, versus somebody who's never used it: very different. How can we make sure we're flagging things that aren't just noise for the customer support team? The last thing we wanna do is make their lives harder — hey, everything looks red, you gotta pay attention to all these things. That's the worst thing we can do. So one of the hardest things we tackled was: what does a good baseline look like, and how do we actually train the AI to be smart about where it needs to put a flag?

Jeff: How did you all go about that? Was it just taking data from existing conversations, making it usable, and putting it in? Did you actually sit there and go through some and say, no, that's bad; no, that's good?

Sierra: We are now, yeah. For the prototype, we originally just looked at, okay, maybe we can do one step — maybe we just compare it to their baseline — and we realized that wasn't really enough. So even in the prototype, the team decided to do a three-staged approach: three tiers of reviews of different [00:09:00] KPIs. And that proved really successful. So now we're trying to get really accurate by training it on: is it different segments? Is it size of property? What are the nuances in the groups? We're actually going through that now. But even coming out of just the two to three days of that hackathon, there were learnings: we thought we could just look at previous baselines, or just compare to a competitor or the client itself, and that didn't work. We really needed those multiple stages of review, and that really helped.

Jeff: Are you using ChatGPT? Are you using Claude? Is it some mix of Gemini and Claude?

Sierra: It's a mix, yeah. I'd have to go back and ask my engineering team for the details.

Jeff: I mean, I think one of the interesting things is that it has all changed so much in the past three to six months that whatever it's built on six months from now is probably gonna be different, because everyone releases new models and new things. I've found it's almost model-agnostic at times, because it's just: what's the best one at that time?

Sierra: Yeah. Yep. Everybody's moving so fast.
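A hypothetical rendering of the three-tier review Sierra mentions. The transcript doesn't spell out the exact tiers, so the comparisons below (the partner's own pre-AI baseline, a segment benchmark, a platform-wide benchmark) and the two-of-three rule are assumptions meant to show the shape of the idea: a metric is flagged only when several independent comparisons agree, so one noisy benchmark can't light the whole dashboard up red.

```python
# Hypothetical three-tier KPI review -- the tiers and the 2-of-3 rule are
# assumptions to illustrate the idea, not Apartment List's actual logic.
from dataclasses import dataclass

@dataclass
class KpiReading:
    value: float               # e.g. this week's answer rate
    own_baseline: float        # the partner's own pre-AI number
    segment_benchmark: float   # enterprise vs. SMB cohort benchmark
    platform_benchmark: float  # benchmark across all partners

def should_flag(r: KpiReading, tolerance: float = 0.10) -> bool:
    """Flag only when the reading fails at least two of the three tiers."""
    tiers_failed = sum([
        r.value < r.own_baseline * (1 - tolerance),
        r.value < r.segment_benchmark * (1 - tolerance),
        r.value < r.platform_benchmark * (1 - tolerance),
    ])
    return tiers_failed >= 2
```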
Jeff: So there's another example — we have a couple more — but real quick before we jump in: if this is useful, and you [00:10:00] found that example as useful as I did, and you're listening on YouTube, give us a subscribe. It's really quick, just give it the click. If you're on Apple Podcasts or Spotify, give us a subscribe, write us a review. It's really important — that's how others can hear about this, that's how we get the word out. It would help us out a lot, we'd appreciate it, and we'll do more of these for you. I do want to go into the next one, which is literally the conversation we had at dinner that made my ears perk up. Internally, I think I literally pulled my phone out clandestinely and was slacking our producer Em on the side to say I'd had a great idea. You're using NotebookLM for sales enablement, but in a way I had just never thought of before. And listeners will know I love NotebookLM. So, you wanna talk about this one next?

Sierra: Yeah, this one's a fun one. I think product enablement, especially on the B2B side, is always a beast, especially when you get to really technical and really detailed products — or AI products, for example. Like I mentioned, almost [00:11:00] every single one of our partners has a different technical setup, so we have different integrations. Even if we release one product, it's gonna show up slightly differently for every single partner. That's really hard for a sales rep to remember when they're being asked: oh, is it with this one integration that it shows up? Is there an icon, is there not an icon? It's so hard to remember. So we tried out using NotebookLM to do product release notes for sales — enablement training. We would drop the documentation in there and share it almost like a mock pitch, as if sales had to explain it to a partner: we would talk about how we would actually talk about the product and answer some of those questions, like, can it do this, can it do that? What does this integration look like? And I feel like that's so much easier to consume when you're sitting on a plane headed to a client site versus trying to read through really boring product documentation. So the sales team really liked it. We actually just relaunched our market repositioning on the B2B side, from individual products to an AI-powered smart platform, and we used NotebookLM to do a mock [00:12:00] back-and-forth on it.

Jeff: Yeah, I think you talked about, too, that you were leveraging its ability to turn release notes and things like that into a podcast, so people can listen on the go. And now I've started to do that with our own release notes and other things. I always loved that tool, but I never thought about it for that.

Sierra: It's so much nicer than reading your 10-page documents. I think it works for certain things, right? Like I said, the market repositioning was great because that has a conversational element: being able to pitch a product a certain way, the features you're talking about, how you handle objections. Almost that sales-training piece, as an introduction to a new positioning, is really interesting to hear in a back-and-forth conversation. Less so when we start to get into the really technical details — those can go on forever with the one-off questions, and you lose people's interest. So I think we've dialed it back there a little bit, but I have really liked it for when we're doing mock pitches, or explaining how we talk about a new feature with a partner. [00:13:00]
That stuff seems to be getting some good adoption. So we were able to roll out an entirely new positioning in a couple of months with all these kinds of things that we did.

Jeff: You had a really cool story there about using some of these LLMs and other AI tools to synthesize a lot of call data and interview data, and it actually helped you avoid making what may have been a big misstep — a wrong directional step, product-wise. And you were able to find it fast and move forward in a different way.

Sierra: So we looked at bringing a new product to market earlier this year. We went through prototyping, we'd done all the research, we sat in front of a bunch of partners, brought the prototype up, got feedback, and gathered insights. We went through all of that in a matter of a couple of months — it was pretty quick. And then we went to synthesize all of our findings. The goal was: all right, we're gonna come up with a long-term product strategy, we're gonna identify product-market fit, and then we're gonna go to leadership and say, hey, we really wanna invest, go big on this product. Well, we used AI to go through our research synthesis. It looked at all of our [00:14:00] notes, we recorded all of our calls, it analyzed the prototype feedback. And as we were going through, we were getting these results that were kind of interesting. It would be like: Partner X asked for this, which is very similar to this company that exists; they do this. And it would keep coming up with that. And I was like, this is not a good sign. I think our whole team was kind of like, okay, why does it keep saying these things? We felt like there was a bit of a gap as we were going through interviews — we weren't able to validate this one piece. The product-market-fit piece just felt wrong. It felt like they kept comparing us to competitors that we didn't wanna be compared to, and it was a really saturated space. So we synthesized all of our findings with AI — and we obviously played with it a bunch to really understand what was under there — and then used that to build our product-market-fit documentation. At that point, as we were translating it over and having these conversations, we realized there was a bigger gap than we were expecting. And I feel like AI just shined a giant flashlight on it. We probably would've gotten there eventually, but, [00:15:00] to your point, it was way faster. It also really forced us to have some tough conversations, because I think everybody had an idea in their mind of what this product should be, and we were like, yeah, we're gonna go validate it really quick and then we'll move on. That really was not the case. It was much more: we need to take a step back and rethink this. And that's where we're at — we're completely changing our product strategy, and we've kind of scrapped the entire original idea.

Jeff: It's amazing that you found that, because even going through by hand, trying to draw out findings and synthesize them the old-fashioned way — one of my things has been: you can't edit your own work. You always need two sets of eyes. The reason for that is, I think, at some level we are bad at checking our own assumptions.
What comes out is: if you can have a kind of impartial third party just start to draw the synthesis of findings out, you'll find sometimes it tells you something really different than you thought — or maybe only a little different, but enough that it matters.

Sierra: Yeah. And that's exactly where we are right now. When we went through [00:16:00] our research, that's when we started to poke at it: how does this compare? What's unique about our offering? And we were right — there was a need for the product in the market, but it already existed in way too many places, and we had nothing different. So we were like, okay, why are we doing this again? I think the more you can poke at it — and it stays unbiased and objective. All of us wanted to build this product; all of us had an idea already. The designers already had screens mocked up in their heads; I already had a strategy I wanted to go with. Having it be completely independent — it can't hear things that didn't exist, right? I've been in interviews where you ask people to synthesize the findings afterwards and they say something the user may never have said; it's just in their head. AI doesn't have that bias. So for us, hearing all of that, seeing all of that, and really pushing it — I think that was also a really good move for us: really pushing the AI to keep making it better, to poke holes, and to ask the tough questions we were getting from our team and [00:17:00] our leadership. It really allowed us to shine, like I said, a big spotlight on the gap. So we're halfway through that process you just talked about — we're completely reshifting — but I'm super excited to get those lightbulb moments with the team when we reveal it.

Jeff: I do think this goes back to a non-AI basic about companies, and I've said this a million times: I firmly believe the best companies are not necessarily the ones that succeed more often. They're the ones who pick the very few things they're gonna succeed at incredibly, amazingly well — they pick a few things to be world class at, and they succeed really, really well. Part of that is saying no to a lot of things, being picky, and being right about where you're picky. And this can help you do that. Everything you say yes to is a million things you say no to, and now you can say yes to potentially the next big thing for Apartment List. Let's dig in here from a tooling perspective: were you just feeding into a ChatGPT instance? What did it look like to actually have the tools to give that feedback?

Sierra: This one was really simple. This was just ChatGPT, and we were feeding it all of our transcripts, all of our notes, all of [00:18:00] our conversations. And then we were continuing, like I said, to push it. It wasn't just: synthesize, give us the findings. It was: okay, now compare this to products in the market. It was: okay, now that we've taken this synthesis, we're gonna go create a product strategy, we're gonna create a product-market-fit doc. And then we gave that back to ChatGPT and said, hey, assess this. How does this compare to what we heard? Is there willingness to buy? Those kinds of things. Look out in the market and tell us what people are saying about similar products — like you said, doing that sentiment analysis. So I think it was the follow-ups, as we kept pushing and fed it more of our perspective and strategy and asked how it compared, that really allowed us to shine a light on the gap.
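The workflow Sierra describes is essentially multi-pass prompting in a single conversation: synthesize, then compare against the market, then feed the team's own draft back in for critique. Here's a sketch using the OpenAI Python SDK; the prompts, model name, and file names are illustrative, not the team's actual setup.

```python
# Sketch of the multi-pass synthesis workflow; prompts, model name, and the
# transcript files are illustrative, not the team's actual setup.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def ask(history: list[dict], prompt: str) -> str:
    """Send one more turn in the same conversation and keep the context."""
    history.append({"role": "user", "content": prompt})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

history = [{"role": "system",
            "content": "You are a skeptical product researcher."}]
transcripts = open("interview_transcripts.md").read()  # hypothetical file

# Pass 1: plain synthesis of the notes and transcripts.
findings = ask(history, f"Synthesize the key findings:\n\n{transcripts}")
# Pass 2: don't stop at the summary -- force a market comparison.
comparison = ask(history, "Compare these findings to products already in the "
                          "market. Where is our offering not differentiated?")
# Pass 3: feed the team's own draft back in and ask it to poke holes.
draft = open("draft_pmf_doc.md").read()  # hypothetical draft strategy doc
critique = ask(history, "Assess this draft against what partners actually "
                        f"said. Is there willingness to buy?\n\n{draft}")
```

The point of keeping one running `history` is exactly what Sierra describes: each follow-up question builds on the earlier synthesis instead of starting over.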
Jeff: Did you find that you had to correct for any wrong or slightly inaccurate beliefs the AI had? I've seen a couple of times, when we've asked it to do competitive analysis or compare something to what's on the market, it will either take competitor positioning — something that's maybe not a hundred percent literal — and take it literally, or it will be outdated [00:19:00] in a couple of areas, and we have to go in and correct it, or build up its knowledge a little, to make sure it has the full view of the current world.

Sierra: Yeah, you definitely have to guide it. We had to do a lot of guiding: focus on these competitors, don't talk about these kinds of things. It would interpret marketing literally, and website positioning always sounds a lot flashier than the product may actually be. So we had a lot of those where it was like, hey, this company says they can do this — and we dig in and do research and find, yeah, that's not actually what they're doing. So we'd say, hey, ignore this one, it's not actually a competitor. But it was a good flag for us to go through those exercises. For us it was more of a check — checks and balances — not so much that it was flat-out wrong in some instances, but more of a, hey, we need to look into this.

Jeff: Yeah, and I've found this is an area where the memory function that a lot of these LLMs have introduced lately has been really helpful, because now it remembers when we say: no, that company doesn't do that; they say they do, but they really don't; or, what they mean is this. For a long time it only held that within a conversation, or only over a short amount of time. [00:20:00] But now you can keep updating those learnings. I use ChatGPT a lot for this conversational questioning, because over time it has learned all those things and now holds onto them across conversations and across topics, and that has made it even more powerful. And then the other big thing we found — I'm curious to hear your take — is formatting: we put these conversations in with markdown. We'll use markdown instead of a straight text transcript, so we can better tell who was our interviewer and who was the person on the prospect or customer side, the interviewee. That's allowed us to delineate a lot more. For a little while it would tell us stuff, and I'd go, I think that was just our interviewer giving positioning, not them answering something. And that's been helpful.
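A tiny sketch of the markdown formatting Jeff describes: labeling each turn so the model can separate interviewer positioning from the customer's actual answers. The speaker names and roles here are invented for illustration.

```python
# Toy example of speaker-labeled markdown; names and roles are invented.
def to_markdown(turns: list[tuple[str, str, str]]) -> str:
    """turns = (speaker, role, text), role being 'interviewer' or 'customer'."""
    return "\n\n".join(f"**{name} ({role}):** {text}"
                       for name, role, text in turns)

raw = [
    ("Jeff", "interviewer", "Would a weekly digest of flagged issues help?"),
    ("Dana", "customer", "Only if it filters out the noise we already track."),
]
print(to_markdown(raw))
# **Jeff (interviewer):** Would a weekly digest of flagged issues help?
# **Dana (customer):** Only if it filters out the noise we already track.
```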
Sierra: Yeah — weren't you the one who was telling me about how you guys mine your Gong calls?

Jeff: Yeah, that's been super helpful. I'll be honest, that was a thing we straight ripped off, and I'll give credit to Jeff Charles from Ramp on [00:21:00] that. He talked at a dinner we did in New York, specifically about how, when they have closed-lost deals, they run them all through an LLM — all the Gong transcripts, all the data they have around those deals — and they're continuously compiling: why did it lose? What happened? Was it features, was it pricing, was it timing? Jeff Charles, if you don't know, has been head of product at Ramp since it was a very, very small company. They will categorize: okay, lost for product. Why? What was the feature, what was the capability they didn't have? They'd prioritize that, and where there was enough demand, they'd go build it. They just made sure they weren't gonna keep losing a lot of deals for the same reasons, and they used that to find what people were looking for the product to do, where it was maybe not as good as they thought it was. It's a great product — we use it and it's fantastic.

Sierra: I pushed hard for Gong after we chatted. I'd already used it, but I didn't know the AI capabilities with it, and it's been a game changer for us. With these interviews, we actually didn't have it yet, so we had to pull the whole transcripts out, try to train it, and pull out [00:22:00] some of our own speaking — we added our notes in. But now we have the Gong conversations.

Jeff: Yeah. We're actually looking at exporting into LogRocket for context, because especially in cases like our companies', having that data in there can really help with understanding how people are using it, what features they want, and so on — giving context like, "this didn't work," or, "a bunch of people flagged that this would be important." A lot of this stuff is really interesting to see. For a long time now we've gotten a lot of value from mining that data, and we keep trying to find better ways to do it. But yeah, for anyone who's not doing that: Gong has a great API, it's text, you can put it almost anywhere. Check it out — there's gold in those hills, right?

Sierra: Their AI is also improving, too. We've started to use it even for prep for conversations. I used to have to have a rep send me all the technical setup of an account, conversations they've had, recent product feedback, before I hopped on a call. Now I can just ask Gong to send me a synthesis and a recap, and I don't have to annoy the rep — I can just show up prepared for the call. And then same thing with synthesis: I can ask Gong to [00:23:00] tell me the product feedback over the last few weeks of conversations, and if there's a new release, ask what was said about that specific feature and hear the recent product feedback. So I think it's really speeding up the game, and I'm really happy with it.

Jeff: I work a lot with our sales team, and it's a big lift. It doesn't seem like it — everyone always thinks, oh, what do sales teams do, is the job really that hard? — but there's a lot of manual work. You do a call, you have to make sure to send a follow-up email, and you have to make sure you recap it well: not just a good summary, but what was gonna make that person move. One thing I found really useful: I always take a few sales calls periodically and try to be involved, because I think it's important to understanding marketing and go-to-market; part of that is being involved at that end. And I found that, before Gong was doing this, I was feeding the transcript into ChatGPT with a little bit of training, and it was outputting a great summary and takeaways that I could just send directly to the person.
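A hedged sketch of the closed-lost mining Jeff attributes to Ramp: run each lost deal's transcript through an LLM, tag a loss reason, and tally the recurring feature gaps. The prompt, model, and sample transcript are illustrative; in practice the input would come from a Gong export rather than a hard-coded list.

```python
# Hedged sketch of closed-lost mining; prompt, model, and the sample
# transcript are illustrative -- real input would come from a Gong export.
import json
from collections import Counter
from openai import OpenAI

client = OpenAI()
PROMPT = ('Classify why this deal was lost: "features", "pricing", or '
          '"timing". If features, name the missing capability. Reply as '
          'JSON: {"reason": "...", "missing_feature": "..."}')

def classify_loss(transcript: str) -> dict:
    reply = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},  # force parseable output
        messages=[{"role": "system", "content": PROMPT},
                  {"role": "user", "content": transcript}],
    )
    return json.loads(reply.choices[0].message.content)

# Stand-in for transcripts pulled from Gong.
closed_lost_transcripts = ["...full call transcript text..."]

feature_gaps = Counter()
for transcript in closed_lost_transcripts:
    result = classify_loss(transcript)
    if result["reason"] == "features" and result.get("missing_feature"):
        feature_gaps[result["missing_feature"]] += 1
print(feature_gaps.most_common(10))  # the recurring gaps worth prioritizing
```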
But now, in a lot of ways, I think Gong is doing a lot of that, half automated. So these tools are getting smarter every day, and they definitely help you move faster and better. And [00:24:00] I like it, because now you can expect sales to do a little bit more, maybe?

Sierra: Yeah. I mean, every good product manager should be in front of their users regularly. I go to a lot of product calls, I try to hear that feedback, and I try to be on calls with our sales team and stay really close to them. We have some amazing sales leadership, so they pull me into a lot of calls, but it's important that I show up well and prepared. So it helps me as a product manager show up really prepared, and it also lets me gather all that research when I can't be there in person. So I think it's great for product, too.

Jeff: Exactly. The faster you can get that data, the better. The last thing is one that you brought up that I had not heard about and found really intriguing, because I don't know a single person who likes doing this. Over there, you figured out a way to use AI to do better, more real-time documentation, and get a lot of that stuff out basically as you were launching things. I remember the days of having a dedicated documentation writer. Then people experimented with maybe having the engineers write the documentation, or product write [00:25:00] the documentation, and none of that's ever very good, because no one wants to do it. But you were using AI to automate this documentation and getting a lot more value — now you have great documentation.

Sierra: This is a great example. This was actually one of our engineers who just took the time to play with AI, which is one of the things I would say: try as many tools as you can, test things out and see what happens, and sometimes you hit gold. This was one of those great examples. Somebody on our Salesforce team exported our whole Salesforce code base into Claude and looked for the recent changes. They were able to build an integration into Confluence that then documents all of those recent changes. So every time they do a product release, they can run this script and it basically updates all the documentation. As soon as I saw it, as a product manager, I was like, oh my gosh, do this forever. Never stop doing this. This is great. It was huge, because how many times do you have to ping the engineer and say, hey, what's the actual validation? Which field is this coming from? All the little technical nuances. Now I can just search through Confluence and it's all there — and eventually you can just ask [00:26:00] Confluence, and it'll answer your questions. But this was really great, and they said it was really easy to run and use. So now we've started to do more of this, where we're analyzing our code bases, whether it's for quality or for new feature releases and things like that. It's something we're encouraging our team to play with and try out.

Jeff: Yeah, that seems like something that should be a product already. I can't believe it isn't — it must be out there somewhere; there's gotta be some YC company doing that or something.

Sierra: Somebody can tell us about a tool in the comments and we'll go try it.
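A rough sketch of the pipeline the engineer built, under stated assumptions: the transcript says the Salesforce code base was exported into Claude and the results pushed to Confluence, so this stand-in uses a git diff as the change source, the Anthropic SDK for drafting, and Confluence's REST content API for the update. The tag name, model, URL, page ID, and credentials are all placeholders.

```python
# Rough stand-in for the documentation pipeline described above. The git
# tag, model name, Confluence URL, page ID, and credentials are placeholders.
import subprocess
import requests
from anthropic import Anthropic

claude = Anthropic()  # assumes ANTHROPIC_API_KEY is set

# 1. Collect what changed since the last release (a git diff stands in for
#    the team's Salesforce code-base export).
diff = subprocess.run(["git", "diff", "last-release..HEAD"],
                      capture_output=True, text=True).stdout

# 2. Ask Claude to turn the raw changes into human-readable documentation.
msg = claude.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=2000,
    messages=[{"role": "user", "content":
               "Document these code changes for a release-notes page, "
               "including any new fields and validation rules:\n\n" + diff}],
)
notes = msg.content[0].text

# 3. Push the result to a Confluence page via its REST API.
requests.put(
    "https://example.atlassian.net/wiki/rest/api/content/12345",
    auth=("bot@example.com", "API_TOKEN"),
    json={"id": "12345", "type": "page", "title": "Release notes",
          "version": {"number": 42},  # must increment the page's version
          "body": {"storage": {"value": notes,
                               "representation": "storage"}}},
)
```

Run after each release, this keeps the docs in step with the code, which is the "do this forever" property Sierra liked.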
Jeff: Yeah, exactly — anyone listening, if you've heard of this, let us know. So, is there anything we're missing here, or any other things that have come up? AI moves so quickly, I figure there are potentially new things even since we last talked.

Sierra: Yeah, I think on the tool side we've covered a lot of really great examples, but I'm happy to talk about how we as a company are approaching it, because I think that's interesting.

Jeff: Yeah, that's a good question: how do you balance it over there? There's one path, which is giving people a blank check and letting them go play with a bunch of stuff. The other is only approved tools, with a center [00:27:00] of excellence and all that. But I feel like it's probably balanced somewhere in between — a lot of the things you talked about came from a hackathon, or from an engineer playing with Confluence and getting really good output and good data. How have you and the team balanced innovation with not running up a million-dollar-a-month bill?

Sierra: Yeah, a couple of key points. For us it's a hybrid of the two, but it really stems from our culture. We were actually at Apartmentalize, one of our biggest industry conferences, recently, with our CEO and a subset of our leadership team, and we were having the conversation: okay, do we need to hire a VP of AI? Do we need a person to head up all these things and get control? Because we do have a lot of people piloting and testing different tools. Do we need to create an approval committee? And we ultimately decided, based on our culture — we have a couple of operating principles, like use speed to our advantage; we want to move really fast, and be a learn-it-all, not a know-it-all; we want to learn really quickly. Hiring a specific person to own it, or bringing in a committee for approvals and putting up red tape, just didn't feel like us. So we landed on this [00:28:00] AI-native approach where we're trying to get everybody to be their own VP of AI. We're pushing it as fast as we can and encouraging everyone to have some sort of adoption. Maybe we will run into a space where we've got a whole ton of AIs going, or a bunch of different tools, but for us it's about moving quickly and adopting. And I actually just did that last week: I found this really cool product demo tool two days before one of our big launches. I'm sure my engineering team loved the fact that I pinged them and said, hey, can we integrate this into our launch in two days? We weren't able to do it — it was a little risky to put in at the last minute — but we did a fast follow, and we created a demo that we could share with our GTM team. Now we're creating an external demo and looking to embed it in the product in the next month. That was all in less than two weeks, including approval and review. So I think removing red tape and just embodying our principles allowed us to come across a lot of these really cool things we've talked about today.

Jeff: Can you tell us what that tool is?

Sierra: It's called Storylane. I'll give them a nice shoutout.

Jeff: Oh, I was looking at that earlier today. I was literally looking at [00:29:00] that today.

Sierra: It's really cool. I looked at two or three, and this one was so easy, and I was like, all right, this is great.
And like I said, I think it's about our culture: I got a really quick technical review, and they were like, yeah, let's play with it. So we launched it to our go-to-market team last week.

Jeff: Five-second overview? I'm kind of looking at AI for demos and things like that — how are you using it in the demo there? This is totally novel; none of us realized this was coming up today until we just started talking about it.

Sierra: Basically, we just completely relaunched our platform where properties manage their content, and when you move everything and everything looks different, it can be a little jarring to users. So we're using it a bit as a tutorial, but also as a walkthrough, for both our internal staff and our external partners — so our internal teams can help train our partners on it, and our partners can understand where things are. That's why I was saying I wanted to actually embed it, to do a little mini walkthrough when you first log in. We weren't able to do that at the last minute, because I literally found it about 48 hours before our final launch — a little too close. But [00:30:00] we did a fast follow with a video where you can have a clickable interaction. I was able to just record everything, and it came up with an AI walkthrough and added all the notes: this is where you would see validation, you can't enter this type of thing in this field, this is what this icon means, this is where to find things. It did all that for me, and then I just had to check it, clean it up, make sure it looked right, and share it out with the team. It was really easy to adopt. That's my favorite thing about these tools: just try them, just play with them.

Jeff: I think that's been the biggest thing on our end too. Our policy has generally been to look at our AI spend across the company, department by department, fairly regularly, to evaluate: are we spending a lot? Is it worth it? And so far, even with a lot of people experimenting with a lot of things, it hasn't been that high. What's been great is that when I talk to teams about, hey, have you tried to make this faster by doing such-and-such, a lot of times now the answer is: yeah, that's what we're doing, let me show you the process. They've found a bunch of ways to do things [00:31:00] several times faster without even going up to the executive level for approval — for the tools that are pretty expensive, we have certain ones approved with corporate accounts, and other things we're trialing one by one.

Sierra: I think that's the great part about it, right? Because of this decentralized approach where everybody's moving forward, we do have a center of excellence, and they are actively sharing insights. Their job is to say, hey, we piloted this tool, here's what we found — or even to open pilots up. We get a lot of pings that are like, hey, this team is piloting this tool; if anybody else wants to jump in and test it out for the next 30 days, please do and give us feedback. And we get a lot of people who are really eager to jump in and try things. That's all about our culture, just pushing that.

Jeff: Nice, I love it. Sierra, this is great. Thank you so much for coming on. I feel like I learned two or three new things, so hopefully others learned a couple of things as well.
And hopefully it was fun and useful for you too. This has been a pleasure — thank you so much.

Sierra: Yeah, absolutely. Happy to talk about this. Like I said, it's one of my favorite topics, seeing how people are using it, and I just wanna see everybody in product try it. So I think it's great that people are getting into it.

Jeff: Yeah. If people want to get in touch, maybe have use [00:32:00] cases they think would be interesting, or wanna reach out and ask you a question, is LinkedIn the best place, or is there another way?

Sierra: Yeah.

Jeff: Well, thank you so much for coming on. Listeners, if this was useful and you liked it as much as I had fun during this conversation, subscribe on YouTube if you're watching there; on Apple Podcasts, Spotify, any of those things, give us a subscribe and write a review — that's super helpful and gets the word out. Sierra, it was a joy. Thank you so much. Great to see you again — it's been since April in Dallas — and hopefully we'll be back in Dallas soon and can hang out again.

Sierra: Yeah, that would be great. Thanks so much for having me on. This was fun.