Cory Liebgott - LaunchPod ===

[00:00:00] Cory: The AI hype is there, I want to be involved, but how can it help me? How can I use it while staying within the guardrails of feeling safe about how I'm using it and, you know, where my data's going?

Jeff: Welcome to LaunchPod, the show from LogRocket, where we sit down with top product and digital leaders. Today we're joined by Cory Liebgott, VP of Product at Optimizely, the SaaS experimentation platform trusted by brands the world over. In this episode, we discuss the AI gamble Optimizely almost made around benchmarking and why they pivoted to workflow agents instead, how agentic AI plus human-in-the-loop could make teams superhuman, and the hidden cost of listening to the wrong users while Cory was at DocuSign and how it almost caused a big misstep in roadmap planning. So here's our episode with Cory Liebgott. Cory, welcome to the show. Thank you for joining us today.

Cory: Pleasure to be here. Thank you for having me.

Jeff: This is fun because your, you know, your background, you came from the dark side. You started out as a marketer. You transitioned into product, which I feel like you made official. I just kind of bothered our product team [00:01:00] until I de facto had input at some level. But I feel like you ended up choosing the perfect path for combining that, you know, the superpower that you bring to the product team, which is a marketing background, and you are now a VP of Product over at Optimizely, which, if you're a marketer, I mean, I don't think there's a single marketer in the US who hasn't heard of Optimizely, and probably a lot more around the world than just the US even. We're gonna take a minute to kind of talk about everyone's favorite topic on the planet right now, which is AI, 'cause Optimizely and this whole A/B testing and experimentation world is just one of those places where the potential for utilizing AI to move faster, better, stronger is incredible. But there's an interesting story here, right? Around kind of how you and the team at Optimizely looked at implementing AI and maybe kind of looked at the roadmap, socialized it, and realized maybe there are some better paths there. So, do you want to maybe set context?

Cory: Definitely, and I like this topic because everyone's talking about AI and trying to figure out how to [00:02:00] incorporate it into their products and what works best. And in fact, last year we were working on our AI roadmap, and as you mentioned, we started to socialize it with customers and partners. We made some incorrect assumptions initially around how folks would wanna use AI and their data and so on. So we had these sort of different levels of how we were gonna approach it, different horizons throughout our roadmap. And as we built on, you know, from conversational AI to generative AI, we assumed what would be our third horizon: benchmarking data. Hey, we're gonna have all this great access to data, and our customers are gonna wanna know, you know, what other customers are doing and where they fit and so on. And the feedback we got was quite the opposite, and it was more along the lines of, I don't want you to couple my data with anyone else's, and if that's what it takes to get benchmarking data, I'm actually not interested in using AI. So that was a big realization for us.

Jeff: On its face, what you're saying seems like the perfect thing. But at the same time, [00:03:00] you know, going there, my first response is like, I don't want anyone to know how I'm doing, though.
I want that to be secret. I wanna know what y'all are doing, but you can't have my knowledge.

Cory: Exactly. Exactly. And so we heard that loud and clear, and luckily, before we did any of that and we started socializing our roadmap, we got that feedback from our customers. And so we pivoted. Instead of horizon three being, let's look at how we can use AI for benchmarking data, we said, no, we're gonna go down the agentic path, and we are going to be very clear about our governance rules and how we use your data. Optimizely's AI suite is branded as Opal. We build it on Google Gemini, and our partnership with Google is such that we do not send any of our customers' data back to train the LLM, and we do not couple one customer's data with another's, because we learned early on that's not actually what they wanna use AI for. They wanna use AI in the context of their own company, their [00:04:00] own brand, their own data. And then we were like, alright, this is great, now we can set forth an agentic AI roadmap based on that. And what we've done now is we've built agents where you can train them based on your brand guidelines, your tone of voice, give them custom instructions, give them things like example shots of your best email or blog, and say we want others to be like this, and so on. And so we really pivoted, and now we're all in on agentic AI.

Jeff: I think that's one of the most important parts as we as product people have kinda walked through this evolution, 'cause no one prepped anyone for AI, right? Like, 2023, ChatGPT dropped and it was kind of like a bomb went off, and even people who had been thinking about this for years were not fully ready to take advantage. And you ran into situations like this where you had great ideas and great intentions, but as you kinda walked through it, right: horizon one, you know, make interfaces easier; horizon two; then you hit, you know, kinda what you guys have called horizon three, and you got pushback. The idea this brings forward to me is this idea [00:05:00] of high-scrutiny product plans and the willingness to kind of be humble enough to say, I, or we, don't have it all figured out, and we're going to go put scrutiny to all of our assumptions. We're gonna go socialize it with our buyers and our prospects and our customers and make sure this is what they want. I think someone said it best at one point: in product, it doesn't actually matter if we ship software. It does at heart, right? You need to ship it too, but in reality, the thing we're doing is solving problems, and what problems do they want solved? What are the most impactful things that they need help with? That's what we're doing. The software just kind of makes that happen. But having this kind of high-scrutiny, low-ego look at how we're bringing in AI has been just so important across so many organizations, and this is such a great example at Optimizely of being in tune with the customer base, right? And you didn't get halfway down delivering benchmarks and have some large beta customer set try it and go, no, we don't want this. You kind of got ahead of it, and you were able to act quickly as a result.

Cory: And I'm really proud of us as a company, in fact, for being so nimble with it, because it is moving so [00:06:00] quickly, and every company is hammering away at how can we incorporate AI in the right way that will be valuable for our customers, that will solve real problems. And we're moving at breakneck speed at Optimizely too.
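To picture the setup Cory just described, an agent trained on a single customer's brand guidelines, tone of voice, custom instructions, and a few example shots of their best content, here is a minimal, purely hypothetical Python sketch of that pattern. It is not Opal's or Gemini's actual API; BrandAgentConfig, build_system_prompt, and the call_model callable are illustrative names only, and the model call is kept abstract, so nothing here implies pooling data across customers or sending it back for training.

```python
# Hypothetical sketch only: not Optimizely's Opal or the Gemini API.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class BrandAgentConfig:
    """The knobs Cory mentions: guidelines, tone, instructions, example 'shots'."""
    brand_guidelines: str
    tone_of_voice: str
    custom_instructions: List[str] = field(default_factory=list)
    example_shots: List[str] = field(default_factory=list)  # e.g. your best email or blog


def build_system_prompt(cfg: BrandAgentConfig) -> str:
    """Assemble a system prompt from one customer's own material only."""
    parts = [
        "You are a content agent for a single brand. Use only the material below.",
        f"Brand guidelines:\n{cfg.brand_guidelines}",
        f"Tone of voice: {cfg.tone_of_voice}",
    ]
    if cfg.custom_instructions:
        parts.append("Custom instructions:\n- " + "\n- ".join(cfg.custom_instructions))
    for i, shot in enumerate(cfg.example_shots, start=1):
        parts.append(f"Example of our best content #{i}:\n{shot}")
    return "\n\n".join(parts)


def rewrite_in_brand_voice(
    cfg: BrandAgentConfig,
    draft: str,
    call_model: Callable[[str, str], str],  # stand-in for whatever hosted LLM is used
) -> str:
    """Rewrite a draft using the brand context; the model call itself stays abstract."""
    return call_model(build_system_prompt(cfg), f"Rewrite this in our brand voice:\n{draft}")
```

The point of the sketch is the shape of the inputs: everything the agent sees comes from that one customer's own configuration.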
But we're listening and we're pivoting where we need to, and I'm incredibly proud of the teams for doing that. Yeah, and this is one example of it. I'm also proud of the teams, and of our leadership, for living what we preach, so to speak. So we're not just building AI tools for our customers; we are incorporating AI tools in our day-to-day. And the obvious one is our own marketing team, which of course uses Opal and our Optimizely solutions. But even across product and engineering, we are using our internal tools like Opal. We are also rapidly looking at and trying out external tools that can help us deliver products more quickly. And so it's a big mental shift for everyone of: the AI hype is there, I want to be involved, but how can it help me? How can I use it while staying within the guardrails of feeling safe about [00:07:00] how I'm using it and, you know, where my data's going? But we're doing the same thing in our day-to-day as product folks as well.

Jeff: What did that actually look like? Because we went through a similar process a couple times now, where we've gone to our prospect and our customer base and said, here's how we're thinking about this problem, does this resonate? But what did that look like on the Optimizely side, to kind of go, we're gonna deliver benchmarks, what do you think?

Cory: In terms of how our customers reacted and then how we reacted?

Jeff: Or even, how do you handle that kind of communication? Like, do you have a set group that you're going to regularly? Is it more ad hoc? Is this a general practice y'all do, or is it kind of more you wanted to stress test this one?

Cory: Our typical process: we have customer advisory boards, we have close relationships, and so we meet regularly with and sort of pressure test these types of ideas with those folks. And then we meet regularly with a lot of our customers to get that firsthand feedback. We have early adopter programs, and then we're out attending conferences, industry events, visiting customers and getting the feedback that way. And so we have a lot of opportunity to sort of shop our ideas and get early inputs. But what it looked like [00:08:00] was quickly learning that, you know, we needed to make a pivot, and then being upfront about it. And that's another thing I love about our leadership: hey, we were talking last year about going in this direction, we heard you loud and clear that's not the direction that makes the most sense, so now we're going to pivot, and here's, you know, the plan around that. And we're going the agentic AI approach, because that's gonna be more valuable for you, to help with the mundane tasks of, for example, as a marketer, rewriting a blog by industry and having an expert in every industry. Now you can get an industry agent that can do that for you. That's more valuable than, say, taking the risk of coupling your data with others' to get benchmarking information.

Jeff: And of course, now we've all seen some of those tools where it was feeding in data like that, and all of a sudden outputs were not just showing the sanitized, safe version across different companies, but, you know, companies were leaking industry secrets and stuff that was supposed to be hidden. And I'm sure Optimizely clearly would not have had that mistake. But I think that's the fear in people's minds: we've seen it now from some of the maybe newer startups, that kind of thing has happened, and now [00:09:00] everyone has that risk. We went through a similar practice, right? You know, on our end, we were launching what
we ended up kicking off in the spring, which was this idea of agentic AI, which takes feedback from across not just your session replay and your analytics, but everything from dev launch tickets, you know, like your Jira, your Linear, through to support desk tickets, Gong calls, Apple App Store reviews, social media, voice-of-customer stuff, and kinda aggregates it and helps you consolidate all that, which is already kinda spread across a Frankenstack, and puts it in context of this digital experience and helps you understand what this customer feedback means to the real experience and how widespread it was. And basically it gets away from people having to do this manual aggregation. We went to our prospects and customers and said, here's how we're thinking about this problem, here's how we're approaching it, does this resonate? And we found two really interesting things. One was the problem was real. Two was, the way we were actually talking about it at first was really far off from what it should have been; like, we were really burying the lede from a messaging standpoint when we kind of talked to some people. And I think that kind of high scrutiny [00:10:00] led to a much more successful launch when we finally did that later in the spring. So, you know, at Optimizely, you're coming out with a product that not only really hits the need, but that people feel a part of, which makes them, you know, feel even more deeply entrenched in a tool like that.

Cory: I think it builds trust, because in product we're not always gonna get it right. We're gonna try things and sometimes build things that aren't the correct solution or aren't exactly hitting the mark. But I think the ability to acknowledge that, learn quickly, pivot, and move on, it absolutely builds trust. And in fact, because of that feedback that we got early on about how we approach our development with AI, every time I'm in front of a customer, or I'm at a conference presenting about what we're building with AI, I have a slide that's around, like, here are our guiding principles around how we don't send your data back to the LLM and we don't couple it with others'. And hey, these are questions you should ask any other AI vendors you're working with, because if you're not thinking about it yet, you should be. We're hearing this very broadly from our customer base, and so now I proactively share that in all my [00:11:00] presentations.

Jeff: I love that. You took kind of a potential misstep, and not only did the team over at Optimizely correct quickly, get on the right path, and, you know, pivot incredibly fast, but you're using that also as an additional gain that you all have, because you have this insight now that people really don't want that, so you can position against it. 'Cause it does seem like such an obvious path to go down. But, you know, that's not true. And I do think there's some level of, right, in this world where AI is moving so much faster than any of us thought it could. Right? I feel like I went on vacation for a week earlier this summer and came back and a new model had been released, and I kind of felt like I was already behind. But also, our CEO told me, hey, this feature is almost ready for some alpha testing with our customers and should be ready for rollout probably, you know, early Q4, late Q3. And my response was that when I left, that was a Q3 2026 feature. What the hell, man? He's like, yeah, some new things happened and it worked a lot faster. It's probably right now more valuable to move fast, stress test those ideas,
and be able to be nimble and pivot quickly than it [00:12:00] is to be incredibly high-scrutiny before you take that step, before you kind of put yourself out there, and always be right, but move, you know, maybe half or a quarter as fast.

Cory: Yeah, absolutely. And I think, like a lot of companies, we do our annual planning, we do our quarterly planning. Everyone likes to plan. You hope to stick to the plan. But right now, I mean, we are re-scrutinizing our plans mid-quarter, because we're getting rapid feedback and the market is changing so quickly, and we're changing our plans. And that's hard. It's hard for teams to adjust. It's hard when you're talking about your plan and now explaining how the plan is changing. But I think that is the new normal, particularly in the world of AI. And as you said, you go on vacation for a week and you're already behind. I feel like if I'm not reading about it for a day, I'm already behind. Things are just changing so quickly. So we have, yes, learned to adapt even more so than we did before to just absolute changes almost daily.

Jeff: Yeah, I think that, right, you need to have the vision of where you're going or you're never [00:13:00] gonna get there, but you also have to have the low ego to understand that in almost no world is that correct, probably. We can know generally where we're going, but the timeframe, you know, it's a bit of a crapshoot. And we have to be okay with the fact that we're probably not going to be right, but we can have the right intention to start and be low-ego enough to just be willing to pivot. You talked about this a bit, and I feel like we kind of moved away from it too quickly: the path forward at Optimizely is agentic. And I feel like this is an area a lot of people are kind of coming to realize: in reality, we want to do less boring work. We wanna do the interesting stuff, but what can we push off to agents? How did you guys kind of come to, this is the solution, and what does that look like? You know, what does a modern experimentation platform look like with agentic flows, and, kind of selfishly, I just wanna know, how can stuff like this help my team?

Cory: Yeah, I can give some examples of the things that we're building and rapidly launching, again, very quickly, getting feedback. For experimentation, for example, one area of our business, we have amazing tools where you can do A/B [00:14:00] testing and experiment, but we heard feedback from our customers: hey, sometimes I have a hard time coming up with an idea, or I've used all my ideas, now what do I experiment with next? So we have an experimentation ideation agent. And this is just one example of the many out-of-the-box agents that we offer across our product suite. But you can set this agent on your homepage or any page, and it can run through and give you suggestions on what to test, of, hey, based on your headline or the text here or the imagery. This is based on your own data and Optimizely's domain expertise. And then it can make suggestions for you. And then we have agents that can actually run that experiment and then summarize and share the results with you. And so at kind of every step of the way there's still a human involved; the human doesn't go away, 'cause for anything that you're gonna actually put into production and make do things, you still have oversight there. That's another thing that we heard loud and clear from our customers: they wanna have those controls.
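The loop Cory is describing, where an agent proposes test ideas, a human approves or rejects each one, and only approved experiments get launched and summarized, can be sketched roughly as follows. This is a hypothetical illustration, not Optimizely's implementation; TestIdea, run_with_oversight, and the callables are made-up names standing in for the ideation agent, the human reviewer, and the experiment runner.

```python
# Hypothetical human-in-the-loop sketch; not Optimizely's actual agent API.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class TestIdea:
    element: str     # e.g. "headline", "hero image"
    hypothesis: str  # why a variant might perform better


def run_with_oversight(
    ideas: List[TestIdea],
    approve: Callable[[TestIdea], bool],           # the human in the loop
    launch_experiment: Callable[[TestIdea], str],  # returns an experiment id
    summarize: Callable[[str], str],               # agent writes up the results
) -> List[str]:
    """Nothing reaches production without an explicit human approval."""
    summaries: List[str] = []
    for idea in ideas:
        if not approve(idea):   # human says no: the idea is simply dropped
            continue
        experiment_id = launch_experiment(idea)
        summaries.append(summarize(experiment_id))
    return summaries
```

The design choice worth noticing is that the approval gate sits between ideation and execution, so the agents handle the repetitive work on either side of it while the decision itself stays with a person.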
But if you're struggling on where to start or what to do next, or, to your [00:15:00] earlier point, those mundane or repetitive tasks that you just don't wanna be spinning your wheels on, that's where our agents can really help.

Jeff: I love the human-in-the-loop aspect of it, because, one, I feel like that's kinda become a recurring theme I've seen, especially doing this show, across verticals from medical to, you know, experimentation. The whole kind of thing we're taking away from how the work at Optimizely went is really interesting, because it kind of leads into another experience you had earlier in your career. I mean, you've just had a great list of companies here, and ones that I've used heavily. But, you know, before you were at Optimizely, you were at DocuSign and responsible for basically the real estate vertical, which, I have, you know, been lucky enough to go through the arduous task of signing papers for real estate more than once. One time I got to use DocuSign and one time it was physical papers, and I gotta say, hey, thank you to whatever team's over there, because it was far better in DocuSign. But it's a really interesting story I think we can extrapolate from, around this concept of, like, the squeaky wheel problem, and how, as [00:16:00] product people and, you know, people responsible for setting product roadmap and vision, how do you kind of listen right? It's a little bit of a warning, you know, a warning allegory, I guess, because you all were in a situation where you had a really loud persona, really willing to talk, really willing to bend your ear, and another one that was hard to get in touch with, but that probably did not indicate who was actually most important. I'll stop leading into the story and just, do you wanna take over?

Cory: Sure. No, this is a good one, 'cause this was another great product learning moment. So as you mentioned, you know, DocuSign: e-signature is our flagship product, and real estate signatures are our number one use case. And as part of overseeing the real estate vertical at DocuSign, I oversaw a product called DocuSign Rooms, which is a digital transaction management product. So it included not only the e-signature component, but think tasks and collaboration features for real estate teams. So there are multiple users, multiple different personas in there, and we would hear mostly from the [00:17:00] transaction coordinator. So this is a more administrative role. They handle all the paperwork and sort of background details, while the real estate agents are more so in the field showing customers houses, doing what they do best. The transaction coordinators were highly engaged with us as product managers, very open to giving feedback, which as a PM you love, and you wanna talk more and more to those users. And so we had been honing in on their use of the product and their needs, and not hearing as much from the real estate agents. Real estate agents, again, they're in the field, they're on their mobile device, they don't wanna be bogged down with the administrative side of the job. And so what we learned quickly, or not so quickly, but we were able to pivot once we learned it, was, when you think of the decision makers and the people buying the product at the real estate brokerage, they're like,
the real estate agents are our money makers, they're the ones we really care about, and we need to make sure they get what they need to [00:18:00] get in and out and, you know, get the proper data, information, electronic signatures, and be on their way. The transaction coordinators are great, but their workflow we would prioritize below the workflow of the real estate agent, because, again, at the end of the day, the revenue's coming from the agents. And it was such an awakening for us, where it was like, gosh, we've been focusing on the wrong persona. The transaction coordinator's important, but even they themselves would say, after we had this realization and told them, hey, we're gonna prioritize the real estate agent workflow ahead of some of the requests you have here on the backlog, they were like, oh yeah, prioritize them, we're just giving you our feedback. But of course, you know, we wanna make sure that their on-the-go mobile flows, et cetera, are absolutely prioritized and streamlined first. And so we shifted, and then it was all about the agent and how we can make things better and easier for them. So that was quite an awakening.

Jeff: I feel like this is probably not an uncommon problem, where the people who are most valuable to the buying decision and the kind of [00:19:00] continued value of the product are probably not always going to be the ones who are easily accessible or the most talkative, whereas you're probably gonna run into kinda operational support people who are very happy to give feedback and they're in it all the time, but, you know, prioritizing them could get you into a little bit of trouble. Can you walk us through a little bit deeper, like, how did that epiphany happen, where you kinda went from, this is great, we have all this great feedback, to, oh no, I think maybe we've listened to the wrong people, or not the top-priority people?

Cory: So before the realization, we had advisory groups of customer users, based on being transaction coordinators or based on being real estate agents. Different feedback, right? Different things they were asking for, which was like, okay, now we have to prioritize between them. And then we had close relationships with some of our key customers, a lot of big brands where we were highly engaged with their leadership teams and meeting regularly to review, and it started to be a common theme with them where we'd say, okay, we're weighing sort of the top 10 we're hearing from the [00:20:00] transaction coordinators and the top 10 we're hearing from agents, and, you know, we've got these other things we need to work on; if you could wave your wand and pick, what would you prioritize? And time and time again, those leaders would say, focus on the agents, focus on the agents. And then when we would go back and talk to the transaction coordinators and say, alright, we're gonna focus on this and this for the agents, they're like, oh yeah, that's fine, we're still gonna give you our feedback, but definitely prioritize them first. And so it just took a bit of talking across, from the decision maker, the buyer, all the way down to each of the users, to sort through that. And yeah, we prioritized accordingly.

Jeff: I think what helped there is you went back and you made them feel heard. You gave maybe a very quick explanation, but you gave the explanation of why, and it was something that fit very naturally within the industry for them, I assume, so it's probably, yeah, that makes sense, you should take care of them first.
Cory: It was. And honestly, you go into that not knowing what to expect, because you get so much feedback and you wanna please everyone, but you cannot [00:21:00] possibly. And it's really hard to say no when, at heart, we are all problem solvers. We wanna help people, we wanna make the products better for everyone. But I think, to your point, it was taking the time to explain what we're doing and why. I think that's a big part in product. You cannot say yes to it all, but if you can explain, here's what we are gonna do and why, and that's why we can't do these other things right now, oftentimes the other users or stakeholders will be receptive. And again, it builds trust, because you're not just saying, no, I'm not gonna do what you're asking for; you're saying, we've prioritized this because, and this is what we're gonna focus on right now. And it's a dance too, 'cause you wanna keep those other users and stakeholders engaged and still get their feedback. But I think you're right. I think being open about it creates the opportunity for them to buy in more and be supportive when you've gotta make those trade-offs.

Jeff: No one wants a black hole of knowledge, right? I think, you know, no one wants to feel like, I gave you something and was just left hanging. Even if the answer is kind of no, or yes but [00:22:00] later, it feels a lot better to kinda get the full circle than to give that feedback and just go, did you hear me? Is anything happening? But I think what's so important here, and probably something a lot of orgs, even bigger orgs, can take away from this, is: when you're doing feedback, you know, make sure you have that view, kind of gather at the user level, gather at the persona level, but also look at and make sure you're understanding, you know, back to, we're solving problems at heart. Software is not the end game; it just happens to be how we do it. But we're solving problems, and figuring out which problems are the most important ones to solve is a huge part of the job. So make sure you're prioritizing that rightly, and having that kind of step where you can go back to the actual buyers and say, here's what we've gathered, which one should we solve first? There's one thing you touched on that I wanna make sure we don't overlook before we go, and that is how the team at Optimizely is using AI in their own workflows. And I think this has been, you know, as much as people wanna know, how do I, you know, my board comes to me and says, hey, sprinkle a little, you know, AI on that thing, the big thing that almost everyone really wants to talk about [00:23:00] is: how are you getting value? How are you being more productive from using some of these tools on your own end? So I'd love to kind of dive in more there, like, what does that look like on your team at Optimizely, and maybe across product and even across the org?

Cory: Yeah, absolutely. Optimizely is an AI-first company, through and through. So again, not just in the solutions we're building for our customers, but in how we are using AI in our day-to-day, which I really like. And in fact, I got my team together the other day, because as a product org we're chatting constantly and sharing new AI tools, be it Bolt or Claude: how are you using it, what did you think of, you know, the prototyping capabilities of this one, I tried this one for vibe coding and here's what I thought. And so we are in the thick of it; we are trying all kinds of different tools.
We have some that we have enterprise licenses for; we use Copilot a lot in our day-to-day, for example, through Microsoft, and I'll talk about some of our use cases there. Our engineers work with Claude, and we're exploring other solutions. [00:24:00] So it's very much evolving and growing in terms of the AI tools that we're incorporating in our day-to-day work. But I got my team together the other day because I was like, hey, we're chatting a lot about this, let's hop on a live call, carve out a little time, and each person come with an AI tool that you've been playing around with, and demo it, share it, show it, so we can all get more ideas from each other and sort of help hone in on which ones might be the best, that we wanna get an enterprise-grade license for and use consistently across the team. So we did that, and there were so many great examples. I think some of the common things that we're using AI tools for, the more obvious use cases, are like PRD writing and refinement. The enterprise licenses through apps like Copilot have integrations to Jira, for example. So help with writing your PRD, then get the acceptance criteria and get it translated into stories that you can share with your engineering team, making very seamless what was a very time-consuming task before. Also, you know, less [00:25:00] technical PMs working with engineers or other technical folks and trying to translate, whether it be snippets of code or just technical language that is not as natural to them, using AI tools to say, can you break this down for me in layman's terms, or can you make this into something visual that I can better understand? So the team's been doing that. We use the heck out of Copilot for recapping meeting notes, action items, summarizing things. If we're out for a day or a week, the work doesn't stop, but Copilot can summarize, here's what you missed in your chats, here's what you missed in your emails, and help you with what you need to focus on. I also loved, one of the PMs on my team, she's like, I love the microphone feature on Copilot and the AI tools, 'cause I'm a talker versus a writer, so I wanna just talk and sort of brain dump what I'm thinking about, be it debriefing from a meeting or starting to articulate around a PRD. I just wanna voice it, and then Copilot cleans it up into [00:26:00] this beautiful document that looks like I spent hours on it. So those use cases I really liked. We also, at Optimizely, have developed an internal Opti GPT tool.

Jeff: This one was really cool when I heard about it a little bit earlier.

Cory: I absolutely love it, and we continue to enhance it. This is everything from, if I want to query to say, tell me about this customer: which products do they have with us? How much ARR are they spending with us? What have their recent support tickets been about? Like, if I get an escalation or an issue, or I'm just checking in with a customer in a QBR, what a quick way to get a sense of, and sort of get an executive brief of, what's going on with this customer. Similarly, if I'm a new employee, or I work in one product area and I wanna know about another product area, I can query about our products and get information about that. We have an HR capability, so if I wanna know about policies or, you know, time off or US holidays, I can query Opti GPT for that. So that's been really cool. And then we also use external tools where [00:27:00] we're not uploading sensitive information, but we can do
market research on what's publicly available, you know, when we're considering pricing or things like that. So we'll use ChatGPT or Perplexity, for example, to do market research, and then we're tinkering with tools like Cursor, Lovable, and Replit to do vibe coding, prototyping, and so on. So those are some of the things that we're doing. But again, every day we're actively sharing: hey, we came across this, or I tried that one. Just sharing those ideas across the team as we all learn.

Jeff: I think that's one of the biggest things: how do you enable people to try things? How do you set kind of mission parameters, not to use the rocket metaphor, which, being at LogRocket, I always sway between leaning in and dad-joking it up with rocket metaphors and avoiding it at all costs. But I feel like there's a healthy balance. How do you set the mission parameters there of, here's, you know, the big picture of what we're trying to do, but go forth and try things, and it's okay if, you know, sometimes it's not gonna work, but when you get something that works, share it? You know, we try and post a lot about these kinds of wins of process: how are you [00:28:00] using tools to enhance, and how could others potentially use them? One thing we found on that, I love the kind of Opti GPT to ask questions internally. We built a similar one, I don't think it has quite as catchy a name, I forget the actual name now, but you can query in Slack, what is this customer's ARR, or what are their contract terms, or, again, HR policies. But one really interesting thing about AI, I feel, has been emergent capabilities. You build something to do X, but because you did it right to do that, it has the ability to do, you know, Y or Z or omega or some other fantastical thing. We launched into our product, not for us, mind you, but for our customers, what we call Ask Lea, which is the ability to kind of query in a chat function, not like build charts, but those same kinds of questions, you know, how is this customer doing, or how are people using this new feature? But one of our CS team actually realized she could ask it, how is this customer using the product lately? And the insights it gave, she basically came back and said, this is better than a lot of CS tools we have for account health right [00:29:00] now, because it's going in and able to actually talk about engagement. And this has become a huge kind of new internal tool that we didn't even build it for, but it's now used across our CS team and some of our SDR teams that focus on inbound, around understanding customer and user health. So what we built for external became an internal tool that has super wide adoption now, because someone took the idea, shared it, and talked about it, which I feel like is the biggest thing. But it all still comes from the human element of what are the key things that are really interesting, and having those human conversations up front.

Cory: Yeah.

Jeff: ...'em, but you can't replace those.

Cory: Yeah, absolutely. And I'm a big advocate for: share information, share learnings, just like we've been doing today, even when it's fumbles and you don't get it right, but what did you learn and how did you pivot? To me, that's where I learn the most from others as well. And so I do regularly have these sharing sessions with the PMs on my team, and then we also set OKRs around making sure we're incorporating AI into our day-to-day work in this way and so on. So yeah, very top of mind for me.

Jeff: Yeah.
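As a rough picture of the internal assistant pattern both sides describe here (Opti GPT, the Slack query bot, Ask Lea), the sketch below shows the core idea: a natural-language question gets routed to an approved internal source and answered from that source only, rather than being pasted into an external tool. Everything in it is hypothetical; the source names, the route callable, and the sample answers are illustrative stand-ins, not any vendor's real API or data.

```python
# Hypothetical sketch of an internal "ask about a customer / policy" assistant.
from typing import Callable, Dict

# Each approved internal source is just a function that answers a scoped question;
# in practice these would sit on top of a CRM, a support desk, an HR wiki, etc.
InternalSource = Callable[[str], str]


def answer(question: str, sources: Dict[str, InternalSource], route: Callable[[str], str]) -> str:
    """Route a question to one approved internal source and answer from it only."""
    source_name = route(question)  # e.g. a small classifier or LLM call picks the source
    if source_name not in sources:
        return "I can only answer from approved internal sources."
    return sources[source_name](question)


if __name__ == "__main__":
    sources: Dict[str, InternalSource] = {
        "crm": lambda q: "Acme Corp: 3 products, ARR $250k (illustrative numbers only).",
        "support": lambda q: "Recent tickets: SSO setup questions, export latency.",
        "hr": lambda q: "US holidays: see the internal policy page.",
    }
    naive_route = lambda q: "crm" if "arr" in q.lower() else "support"
    print(answer("What is Acme's ARR and what products do they have?", sources, naive_route))
```

The guardrail mirrors the point made earlier in the conversation: sensitive questions stay against internal, permissioned sources, while external tools are reserved for publicly available research.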
And then I think [00:30:00] just the last thing I wanna share, 'cause what you said there sparked my mind real quick, is, you know, it's sharing and kind of trying and everything like that. One of the biggest ways I've figured out to learn is to actually try to do something, get it like halfway there, and kind of fumble my way through it, just try until I can't do it, and then show an engineer and be like, hey, look at what I did, it's pretty cool, huh? And they're like, oh my God, and you can see the, you know, smoke come outta their ears, but then they take it and kinda work on it a little bit, and like two days later I have a great finished product.

Cory: Yeah, I love that.

Jeff: ...for everyone is, you know, co-opt your engineering team off sprint points if you have to. Except if you are one of our people at LogRocket: don't mess with the sprint points here. Well, Cory, it has been an absolute blast to have you on. I so, so, so appreciate you coming on. To anyone listening, if you've made it this far in, you know, at this point I think you'll listen to the whole thing. You owe the, you know, subscribe; you're gonna like the other shows. If you listened to the whole thing through, give a subscribe on, you know, Spotify, Apple Podcasts, YouTube if you're on it, write a comment, write a review. That's how people find us. Tell a friend, though, honestly, tell a colleague, be one of the cool people who discovered it before the masses did, [00:31:00] and show other people how you learned so much from incredible people like Cory here. Cory, I really appreciate you coming on. This has been such a blast to kinda talk about AI and product, and especially what's going on at Optimizely and how you guys are building tools that should make the lives of people like my team easier. If people wanna dig in more, ask more questions about how you're thinking about this kind of stuff, is LinkedIn the best channel, or is there a better spot to reach out?

Cory: Perfect, connect.

Jeff: Yeah.

Cory: I'd love to hear from you.

Jeff: Awesome. We'll have to have you get back on at some point down the line and see how all this stuff is turning out. But...

Cory: Yeah.

Jeff: ...thanks for coming on. Look forward to staying in touch.

Cory: Me too.

Jeff: Thanks for the time.

Cory: Appreciate it. Thank you so much. This was great.

Jeff: Awesome. Have a good day.

Cory: You too. Bye.