Best of LaunchPod 7/4 === [00:00:00] Welcome to LaunchPod, a product management podcast brought to you by LogRocket. Today, while those of us in the U.S. are taking time away from work this week, we're bringing you some of our favorite snippets from LaunchPod. On today's episode, you'll hear from Kristen Dorsett about prioritizing big bets, Steve Chazen about how working at Apple in the 90s with Steve Jobs transformed how he managed product, and Roman Gunn, who talks about work-back plans focused on the design north star. First up, we have Kristen Dorsett of Viator, talking about how she prioritizes and gathers data and how that impacts her big bet initiatives.

Kristen: There's a lot of ways to pick and a lot of ways to prioritize and gather data. I've definitely found over time that a lot of them don't work that well, or a lot of them will yield you mixed results. There's a few really good ways to do it. It varies depending on your business, and usually the good ways are the hardest ones. So how do you guys do that over at Viator? What does that look like? How do you discover what you need to be working on and what you need to [00:01:00] prioritize?

And I think you put it well, like, what are the big rocks in play? So I'll talk a bit about how we plan overall as a company and then how that translates into what we work on in product. Overall, as a company, we have a sort of longer-term strategy and vision for where we want to go, like where we're going to play, how we're going to win. And then that translates every year into our annual company OKRs. And this is, okay, for this year, these are the five things that we're going to be focused on as a business. And then what we do is have a bit of a bottoms-up process of, okay, what are those big bets that we want to make as a company that we think will help us achieve these outcomes that we're looking for. And I say it's bottoms up, but it's a bit of bottoms up and tops down, because we have some things, bigger strategic things, that we know we want to go after, and capabilities that we want to be building. And so for some of those, we all have to write proposals. So even I will write a big bet proposal every half for things that I want to see happen. And then we basically assess all of these proposals and decide, okay, what are the ones that [00:02:00] we think will drive the most immediate, short-term impact, as well as what are the ones that are the longer-term strategic rocks that we want to make sure we're making progress against. And we get that list down to 10 to 12 things typically. And those are the big initiatives that R&D focuses on, and the aim is that those take up about 50 percent of our team's capacity. Sometimes it's more, sometimes it's less. And the goal would be, like, these are the big bets, these are the big rocks. And then teams each have their own individual charter of, this is the part of the journey we want you to be optimizing, these are the customer problems we want you to be solving, and the teams can fill in the rest of their roadmap with the team-driven work that helps drive their charter forward.

I think that it's great to have the agility to move quickly, but at the same time, you need enough time to actually do something impactful. Oh, and back to the prioritization of those bets, which I didn't really answer. So, for every single bet proposal, we do have an opportunity size.
It's very back of the envelope, and we're looking at, [00:03:00] again, the short-term versus long-term impact. And so we're weighing, okay, that actual opportunity size, what are the biggest opportunities we could be going after, layered in with the strategic side. Like, sometimes we'll do something and maybe it's a bit smaller in opportunity size, but it's more strategic. And then sometimes it's just those compliance things that come in that you just have to do.

There are always the things you have to do where there's just no negotiation. That's when you want that optimizer in there, that right person who, you know, is going to button it up, do it right, and just get it done.

Yeah, hyper-detailed people on those.

Going one step below that kind of planning, how do you all get there? What fuels those proposals? You know, what gets you to that level of knowing the hypothesis that you think is going to move the business, right? I'm sure there's lots of different kinds of data out there. I think PMs are in the enviable, and probably unenviable, position of there being no shortage of data and information you can use. So what's the process over in product at Viator to [00:04:00] understand that from a data perspective, or to get your inspiration, or drive these projects forward?

We definitely have a lot of sources for data. So we start at the more macro level, looking at the market, and we do market research around, okay, what are the traveler trends in the market, and what are the different things that different personas of travelers are looking for, specifically for experiences. So that's one lens we take: how do we super serve those personas that we've decided we want to be super serving. Then we take it down to user research, and we do a lot of qualitative research with our customers, existing and prospective, on what's going well in our product now and where the friction is in our journey. When we're rolling out features, we do usability testing before we roll them out to make sure that people can do the thing they're trying to do. And so that produces a really rich data set. We also have a voice of the customer program where we're collating all of the data we get from all of the written and spoken contacts we have with customers. So everything from [00:05:00] customer service contacts by phone, email, and chat, as well as all the reviews that come in, as well as external reviews like the App Store or Trustpilot. And we put it all in a big database and use a bunch of models to get insights out of it. And it helps tell us some of the themes of where we can do better. And actually, it's one of our most valuable tools; it can tell us even down to, this page in this destination is causing problems. And so that's the qualitative side. And then on the quantitative side, we have product analytics tooling, obviously, and we have BI tooling, which we use to spot opportunities as well.

I'm really curious, though, about the voice of the customer program, because you talked about it before, and it gets to the point of even being a queryable database of customer feedback. And that's one thing in a B2B tool, where you have hundreds of customers, maybe a few thousand, but Viator has, what, I think I read, 300,000 vendors alone, let alone end customers. So [00:06:00] can you talk about how you do this? Because that's just massive; that could be a product unto itself.
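To make the idea concrete, here is a minimal, hypothetical sketch of the kind of queryable voice-of-the-customer store Kristen describes: contacts and reviews from many channels collated into one place, with themes layered on top that you can slice down to a page or destination. Everything here is an assumption for illustration; the record fields, channel names, and the keyword rules (standing in for the "bunch of models" she mentions) are made up and are not Viator's actual system.

```python
# Toy voice-of-the-customer store: collate feedback from many channels,
# tag themes, and query counts by theme and destination. Illustrative only.
from collections import Counter
from dataclasses import dataclass

@dataclass
class FeedbackRecord:
    channel: str       # e.g. "phone", "chat", "app_store", "trustpilot"
    destination: str   # e.g. "Rome"
    page: str          # e.g. "manage_booking"
    text: str

# A real system would use NLP models; simple keyword rules stand in here.
THEME_KEYWORDS = {
    "cancellation": ["cancel", "refund"],
    "manage_booking": ["change my booking", "reschedule", "manage my booking"],
    "pricing": ["price", "expensive", "fee"],
}

def tag_themes(record: FeedbackRecord) -> list[str]:
    """Return every theme whose keywords appear in the feedback text."""
    text = record.text.lower()
    return [theme for theme, words in THEME_KEYWORDS.items()
            if any(w in text for w in words)]

def theme_counts(records: list[FeedbackRecord], destination: str | None = None) -> Counter:
    """Count themes, optionally drilled down to a single destination."""
    counts: Counter = Counter()
    for record in records:
        if destination and record.destination != destination:
            continue
        counts.update(tag_themes(record))
    return counts

if __name__ == "__main__":
    records = [
        FeedbackRecord("chat", "Rome", "manage_booking", "I can't reschedule, please cancel and refund"),
        FeedbackRecord("app_store", "Rome", "manage_booking", "Couldn't manage my booking in the app"),
        FeedbackRecord("phone", "Paris", "checkout", "The booking fee felt expensive"),
    ]
    print(theme_counts(records))                      # themes across all feedback
    print(theme_counts(records, destination="Rome"))  # drill into one destination
```

The useful property, as in the conversation, is that the same store answers both broad questions ("what are our biggest themes?") and narrow ones ("what is going wrong on this page in this destination?"), which is what makes it work as a canary for problems worth sizing with analytics.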
Almost, that's a massive undertaking.

I mean, it is basically an internal product. It's owned and driven out of our customer experience team. They actually created it originally because they were looking at all the contacts we have coming into our call center, and they were trying to figure out, how do we make some of this self-service so that these travelers don't have to talk to a customer service agent at all. And so we started with just that job in mind, and over time we've just added more and more data to it. It is an internal product, and there's a small team that owns it and continues to bring it forward.

This is one of those things I always love to hear about, because I could talk to you for an entire hour just about this. Can you talk about, maybe, with that level of data, what have you been able to do with it? What has it actually driven in terms of end results? Can you talk maybe about a project or a change that you guys have been able to pull out of that program that was material?

There are dozens. When I asked the head of this program to give me some wins, she had a very long list. But what I'll talk about is, we [00:07:00] have recently been investing a lot in our app, and we're seeing more and more of our travelers using our app versus our mobile web or desktop experiences. But our measurement of our app isn't as strong, because it's less mature. And so we were trying to figure out, okay, what are the biggest customer problems we're seeing in the app? And so we mined the voice of the customer data. And the overwhelming theme was that manage my booking was one of the key jobs to be done that people were really struggling with. And it was actually driving cancellations. And so it came out loud and clear in the data, and it was a big enough opportunity that it became a big bet. And we've spent the last year trying to make those jobs to be done a lot easier. And we used the voice of the customer data to mine, okay, when people are in this part of the app, what are the other things they're calling CS about? Or what are the other things that we're hearing about in the app store reviews? And we were able to come up with a short list of, okay, these five jobs, like, we're not meeting expectations right now. And we've been working through [00:08:00] and fixing them.

And how does that narrowing-down process work? Because I'm sure with that many users, it's not something where you can go off of, oh, 20 people mentioned it. I'm sure even the most esoteric things get a few mentions. How are you really narrowing in on what those five priorities are from this?

So I think we use voice of the customer as a canary in the coal mine. Often it's telling us where the problems are. It may not be able to tell us the size of them, because, again, it may only be 20 of these customers that actually picked up the phone and called CS about it, but it's enough that it spikes in the tool. And we're like, okay, there's a problem there, let's go size the problem. And that's where we will then lean into our product analytics tooling or do user research studies to really get into the underlying why.

Next, we have Steve Chazen, VP of Products at Alarm.com, who came on to talk about his experience working at Apple in the 90s and how working with Steve Jobs impacted how he works with product today.
Steve: So I joined Apple in [00:09:00] 1991, when Apple was a rocket ship. They could do no wrong. By '97, the wheels had started to fall off that bus. Steve had left two or three years before I joined, and the company really had lost its way. What was working before, all their unique value, was being usurped by Microsoft, right? Windows 95 kind of copied everything that was unique about the Mac. And there was really a vacuum of leadership. So I really loved working at Apple, but I didn't like those times. And I resigned right around 1997, right after Jobs had come back as a special advisor to Gil Amelio, which was really more of a marketing ploy at the time to get the company some attention and extend a lifeline to them. But they really had no product roadmap, no developer roadmap. Those were really dark days, and almost nobody knows how close Apple came to going out of business in those days. And so I wrote a note to my boss and I just said, hey, look, I'm a fan, I'm a shareholder, I'll be [00:10:00] rooting from the sidelines, and copied sjobs@pixar.com, which was the only email address I could find for him at the time. And a couple days later, he calls with my boss on a speakerphone and told me to get back in the game, get off the sidelines. He used more colorful language. And I got to essentially write my job description, because I told him what we should do, and he said, go do it. And I was able to do that work directly for him for the next 18 months. I told him it would take two years. We got it done in a year and a half, and it was a really exciting time. It really wrote the formula for Apple's resurgence: really listening to customers, understanding what the product needed to do versus pushing what you had on the shelf at people. And it changed my life in terms of how to build great products and understanding why you build great products.

Wow. What was the project you worked on?

So I reconstituted what was called the Apple University Consortium. If you remember the early days [00:11:00] of Apple, the Mac was so different back in the late eighties that no one really wanted to write software for it. So Jobs went to the university market and handpicked like 25 schools and said, if you will endorse this platform for your students and teach them about technology, which is going to be in their lives, and that's the mission of the university, we'll give it to you at the cost it costs us to make it. And so in the early days that worked. There were a bunch of very well-known universities that built a lot of the early software on the Mac and standardized on it long before anybody thought a student needed to buy a computer in college. That worked in the early days of Apple, and I rebuilt it again in the late 90s when the same problem was happening. All the developers were leaving the platform, Windows looked like the future, and a lot of universities had committed so heavily to Apple that if Apple went out of business, they had to start over. They were completely, 100 percent, requiring every one of their [00:12:00] students to buy a Mac on matriculation. So it was tough for them to admit that future. So we knew they wanted us to survive, we needed them to survive, so it was this match made in heaven. And every three to four months, we'd invite them out to Cupertino, we would show them what we were working on, and we'd get their feedback.
At a certain point in the presentation, Steve would kick out all the Apple people like me and talk directly to the university leaders and really drill into them why they were important and why we needed each other. And again, most people don't know, but the foundation for the Mac OS then was partially built by these universities themselves. The Mac kernel was built on a Carnegie Mellon foundational shell called Andrew, or Mach. And Dartmouth and Harvard and MIT all built a lot of the plumbing for it. In return, we gave them an open source license to the Mac, which, again, most people have not even paid attention to, because they didn't want to commit to Apple in the event [00:13:00] Apple went out of business unless they had the code to continue their enterprise. And again, it was very fortunate that Steve agreed to do that, and that helped turn the company around. And the product we showed them was the iMac, which in many ways was built with the university customers in mind.

Got it. So you were working together with all these universities to develop the next big operating system. And that's what sort of usurped Windows 95.

It was the future of the Mac. Again, there was only, what was it, System 7, System 8, that was their platform. But they had to take what the universities had built, which was the foundation for NeXT itself, and then build on top of that all the components to make it into a robust operating system with all the security and platform requirements. And a lot of that was built by these university partners, converting it from what was running on the NeXT platform to the Mac hardware, and then later on doing it again when they converted [00:14:00] to the Intel platform.

How did you get that done in 18 months? Was it a mad dash or nights and weekends?

A little bit of both. I lived in New Hampshire at the time, but my office was in California, so I would spend two weeks out there living at the Saratoga Inn in Saratoga, California, commuting to the Apple office and working well into the evening to make that stuff work. And that was also the same place we put the university leaders, in Saratoga, so we gave them a nice place to work together. So we benefited from having all these motivated individuals focusing on the same problem, which was save Apple, and then committing to buy the platform that they were putting their ideas into. So it was really a very unique time, and it worked. The iMac, not just the software and the hardware but the marketing around it, saved the company, even though technically it was no [00:15:00] different than what we were selling 18 months before. It gave us that leg up to go. And in the meantime, the thing we were doing behind the scenes was the Think Different campaign, to get people to think different about Apple and why they wanted a computer in the first place.

Right. Very cool. Yeah, it goes to show how much of software is just getting the messaging out and people knowing what you do. That's great. Are there any Steve Jobs anecdotes or lessons that you've taken with you, or maybe some that you've shied away from, having worked with him?

Yeah, that's a good question. So one thing is really understanding why you're building the product. A lot of companies have a hit product and then they kind of milk that forever, but they've stopped thinking about why they built the product in the first place or who they built it for.
Steve was always insistent that the people who were building the product were using it constantly, that the feedback came from the users of it, not from the designers or a piece of paper. The other piece, which is [00:16:00] unique to Apple and is something I took with me for the rest of my career, is that typically marketing and sales are on one side of the business and product and engineering are on the other. And when times get tough, they point at each other. The product people and the engineering people say, hey, why can't you sell what we gave you? And the marketing and sales people say, what you gave me can't be sold. At Apple, it's not like that at all. Before you start writing code or building a design, you decide what the message is, why it exists, its reason for being. And then everything stems from that, and you get a very clear picture of why you're putting a feature in or why you don't need that feature. And saying no becomes this liberating activity, because you don't have to add that feature if it doesn't support the reason it should exist to begin with. And a lot of companies forget that, but it's something unique to Apple, right? It's in the same people's brains. They know why they wake up and build this feature or why they message it that way, because it needs to solve this problem better than [00:17:00] some product that might already exist.

And finally, we talked to Roman Gunn, VP of Product at Zeta Global, who came on to talk about his experience with what he calls a design north star and how he uses that to create a roadmap based on customer needs, opportunities, and industry demands.

Roman: So I love to start with something called a design north star. Essentially, work with great design folks to create an end state that we're trying to reach. What is the ultimate thing that we're moving towards? It doesn't have to be perfect. It doesn't have to use all our standardized components. It doesn't have to have all the answers in terms of deep flows, but it ultimately has where we're going and why. And because of that, it brings the reality of the end state much closer. It starts to, one, feel more attainable and feasible, and two, it gives a common language and something for people to point towards. One of the things that I've learned in my career is that it's unfair to make people imagine at [00:18:00] times; you want to give them something to reference. And by having a visual aid, they can say, oh, one, this is polished, two, this has all the elements that I want, and three, like, how do we get there now? I think that's the thing that we really try to build. And then based on that, we start creating the different pieces of how we build towards that. So we start figuring out what are the most essential solutions, what are the solutions that scale into other parts of the product, and essentially the entire architecture of how it's going to work. But it all starts with, this is where we're going. Because without that, I think a lot of companies fall into what I call the incremental march to nowhere. They just build and measure and build and measure, but where are you going? It's like, you moved another mile, but a mile towards what? So that's what we want to avoid.

And when you're building that work-back plan, what are you working back from? Because I think, given the nature of the products you work on, you could be looking at where you're going to be three years out, where you want to be a year out. There's always this kind of idea of what's critical.
[00:19:00] And then there are all the add-on things that you could add starting from the end state. What's the minimum we need to do to launch, what do we need to launch initially, and then how does that work? How are you looking at all the additional things you can do afterwards?

So we start with the end state, and then we figure out what are the most impactful points in getting there. What are the things that the industry demands right now? And what are the biggest opportunities for us to grab? Because I always find it's interesting to balance between the necessity, the function, and the sizzle, right? You have to deliver both in lockstep, because you're going to be missing out on different buyer types if you don't focus on both. So I always try to think of both the tactical and the strategic. The way I talk to PMs about this sometimes is, look, you have to think about the 60, 70 percent of functional aspects that people aren't going to necessarily celebrate or share, but that are going to make their life significantly easier, so they're going to be locked into your ecosystem. And the other 30 to 40 percent is going to be the things that cause people to buy your platform in the first place.

Do you have a [00:20:00] process for how you understand what those different elements are? Are you talking to customers? Is it gut feel? Is it a little bit of both, or something else?

Yeah. So you definitely need to work with the individual businesses. You definitely need to work with the people who are in the weeds, actually using the platform, because those things aren't always aligned. And then you do have to intuit a bit of where things are going to go, right? Because it's one thing to build what a competitor has. It's another thing to say, I want to be better than a competitor. And you can't do that by copying. You have to do that by finding inspiration. And sometimes that inspiration is best found not within the competitor; it's best found within an entirely different space. One of the things that I appreciated about working in an agency environment earlier in my career is being able to see how much of the problem sets we try to tackle are similar from industry to industry. There's a lot of overlap if you do some good pattern matching. Like, 70 percent of the base problems are the same. Like, how do we templatize things and [00:21:00] scale them and make them repeatable? How do we operationalize the folks that are hands-on keyboard? How do we keep the strategic users happy? And it's the same from industry to industry. The difference is the working solution. So if you think about those different pillars that you have to hit, you have to pattern match against them. And now that you know that there are patterns there, you just look at the most interesting and compelling and easy solution, even if it doesn't come directly from your field, right?

Having experience across all sorts of things, like we said, the windy road gets you to a good spot. When I was reading your background here, you have five pods over at Zeta. What are they focused on? It's not just all generative AI, is it?

At Zeta, before the pod level, we actually have the layers. So we have the intelligence layer, which is where all the pods that I head up are. Then we also have the experience layer and the data layer. So the data layer is all about how we ingest information, how we make sure that there's great governance around that, great security around it, et cetera. We have the experience layer for how you execute on all these things.
How do you actually create segments? How do [00:22:00] you send your emails? How do you target on CTV or other display channels or direct mail, et cetera?

Yep.

Then you have the intelligence layer, which is how you combine all of the data that's ingested into something that's meaningful and actionable. And in order to do that, we have things like analytics, we have things like forecasting, we have things like recommendations, we have things like personalization, and we have things like MLOps and generative AI. So those are the pods that we're tackling. And the next one that we're going to be standing up is going to be splitting off attribution from analytics into its own dedicated space. I won't get on my soapbox today, but attribution is one of the ones where, uh, I have lots of opinions. Actually, come this summer, when the cookieless world kind of crumbles, we're going to have to figure out how we really attribute in a meaningful way. Right? What's the right point of view to have in a world where people do want to feel like they're secure, that their privacy is being respected, but they [00:23:00] also still get the relevant type of marketing? So that's something that we really have to focus on. And how does the marketer really credit the right channel? Because you have last-touch models, you have first-touch models, but all those models are out of touch. So we have to figure out the right way to do it.

It's such an interesting problem. I've worked at 10-plus companies at this point, over 20 years of working in marketing, and no one does it the same. I don't think any of them were dead wrong, either. So it's just a passionate topic of mine, but, um, I digress.

But I think this is right, and this is going to be the big challenge. I think, again, come this summer, it's just like, this is an opportunity for people to unite on what the formula for this is. Can we standardize this? Lots of other technologies, whether it's USB-C or what have you, they said, okay, enough of all the wires, let's pick a universal standard. Maybe it's time for attribution to do the same.

And that's it for this week on LaunchPod. Be sure to follow us and review us if you like this podcast, and come back next week to hear a brand new episode. See you then.
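For readers who haven't worked with the attribution models Roman name-checks, here is a tiny, hypothetical sketch of what last-touch and first-touch credit assignment mean, with a linear split added purely for contrast (the linear model is our addition, not something Roman mentions). The channel names and conversion values are invented for illustration and have nothing to do with Zeta's methodology.

```python
# Toy attribution models: given one converting customer's journey of
# touchpoints, split the conversion value across channels three ways.
from collections import defaultdict

def last_touch(journey: list[str], value: float) -> dict[str, float]:
    """All conversion credit goes to the final touchpoint."""
    return {journey[-1]: value}

def first_touch(journey: list[str], value: float) -> dict[str, float]:
    """All conversion credit goes to the first touchpoint."""
    return {journey[0]: value}

def linear(journey: list[str], value: float) -> dict[str, float]:
    """Credit is split evenly across every touchpoint in the journey."""
    credit: dict[str, float] = defaultdict(float)
    for channel in journey:
        credit[channel] += value / len(journey)
    return dict(credit)

if __name__ == "__main__":
    journey = ["ctv", "email", "display", "email"]  # one made-up customer path
    for model in (last_touch, first_touch, linear):
        print(model.__name__, model(journey, 100.0))
    # last_touch  -> {'email': 100.0}
    # first_touch -> {'ctv': 100.0}
    # linear      -> {'ctv': 25.0, 'email': 50.0, 'display': 25.0}
```

The point of the contrast is the one Roman makes: each rule hands out the same 100 units of credit very differently, which is why marketers who pick different models reach different conclusions about the same journeys.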