Andy: [00:00:00] Will more data change the decision? Or is it just going to make you feel better? I just think that's the trap in the day and age that we're in: we have so much data and so many tools, but at the end of the day, people make decisions, not data. And we can be more pragmatic about when we use the data and when not. Jeff: Welcome to LaunchPod, the show from LogRocket, where we sit down with top digital and product leaders. Today, we're talking with Andy Boyd, CPO at Appfire, which provides a portfolio of over a hundred collaboration solutions. In this episode, Andy details his no-fail growth formula from driving PLG at IBM Watson, the ways he uses data to drive experimentation velocity and avoid decision paralysis, and how he slingshotted his career by focusing on working with great people rather than chasing titles and compensation. So here's our episode with Andy Boyd. Jeff: Hey, Andy, how's it going, man? Welcome to the show. Glad to have you on. Andy: Thanks, Jeff. It's great to be here. I'm excited for the conversation we're about to have. Jeff: I am really excited because I've known Appfire as a customer now for a little bit, but you've had a really [00:01:00] interesting history. You have spent the past several years at Appfire through a major period of just hypergrowth. You guys have 20x'd in four years. But before that, over at IBM, you and the team had some mild fame for introducing the first growth team for IBM Watson back in the day. So maybe tell us a little bit of that first. How'd that come about? Andy: Yeah, it's so much a part of the reason why I'm here at Appfire. The short version of my career is: prior to joining IBM, I was a developer, and I worked with small startup companies doing product management and marketing.
And then I worked at IBM in various different roles as part of the team that launched the Watson platform, different roles in Watson, and then a little bit in the cloud business. But the part of my career that you're talking about, Jeff, was this: there was a point in time at Watson where we launched a growth team, and it was the first growth team in Watson for sure. And nobody really knew what the growth team was supposed to do. If you've done product management [00:02:00] in a large organization like that, the product managers are very focused on meeting with customers, finding new features to release. There's a lot of meetings and cadence and making sure the plans are on track with what they expect. But when we launched the first growth team, nobody really knew what it was, and it was awesome. Our mission was ultimately to drive monthly active users. So we just got to work, and a lot of the things you would expect in a larger company, like the plans, the meetings, the cadences, we just had a whole lot less of, because people didn't really know what we were going to be doing. So we just got to work on doing all the fun, hard things to improve the product. And so what happened was we set out, we built the team, we identified some of the ways that we were going to drive monthly active users, and then we just got to work. We were working for a while, and as we'll probably talk about a little later, the real power of a growth team is in compounding growth, [00:03:00] driving the metrics over time. After a little bit of time, we started getting some great successes in some of those metrics we were targeting. And people started talking: what is this team doing? Tell me what you're doing, tell me more, teach us how to do that. And that's what happened.
Jeff: Now, when you say growth, is it more along the lines of demand gen, going out and driving acquisition, or is it driving usage? What were the levers you were looking at for driving MAU there? Andy: Yeah, that's a great question. A lot of times growth can be thought of as the marketing aspect. We were definitely squarely in product. We touched a little bit on the website and helped improve its performance, but we were very focused on the product itself. When we started, we looked at the whole journey from signup to activation to retention, et cetera. But after a short amount of time, we really got focused on the deeper parts of the funnel as it related to the product: activation, retention, [00:04:00] conversion, referral. Jeff: So what I'm hearing is IBM was actually an early player in the PLG space before it was cool to call it PLG. Andy: I think so. In retrospect, we were the growth team, but the work that we were doing was really PLG before the term PLG was coined. So I think we were definitely one of the first teams doing growth, or doing that PLG work. Jeff: Yeah, I love it. I remember a couple of years back, everyone was talking about PLG as this new innovative thing. When I first came to LogRocket, we were doing PLG, and it was: get people into the trial, drive usage, and we had our kind of aha moments laid out. But there was so much content out there from thought leaders talking about this new magic thing, and this was only six years ago. I joined the team here and I was just thinking, yeah, I've been at three companies already that have done this. This has been going on for a long time. It just didn't have a name before.
It's always great to see these other spots where this was coming through. But you [00:05:00] didn't come at it like I did. I started in marketing and had to work my way through the quantitative side, deal more with technical people, and finally get into driving PLG. You came at it from quite a different way. You started out as an engineer, and you were just curious. It sounds like you just wanted to make things better, and you morphed that into PLG. How did that come about? Andy: Yeah, definitely. Earlier in my career, before I joined IBM, I was doing some web development, and I was responsible for the website of a software company. As a developer, I was curious. I wanted to see how I could improve the performance of the website. So I just started doing a lot of the things that people do in performance digital marketing: optimizing the landing pages, running different tests. And I really enjoyed it. It turns out if you're a good developer, you're probably good with data, and you can make some of the changes in the website itself to start driving some of those metrics. So I really had the right combination of skills: [00:06:00] the development, the data, and just the curiosity to improve it. That was an earlier experience, and once I really started to enjoy pulling the levers and improving the performance of the website, I wanted to go further. I was like, this is fun. Jeff: Keep pulling at that thread, right? Andy: Exactly. I just like to pull the levers. So I was like, the next place I should probably focus on is the product. And so I just started suggesting some changes with the engineering team and seeing what the impact was. And I really liked that. That got me a lot deeper into what I thought at the time were more interesting things. And then somewhere around there, I just found out that was actually a job: product management. Well, I'll do that.
And I haven't really looked back since. Fast forwarding to that experience at IBM: I joined in a product management role and had more of the traditional product management experiences, launching some new first-of-a-kind AI products. When we were launching that first growth team, I really had that unique background: a [00:07:00] little bit of entrepreneurial experience, some of the data and experimentation techniques and tactics, and also the product experience. So when we put all that together, it was really just a great time to launch that team, and within the team I was in, I definitely had the right set of experiences to be able to do that. Jeff: It's always interesting to hear everyone's path here, because I took the counter one. I started in marketing and realized I liked fiddling around and changing the website to get people to do things. It's a much more dangerous path because I didn't have the technical foundation. I just lucked through it before we had ChatGPT, and I'll leave the story out, but that's how I accidentally crashed Dynatrace's website on a Friday afternoon when the entire engineering team was in Linz, Austria, done for the day. So I think, as of a couple of years ago at least, they still had the rule of no deployments on Friday afternoon in the U.S. So, you came in, you helped grow the beginnings of growth and PLG at IBM for Watson. But then a [00:08:00] funny thing happened. It sounds like your team's success permeated, and suddenly you found yourselves not just doing growth experiments, but training others within the company as well. What did that end up looking like, and how did that go? Andy: Yeah, that was a really fun experience. At the time, our team was the first growth team for Watson, but there were also other pockets of people starting to do some of this work as well. So we weren't the only ones after some period of time.
But the experience you're referring to was this: I had this awesome opportunity to basically work with a number of different teams to essentially teach them how to do this kind of growth work. Fast forwarding to the end of the story, our team had launched about 20 different growth teams across all different areas of the business: different product lines, different geographies. We worked with some product teams; we also worked with customer success teams. So it was just an amazing set of experiences. And [00:09:00] the way that it worked is we just had this small team of people like me doing all the growth work, and they just wanted to teach others how to do it. We built a very simple framework. I think today those frameworks have advanced further and people have more sophisticated methodologies, perhaps, but our process was really simple. I could still apply a lot of it today, with some adjustments for the new ways of thinking, but a lot of it still applies. So our process was really pretty simple. We would go meet with a team. They would say, hey, we're interested in working like this. They would raise their hand; that was really important. They said, we want to do this. They had some buy-in, and we made sure that they had some prerequisites in place: having the data, some understanding of the customers. Then we had two modes of operation: a kickoff, really simple, and then how you operationalize it. The kickoff was just a couple of steps. It [00:10:00] was: select a process or a metric that you want to improve. Then we would help that team walk through it, demo it as a user, and document it: what does that really look like? They would demo it and document it visually.
We had these massive virtual whiteboards with all the steps of the process. We'd then get the data for each step, look at where the opportunity is, and prioritize where we thought we could make an impact, based on high impact, easy to execute. And then we would launch our first batch of experiments. That was the kickoff. Then, operationalizing it really just started to feel a lot like agile. You'd have your backlog, you'd be running different experiments, keeping track of what's live. And then once those things are done, you're just capturing and sharing the learning. That was the very simple process, and that's what we used to launch those 20 different teams across all different parts of IBM. Jeff: I think one thing I love about this is, yeah, like you said, maybe there's a little bit of complexity you can add or some upleveling that [00:11:00] has come about from back then to now. But in reality, if we looked at most of that and stripped it away, does it really add that much? Are you gaining orders of magnitude of value, or are you just adding complexity for complexity's sake? I think we've found, at least here, that when you strip a process like this down to its fundamental bits, that's when you really find the big levers. I think I heard a story about a website that wasn't converting when you guys were running this, and at heart, it sounds like the core finding was that developers want to be marketed to in a different way than maybe product people or marketers do. But it drove a lot of follow-on. Andy: Yeah. By the way, getting started on a website is always a great place to start with some of these techniques, because the feedback cycles and the loops are very fast.
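The "high impact, easy to execute" prioritization step Andy describes can be sketched as a simple scoring pass. This is a minimal illustration, not Andy's actual tooling; the experiment names and scores below are hypothetical.

```python
# Hypothetical backlog of growth experiments, each scored 1-10 on
# expected impact and ease of execution (higher is better for both).
experiments = [
    {"name": "move demo above the fold", "impact": 8, "ease": 7},
    {"name": "surface docs pre-signup",  "impact": 9, "ease": 4},
    {"name": "shorten signup form",      "impact": 6, "ease": 9},
]

# Rank by a simple impact-times-ease score and run the top of the
# list first, so early wins come from cheap, high-leverage changes.
ranked = sorted(experiments, key=lambda e: e["impact"] * e["ease"], reverse=True)

for e in ranked:
    print(f'{e["name"]}: {e["impact"] * e["ease"]}')
```

Teams often use a variant of this (ICE scoring adds a confidence factor), but the point Andy makes holds either way: the ranking rule can stay crude, because the value is in running the loop repeatedly, not in the scoring formula.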
We did a lot of work on products, and I love working on products, but whether you're in marketing or in product, you can apply these techniques in both of those areas and domains. But the [00:12:00] project that you're referring to was a developer website for what were, at the time, the Watson APIs. We really thought that we could get one of these websites to convert at a higher rate. And so we said, hey, what could we do here, working with the marketing team to drive performance? And you're absolutely right. The first thing that we did together, when we looked at it, was say: developers don't want to be sold to. Developers want to use, and they want to buy. I think that was probably the cornerstone of the work that we were going to do. So we said, how do we make this website work for a developer? Teach them how to use it, help them buy, but don't quote-unquote sell to them. In the end, there were a couple of key learnings, which I'll come back to in a second. But what we did very practically, with that mantra of orienting this toward a developer, was just start that process. We looked at all the different pages, the journeys that people were going through, and then we just started looking at the data and the [00:13:00] analytics. What are people clicking on? What are they reading? What are they engaging with? And ultimately, we got rid of all the stuff that developers were ignoring. We really just focused on the most engaging parts of the content. It's obvious in retrospect, but there were probably three real keys to success in this whole project. The first is we got the demo right up front. Get the developer in, help them understand what the product does. That's super important. The second thing, which maybe people don't immediately jump to, was documentation. What we thought was that developers would come into this website.
They'd be looking at the demo, they'd be reading the content, they'd go sign up for a trial and get started, and then they would use the documentation. But what we saw was that, like I said, developers want to learn. They want to use it. They want to get hands-on. The developers were looking at the docs, looking at the actual function calls, the inputs and outputs, and they were using that pre-sale to [00:14:00] learn. They were also using it post-signup, but the documentation was really a cornerstone of pre- and post-signup support for the developer. That was important. And the last one is pretty simple; it's the kind of thing everybody would expect. It's just elevating the signup: make it easy, no matter where you are in this journey, to get that API key and get started. And on that project, I remember in the first couple of weeks we increased the signup rate on the homepage by 50 percent. Then, as we went across that whole website journey, and it wasn't a lot of time, just because of the amount of volume and the cycle time, as I recall we collectively about doubled the signup rate on that specific website, with just that very simple process. And those three takeaways were the key learnings in that project. Jeff: I think it's something overlooked so often. People want to sell based on value, or sell based on benefit, which I think is not a [00:15:00] bad thing necessarily; it depends on your audience. But: what does it do? How do I use it? Where do I sign up? How do I get started? You want to make it super easy. And then the docs are part of the product. I think you mentioned that earlier when we were talking: the docs are part of the product experience. Andy: Yeah. Jeff: If you can answer those three things really quickly: what does it do, how do I use it, how do I get started?
Value is also important there, but those are the questions people want answered. And that applies to developers, but I've got to say, as a tried and true marketer for 20 years, I can't tell you how many websites I land on that tell me, we transform how you operate, we change the world and your ability to do that. And I find myself flipping through, going to the features pages or going to the docs. I just want to know: what do you actually do? What am I functionally going to do differently once I start using this tool? Andy: Yeah, absolutely. And I like what you said, too, about the audience or the persona that you're communicating with. As I reflect back on that particular project, that was really key too. [00:16:00] We had a part of the experience, in this case the website, that was trying to do all things for all people, from the C-suite down to the developer. But as we really focused this one on the developer, a lot of the stuff that was on there just started to fall away, because it wasn't what that developer was interested in. Jeff: No, I love the framework and just the simplicity of it: what do you want to change? Look at the experience, and now let's start to drive it. How do we effect that change? I think it's a framework more people could learn, and honestly, there are probably more complex ones out there, but sometimes simple is straightforward and best. What is it? Slow is smooth, smooth is fast. Sometimes it's easy is smart. Something there about ease and not making it overly complex. Andy: Yeah. Jeff: You took this experience and these wins, and many people would have been happy to have a great career at IBM, but instead you did the thing that I feel like so many people talk about doing but don't actually do. You went and wrote a book [00:17:00] about it. You could say the knowledge here could spawn a book, but you actually went and did it.
And I read through a good portion of it. It's 150-plus pages. This is a proper book; this is not some ebook. You published The Enterprise Growth Playbook, and that got you speaking over at, what is it, the Growth Hackers conference. And that kind of opened up a whole new set of doors for you, huh? Andy: Yeah, I appreciate that. It had always been one of my goals to write a book, and I just really liked teaching people what we were doing when we were working as part of that growth team within Watson, but even more broadly. I just wanted to teach other people how to do it, because I enjoyed it and it's a lot of fun, and we learned a lot of great things when we worked with so many of those teams across the business. And so I wrote a book, and if you were to read it, a lot of what we're talking about with our process is in there; it's just very practical. If you're trying to launch a growth team and you don't exactly know where to start, it covers that kind of process: how do you get started? How do you structure the team? [00:18:00] I think one of the other things that was really important that we learned in this experience at IBM was also creating the right culture. You can follow a process and you can do the work, but one of the things that was also really important was creating a culture where people could experiment. More importantly, if you're experimenting, some things are going to work and some things are going to fail. And if you're really using data to drive growth, the positives and the negatives are both data, and you want to learn from both. So the book talks a little bit about how you create that culture, which is really important, especially at the leadership level. And then I also talk a little bit about how to operationalize some of this more holistically, with some of the tools and the data stacks, things of that nature. So it's just a very pragmatic book on how to really get started. And you're right, this is how I found my way to Appfire.
I could share a little of that story if you'd like, Jeff. Jeff: Yeah, definitely. First, I just want to flag, honestly, for anyone who is looking to either (a) get into growth, or (b) you think you know it but you [00:19:00] want to learn more: I would heavily advocate checking out this book. It's a wealth of resource. And it's at, is it andyfboyd.com? Andy: Correct. That's just my personal blog. Jeff: Yeah. It's online, and it's there, and I cannot recommend it highly enough. I started to skim it just to prep for this talk with you and found myself reading, I think, almost all of it at this point, because there are just really cool lessons in it. Andy: Appreciate that. Jeff: One more thing in there before we go on to the next topic, to make sure you go check this out, because if you want a career in growth, or you just want to get better, it's really good practical advice. One thing I liked, and I can't remember if I took this line from it, so if I'm plagiarizing you, please let me know, or if I just synthesized it: one takeaway was that you can teach the framework, you can learn it from the book, but there is no silver bullet, right? So often, I see people who want to ask, what did you do? Or what did this company do? Or they look at the fastest-growing startups. We had Jeff Charles, who's the head of product at Ramp, speak at one of our events recently, and people wanted to ask, what did you do to grow Ramp so ridiculously fast? And the thing is, what they did probably has [00:20:00] nothing to do with what your company would do; you can't copy it. There's a world of companies that could not use the developer framework that we just talked about on their website, but the process you talked about is dead on. And I think it's so awesome seeing it spelled out here, where you go into examples, but the examples are about how you think through it.
The end result is, you give the example of what you actually did, but that's not the thesis. The thesis is how you think about getting to that example. Andy: Yeah, absolutely. Jeff: It's an important piece. Andy: Absolutely. That was really one of the key foundations of this work. I remember when we started sharing some of the work we were doing with other teams within IBM, just like you said, people would come and ask us, what were you doing? And they were thinking that we would give them a silver bullet. The classical example is in the early days of Dropbox, when they would give you more storage space for sharing Dropbox with other people. I think that's awesome. I love that example, but you don't always find those things, and when people came to talk to us about what we were doing, [00:21:00] they thought we were going to get all these really interesting silver bullets. A lot of times what I would say is, if I showed you all the things that we did, you would be rather uninspired. Even in the website example, these are just good practices, right? So when you look at a lot of the tactics, they can be rather uninspiring. But what I would tell them is that the thing that's really the magic is the process. And the reason the process is so valuable is because of this wonderful thing called compounding growth. One of the fundamental premises of the work that we did, and also what you would find in that book, is that notion of compounding growth. It's the idea that every day you're just trying to get a little bit better. Just a 1 percent improvement, 1 percent improvement. And when you stretch that out over time, that compounding 1 percent can get really big. That's the real power of it all. And so then the corollary is, how do you really take advantage of that?
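The arithmetic behind "just 1 percent better" is worth making concrete: small per-cycle gains multiply rather than add, so the number of experiment loops you can fit in dominates the outcome. A quick sketch (the gain and cycle counts are illustrative, not figures from the episode):

```python
# Compounding growth: a fixed fractional gain applied over repeated
# experiment cycles multiplies, it does not add.

def compounded(gain_per_cycle: float, cycles: int) -> float:
    """Total multiplier after `cycles` iterations of a fixed fractional gain."""
    return (1 + gain_per_cycle) ** cycles

# A 1% win every day for a year is not +365%; it is roughly a 37x multiplier.
print(round(compounded(0.01, 365), 1))   # ~37.8

# Cycle time is the lever Andy points to next: doubling the number of
# loops you can run squares the overall multiplier.
print(round(compounded(0.01, 52), 2))    # 52 loops in a year: ~1.68x
print(round(compounded(0.01, 104), 2))   # 104 loops in a year: ~2.81x
```

This is why the answer to "how do you take advantage of compounding" is operational rather than tactical: halving the cycle time doesn't add a second helping of gains, it squares the curve.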
I think that's really where the [00:22:00] operationalizing and the process come into place. Because what you have to do is shorten the cycle time. Assuming you have enough volume to run these processes with your products or whatever, you really have to try to optimize that cycle time. The faster you can do more loops, the more learning you can get, the more gains you can get, which just accelerates that exponential growth curve. That, to me, is really a lot of the magic behind all this: optimizing that cycle time is how you start to accelerate that compounding growth. Jeff: And that has to go a little bit back to, there was a section there called data for decision making, not perfect data, and people make decisions, not data, which I love, because I've mentored a lot of junior marketers and growth people who talk about 95% significance, or having to run a test for three months before you get results. I'll admit this is a portion I skimmed a little more, because I already had a pretty deeply held opinion on the subject, but even just the title hit home. Is that what that's talking about: [00:23:00] how do you make decisions, how do you get enough data to make a decision, but not so much that you're just paralyzed waiting? Andy: Yeah, I think that's really important, and it definitely ties into that cycle time. The whole idea behind that statement, data for decision making, not perfect data, is just that people make decisions, not data. There are going to be a lot of cases where you have all the tools and all the data and you can do the stat significance, 95 percent, whatever. Great. But there are also going to be a lot of scenarios where you're just not going to be able to do that. And it could be for a couple of reasons. It could be that you can't measure the thing that you're trying to measure.
It could be that the experiment you're running, or the thing that you're trying to test, just takes a long time to gather the data to really make a definitive call. It could also sometimes be that whatever change you made isn't significant enough to show one way or the other. One of the things that I've tried to avoid, back then and even still to this day, is the [00:24:00] trap of delaying decisions just because you want better data. I think that just slows you down. What you have to realize is that people make decisions, not data. You've got to use the best data that you can, and you've got to make the best decision you can at the time. And that should accelerate speed in certain cases. Jeff: If you can make 10 decisions and seven of them are right, as long as not one of them is a terrible, horrible, really badly impactful one, you're probably better off than waiting to get the perfect answer for one. What's the quote from back in the day? You go to war with the army you've got, not the army you want. I feel that way with data at times: you've got to go with the data you've got. And that's why we hire people who have done some of this before and have thought it through. The likelihood of being both catastrophic and wrong is probably lower than the catastrophic outcomes that can come from just being stagnant and not moving at all. Andy: Great point. There are certain decisions where the impact is high and the risk is high. [00:25:00] Do all the diligence there. But there's also a whole lot of questions where, if you make the decision, it's probably not going to be catastrophic. And so make the best decision that you can. The way that I really came to learn this was at an earlier point in my career: I had a manager and a mentor who had this awesome quote.
He used to basically ask the team, when we'd gather around to make a decision on something: will more data change the decision? Or is it just going to make you feel better? And if the answer is... Jeff: I love that. Andy: ...it's just going to make you feel better, then just make the decision and go. I thought that was an awesome lesson that I still carry to this day. And I just think that's the trap in the day and age that we're in: we have so much data and so many tools to really know with science. But I think sometimes we can fall into the trap of leaning on that too much, and what we really have to realize is that, at the end of the day, we are making the decisions, not the data. And we can be more pragmatic about when we use the data and when [00:26:00] not. Jeff: So it sounds like the high level here is: have the right relationship with data for decision making, look at compounding small wins over time, or probably better said, there are no silver bullets. Understand your audience and what you're trying to do, take the small wins, act quickly, and have the right relationship with data to fuel the right decisions, but don't get too hung up on perfect data. And you've got a fast process for growth. Andy: Great formula. I like it. Jeff: Yeah, someone really smart came up with that one, man. The fun thing is it didn't just end here, right? You did this, and I would wager that crystallizing these thoughts on velocity and growth is what ultimately led to you being introduced to Appfire. And it seems like it was probably this framework, as well as the thought of taking the thinking of how you grew IBM Watson and its APIs and applying it to Appfire and the just innumerable portfolio of applications you guys run. Again, there are going to be no silver bullets that really port between the two.
But the framework works, [00:27:00] and you built a process for velocity to work within. You entered a company that grew something like 20x in the past four years, so it's a good thing you were already set for a bit of speed, because this is ludicrous speed. Andy: I think you're referring to one of our more recent press releases, which is just awesome growth. It's been an amazing journey at Appfire, and you're absolutely right: when I joined, all those experiences at IBM were part of my journey here. It's how Appfire sought me out, and I definitely brought some of those growth team mindsets and principles to the role. It's obviously grown much beyond that; we've got a product organization, growth team, design, et cetera. But some of those experiences at IBM were definitely foundational to me being a part of this great team we have here. I laugh because when I first started, we had a very tiny little growth team, like one person. And I'm very fortunate that we've been able to grow that team. We have an [00:28:00] awesome set of leaders driving it. They've taken a lot of my ideas, they've taken their own ideas, and they've accelerated that growth team and a lot of those growth mindsets far beyond what I ever did. We're just really fortunate to have them, and I'm just thankful that we get to do some of that work here. The other interesting thing that you touched on was the notion of the scale of the portfolio. That's a really interesting part of what we do at Appfire: having this broad portfolio. There are definitely differences from a traditional SaaS company with one or two products, a lot of things you might do differently when you have a really broad portfolio, but there are also a lot of really interesting, unique opportunities it creates as well.
Jeff: There's all sorts of stuff we can get into here, but I guess, first of all, how do you go about it? There are so many degrees of chaos potentially here between the hundred-plus applications you guys manage and the growth — [00:29:00] about 20X over the past four years, I think, based on the press release. Between all that, there are just exponential areas for things to get out of hand. So how do you look at that? How do you make sure you have the team set up for it? How do you get a handle on all the things going on? What does that operating cadence look like? Andy: Oh man, that's. Jeff: Do we need a whole other episode for this? Andy: Yeah, we could spend a lot of time on that. So how do we operate this? I think some of the principles that we operate by are: we are totally a people-first company, 100%. So it's having the right structure in place, and to a large degree some of the big teams that we have are managed by just great leaders — product leaders, growth team leaders. We have peers over on our engineering side that are leading those different areas of excellence. And it's not just the function of building a product or building a growth team; we have shared services around analytics and platform that can serve all teams. So there's definitely a [00:30:00] people and organizational part that maps to what we're trying to do, but then also forms a foundational layer. People is a big one. I think we've also put a lot of structure in place for how we set goals and how we then regularly report and track on them with an operating cadence and model and tools. That's another part. When I was in grad school, we talked about things like management systems, and I thought that was all just a waste of time, but in a broad portfolio like ours, building the right system of processes and some of the tools to support it is critically important.
Then I think another pillar is having the right systems to be able to have insight into the work that we're doing: how are we achieving those goals? And it's not even just how we're executing our goals, but having some of the metrics and systems to be able to understand what people are working on, and as your strategy evolves, how you start making shifts. So if I were to [00:31:00] sum it all up, it's the people — the right leaders with the right structure; it's that operating model of what the meetings and the cadences are and how you're driving goals, et cetera; and then it's the systems. And all of that has to really work well together. We could spend hours talking about it. It's super interesting, it's critical to how we deliver great products, and I personally probably underappreciated it when I was earlier in my career, but now it's so critical to what we're doing here at AppFire. Jeff: Yeah. Just thinking about all the things we do here — and we're growing fairly quickly; I'm extremely proud of the rate at which we've been growing and adding customers — but thinking about just the speed in iteration, how can you possibly even spend the time you need on gathering customer feedback? As an example, and I know we talked earlier, there are some interesting processes, and you've had to get really tight about how you do something as simple as gather customer feedback to apply [00:32:00] it to one application you're working on or another product you might be working on. What does that look like? How are you guys operating in that world of thousands of customers, quick iteration, lots of products? How do you do customer feedback? How does that work for you? Andy: Yeah. So I think you're touching on parts of the conversation that we had around scale and hypergrowth. And I think that's been one of the really interesting things about AppFire: because we have this
breadth of our portfolio, you just have to do and think differently. And I think what we've landed on, if I can compare and contrast more traditional experiences with, say, AppFire: in a lot of my traditional experiences, when you did things like customer and product research, the process was you identify something you're trying to study. You then go out and recruit different people for interviews, you then do the interviews, you then synthesize it, you then produce the report, you then share it out, then you feed it into your backlog. And that seems like a very simple process, but between recruiting the people, doing the interviewing, and [00:33:00] synthesizing, it's a long lead time to do all Jeff: I'm very familiar with it. Andy: that. Yes. And so where we've started to come at AppFire — and I will use this in any role I work in now — is thinking about the different types of insight that you want, and then the right mix of tools. So ultimately, the way I see it, when you're gathering this kind of input, you're trying to make a decision, for sure, some sort of strategic direction. But coming back to our point about data, with the data that you're using, you're trading off between the speed at which you can get it, the volume of the data that you're going to have, which is maybe a little bit about precision, and then also the quality. So you're trading off against those things. And now we have a larger set of tools in our toolkit. If we're doing something really strategic and high impact, we might have something that follows that longer arc, where we don't have to make a decision as quickly, so [00:34:00] we can do that longer lead time project. We might then also have some things where the decision is not as high risk, so we have ways that we can condense, recruit faster, and ask more targeted questions.
Maybe we don't have all the volume, maybe we don't have all the precision, but we're getting good enough data to make a good decision, and the impact isn't as great, so we can do that on a tighter timeline. And then, at the lowest level of granularity, we've started embracing a lot of different tools — things that you'd be very familiar with: product analytics, A/B testing, et cetera — and we run things quicker without having to have all the input up front, and then we're learning based on the behaviors. That way we've really built a portfolio of tools that we can use, and we can choose between them based on the strategic importance of the decision, the risk, and the speed at which we want to make the decision. And so that portfolio of tools has helped us operate at a larger scale to be able to answer some of those [00:35:00] questions. Jeff: And especially with the advent of some of the AI being applied to the faster, less high-touch tooling there, you can get even more insight and start to condense the speed at which you can operate on the fast end — just get more out of it and make more certain decisions. But to your point, I feel like you've got a second book in you, Andy. We could call it No Silver Bullets; it's all about creating great product experiences and stuff like that. I think we've pulled a bunch of stuff out already; we basically got an outline for you in this podcast. Andy: All right. Sounds good. I'm here for it. Jeff: Looking through all this, I think one last thing I'd love to talk about before we wrap up here is that you made a move whose risk is probably one of the least understood pieces in careers like you and I have. You went from IBM, one of the largest companies in the world, to Appfire, which is incredibly high growth but, in terms of scale, several steps [00:36:00] smaller.
And I've just seen that move fail time and time again. Of course, some people succeed — you're a shining example of that, and it's gone great — but how do you de-risk something like that, if someone wants to go from being a high-level contributor there to a more executive role at a smaller company? How did you de-risk that? Andy: Yeah, that's a great question, and it's one that I talk about a lot when I have the opportunity to mentor other people. Look, I'm very fortunate with finding this role at Appfire. I often say I feel like I won the lottery when I joined here, and I really believe that. And so the way that I made the decision, and the advice that I give different people as they're considering different career moves, is that I've always prioritized just two things in my career. The first is learning. I've always wanted to focus on new and interesting things — just always learning. In fact, I remember vividly where I learned this lesson. At the time I was in grad school, and I was driving back home from a class. [00:37:00] And I remember it struck me, like just a thump in the chest: I need to be working for the next 30 to 40 years, whatever that number was. I was watching some of my colleagues in school, and they were chasing different titles and chasing different salaries. And I was like, you know what I'm going to do? I'm going to prioritize learning. I'm going to focus on doing interesting things. And as I build a collection of awesome experiences, at some point I can switch and focus on title and compensation. But right now I'm going to focus on interesting and new and learning. And the funny thing is, I've never flipped the switch back. I've just always looked for things that are really interesting. Appfire is an awesome business. It's super interesting. The portfolio is a kind of experience and product that most people don't get.
The second thing that I've always prioritized is people. As I think about my transition from IBM to Appfire: when people are making these moves to different companies, there's all this advice out there about asking different things about the financials, the direction of the company, the investors if it's [00:38:00] a company that has investors — all of these things that you should be asking. And I did ask some of those things, but I actually didn't ask a whole lot of them. Fortunately it hasn't mattered. AppFire continues to be amazing. Jeff: You've done okay with the decision. Andy: I feel very fortunate. But I prioritized people. And I literally remember, as I was considering making this move to AppFire, I had the opportunity to meet with some of the people I'd be working with. And I was thinking, these are the people that I just want to wake up every day and work with. That was literally the decision, and day after day, that continues to be the case. I prioritize working with great people and a great culture, and that has continued to pay dividends. And I certainly know people that have made decisions to go to different companies and it hasn't worked out for them. But for me, Appfire has been an amazing company. The business continues to be super exciting, very interesting. The people and the culture were ultimately why I decided to join, and it continues to pay dividends every [00:39:00] day. Jeff: It's funny you mention that, because I have a corollary to that, which is when we're hiring, one of the rules I've built over time is that if anyone has a real significant red flag on culture, we take that incredibly seriously. We over index on a single person having a really negative signal — and I'm not talking about something like they went to the wrong school.
If we take someone out to dinner and someone notices that the person is treating the staff of the restaurant badly, or they notice any little thing like that, we really over index on that piece, because I think that's really indicative of, like you said, wanting to work with great people. And the contra to that is really painful. I still remember when I interviewed here at LogRocket, we used to always have people eat lunch with us and really come in and interact with the whole team. If you were interviewing for anything from SDR to engineer, you came and you had lunch with the whole company. You sat down — we had a big table. And it was great when I came and did it. I remember the range of conversations: some of the engineers were talking about really esoteric technical topics and then switching to books they'd read that were [00:40:00] completely fiction, or classics. I just had a blast and left the room going, God, even if this thing goes down in flames, these are going to be great people to know, and I'm going to have a blast doing it for whatever time we have. And Andy: Yeah. Jeff: it's worked out really well. I've looked at it that way for a while: worst case scenario, hopefully I still land on my feet, but it's going to be fun. I'm going to learn a lot, and it's going to be a fun experience either way. Andy, it was a blast having you on, man. I don't want to take you all day — I think we have two more podcasts probably in us, so you might have to come back — but I know we're running out of time today. If people want to reach out, want to learn from you, have questions, or even just want to say, dude, that was awesome, what's the best way to reach you? I want to drop again AndyFBoyd.com — The Enterprise Growth Playbook. Amazing. Go check it out. You just have to give an email, and you get it.
And it's fantastic. Great investment. But anywhere else? Are you on LinkedIn, or Andy: The perfect thing is LinkedIn. Whenever I get this question, I say, connect with me on LinkedIn. I [00:41:00] spend a lot of time on LinkedIn, just connecting with people and also reading what's going on in the industry. And also AppFire — we're an interesting business. We're in the business of equipping and connecting teams to plan and deliver their best work, so you can connect with us there as well. Jeff: Awesome. Love to hear it. Again, thank you so much for coming on. I had a blast; I hope you did too. Hopefully we can have you on again sometime soon, because this was great. Andy: I really enjoyed it. Thanks for having me on. Look forward to part two and three — connect again soon. Thanks, Jeff. Jeff: Thank you.