Eric Metelka === Eric: [00:00:00] The engineering director goes, actually, I think none of this data is correct. And my jaw hits the floor. Jeff: What do you do there? How do you recover? Eric: You don't recover. Jeff: Welcome to LaunchPod, the show from LogRocket where we sit down with top product and digital leaders. Today we're talking with Eric Metelka, head of product at Eppo, a next-gen A/B testing platform. In this episode, Eric talks about the process he used to cold-start G2's review product and how he turned it into a self-sustaining flywheel, his secrets to running great A/B tests that really showcase the impact you're having on product, and how he recovered from the director of engineering at Cameo calling him out in front of the entire engineering team, saying that all his data was wrong. So here's our conversation with Eric. What's up, Eric? Welcome to the show. Thanks for joining us today. Eric: Yeah, thanks for having me, Jeff. Jeff: This should be a good one. Now, you have a cool history, because you popped through a whole bunch of brand names I think people know: Cameo, SpotHero, a company called PowerReviews. Let's just jump right into it, man. I'm super curious, because as a marketer this is something we deal with a [00:01:00] lot. You were involved very early on in the cold start over at G2. You were an early employee, and you literally helped them get over the cold start problem of, how do you get reviews? Tell us about that. What was that problem like, and how'd you tackle it? Eric: Yeah. I had met the founders, who had impressive backgrounds. This was actually their second company; they had just sold a company that did quoting software, automating quotes, CPQ. So this was a second swing for a lot of them, which was really exciting. And something they learned from being in that enterprise sales software space is that you have to deal with analysts.
You have to deal with Gartner and Forrester, but that's a very relationship-driven process, and you have to invest in that relationship. And they thought there should be a better way that isn't just reliant on you having good relationships with a few point people. With [00:02:00] software, with the internet, you have the crowd, and you can have the voice of the people who are using the software, the practitioners. That's really what the whole idea behind G2 was about. It started out being called G2 Crowd, and really it was about democratizing the voice around software. But the thing of it was that we wanted to get people to come and leave their opinions, leave their reviews of the software they were using. And we were nobody. We were this little office in Highland Park, Illinois. So we had to figure this thing out. In fact, I came in to do digital marketing for them, and I was doing the basics of SEO, but it didn't really matter, because without site content and reviews coming in, and we were maybe getting like one or two a day, there was no real growth to be had. Pumping out SEO content [00:03:00] wasn't really going to make a change. They tasked pretty much everybody there who was a non-engineer with figuring out: how do we get reviews? In hindsight it's very obvious: hey, if we can figure out a way to get a supply of reviews, that makes you do better in SEO ranking. So when someone searches for Salesforce alternatives, or Salesforce versus some other CRM, you pop up highly, and that allows you to command attention with those companies, and then drive their attention to G2, and even their users to G2 as well. So that was basically the problem. The thing that worked pretty well in the beginning was that they had successfully sold a startup, and they were mainly sales guys.
They had large networks, and so they knew people who use CRMs like Salesforce, and they [00:04:00] could get that category pretty populated just by reaching out to people on LinkedIn that they knew and being like, hey, would you do this? Which speaks to something about having founder fit, right? A founder network that allowed them to get a little bit of early traction. But then, okay, you have this chasm that you need to cross, which is: how do I get the next category of interest? And our next category, I believe, was actually marketing automation. Pretty adjacent to CRM, a new category to fill. And so that was a matter of, what is the value prop for people to come and do this? The interesting learning there is that you can think about the different things you could do here. Your voice will be heard: nobody really cares, it doesn't really catch. Other people will learn from your experience: doesn't really do anything. You can have the do-good and charity angle, which works a little bit, but not great. Really, at the end of [00:05:00] the day, it was cold, hard cash. It was saying, we will send you a gift card, I think it was $20 at the time, for leaving a review. We had no requirements on one star or five stars; we didn't care. If you came in and answered a requisite number of questions, you were going to earn that gift card. And so that's how it worked. But a large part of figuring it out was, how do we source these people? And that was a lot of pounding my head on the table until we figured out that you could search LinkedIn for the software that people use, because they list it there; they say, I am an expert in Salesforce, as one of their skills. You do that, and then I would reach out to them on LinkedIn. And that was an inefficient process, because it's all controlled by LinkedIn.
And so the hack there was, I figured out how to guess people's email addresses from the company that they worked at and their name. Most people follow a pretty normal naming [00:06:00] scheme: firstname.lastname@company.com. I built this little thing in a Google Sheet that automated these different permutations, figured out which ones were valid, and then automated sending out emails to these people asking them to do this and giving them that value proposition, the gift card, in return. And so it became this thing where we went from, like I said, one to two reviews a day, to 10 reviews a day, to a hundred reviews a day. And that's what really led to a good amount of growth, which led G2 to raise their Series A. It honestly was doing things that don't scale and hitting your head until you figure it out. Jeff: There were very down days while doing that, I'm sure, trying to be like, is this ever going to work? But eventually you can brute-force the problem and then automate the problem. And that was a really interesting, fun start. I think in retrospect it's obvious, right? Let's pay people money to have them go give reviews. Did you try other things, were there other attempts, or was it pretty obvious right from the get-go that, let's just [00:07:00] offer cash and take the swing? Eric: Yeah, we brainstormed a couple of those ideas, right? The voice-being-heard idea, is it charity, is it the gift card they want, what's the gift card? There was a permutation we did at one point, which was more of a sweepstakes, where based on the amount of content you left on the site, you could earn an iPad. And we found that just led to very low-quality reviews. People would review things like Gmail, which weren't really additive to what we were trying to do. So we did try a number of different things.
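The email-guessing workflow Eric describes can be sketched roughly as follows. This is a hypothetical reconstruction in Python rather than his actual Google Sheet; the permutation patterns are the common corporate naming conventions he alludes to, and the function name is illustrative.

```python
def email_permutations(first: str, last: str, domain: str) -> list[str]:
    """Generate common corporate email-address guesses for a person.

    A rough sketch of the kind of guessing Eric describes; the real
    workflow lived in a Google Sheet, then verified which guesses were
    valid before any outreach email was sent.
    """
    first, last = first.lower(), last.lower()
    patterns = [
        f"{first}.{last}",   # jane.doe
        f"{first}{last}",    # janedoe
        f"{first[0]}{last}", # jdoe
        f"{first}",          # jane
        f"{first}_{last}",   # jane_doe
        f"{last}.{first}",   # doe.jane
    ]
    return [f"{p}@{domain}" for p in patterns]

print(email_permutations("Jane", "Doe", "example.com"))
```

In practice, each guess would then be validated (for example, via an email verification service) before being fed into the outreach automation, so only deliverable addresses got the gift-card pitch.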
But it was clear, and we weren't necessarily keeping a tally on which one was better, but it was clear once we did enough of this at scale, that the message around the gift card was what worked best out of any of those options. Jeff: You know, I love that you started from: where are the people [00:08:00] that we want to do this, and how can we find them? Finding that raw data source, which turned out to be LinkedIn. This is, I think, 2013 at this point, so we're talking 12 years ago, but already the penetration of LinkedIn was so high that you could reliably look there, and they had this kind of data. But I'm sure it was not easy to pull that out and synthesize it into a usable format. Did you have to do any kind of maneuvers or acrobatics to get it into a usable format, or was it pretty straightforward to get the data at scale? Eric: Yeah, at the start it was just basic copying and pasting from one window to another. This person came up, Jeff Warren is here, I'm copying Jeff Warren, company name, and doing that. And how I learned about product through doing this was: how do we automate it? The next thing I did was hire contractors and say, okay, now I'm doing this, now let's go after a bunch of terms and more people, and have more people do this. And then it became, how do we make that even [00:09:00] more efficient? And I worked with the engineers to, one, automate giving out the gift cards, because there was a whole operation behind just getting people their gift cards and making sure we fulfilled our promise. And then, as you said, the collecting of the data. What we actually built was a private Chrome extension that, when one of us browsed over to LinkedIn, would grab the copy on the page that we could see, look for keywords that were plugged into it, so we were looking for Salesforce or HubSpot, and it grabbed the person's name and put it into a database on the backend.
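The core of the Chrome extension Eric describes, matching the visible page copy against a list of tracked keywords, might look something like this. Both the function and the product list are illustrative assumptions, not the actual G2 tool; the real extension also captured the person's name and wrote matches to a backend database.

```python
import re

# Hypothetical list of products the team was sourcing reviewers for.
TRACKED_PRODUCTS = ["Salesforce", "HubSpot", "Marketo"]

def find_mentions(page_text: str) -> list[str]:
    """Return the tracked products mentioned in a page's visible copy.

    Whole-word, case-insensitive matching, so "hubspot" in a skills
    list counts but a product name embedded inside another word does not.
    """
    return [
        product for product in TRACKED_PRODUCTS
        if re.search(rf"\b{re.escape(product)}\b", page_text, re.IGNORECASE)
    ]

profile = "Jeff Warren - Skills: Salesforce administration, hubspot"
print(find_mentions(profile))
```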
And then now you can see: oh, now I've got that. Now, who do I want to target? I can hook up some mail automation into it. And then, once somebody has left a review, how do we automate the payment via gift card? And so you get this whole engine eventually, [00:10:00] as things start to work. And working with those engineers was the eye-opening part. Oh, hey, I can work with the engineers on the team, we can build something, this is super cool, and this is just fun. These are my people; I enjoy doing this with them. So that was really how it went. Jeff: Yeah, as a marketer, you don't always think about the idea that, hey, I have a problem, let's go solve it with software, bring in some engineers. For a lot of people it doesn't click that that's literally how software solutions are made: someone has a problem, you bring in some engineers, you define the problem, you build requirements, you have people who can build the thing, and the output is a software solution. Maybe this is a hot take, maybe not, but a lot of marketers are a lot closer to product managers than I think they realize, or at least a lot of good marketers are. I'm curious, as you're going through this, and I love the sequence: who are the people? Where are they? What info do we need? How can we get it? Okay, now how do we streamline that? How do we incentivize them to do the thing we [00:11:00] want them to do, which in this case is leave reviews, because without reviews G2 has no value prop. And then how do you start to streamline it and make it more efficient? While you were looking at this, was there a strategy of, we're going to go category by category? What were the big growth levers, or was it just scattershot? If you're familiar with Linear, a little while back we had Nan Yu, who's their head of product, and he talked about how, when he was at Everlane, the number one job of product was to launch categories.
All that mattered was the next category launch, because that was the inflection point every time for revenue, and without that they were just flat. So that was their main goal. Did you guys have something similar, where it was, launch new categories? How did you view linear progression, or upward progression? Eric: It was a very similar strategy, starting with that CRM category that had the founder fit, and then going category after category. Someone chose marketing automation; that was [00:12:00] before I started there. And then we started looking at, okay, what are the sizes of these categories? How many people can we find? And we made missteps along the way. There were times we were looking at middleware; I remember looking at IBM and Informatica for some reason. I knew nothing about it, and it was really hard to find people who listed that, versus people saying they use HubSpot. But once we refined it, we could figure out, okay, here is a big enough TAM, essentially, and then we could go to those vendors. That was really the power of packaging it up. We could go to a HubSpot, for example, and I don't know if it was exactly them, but we could go to them or Marketo and say: hey, we have these reviews from your competitors. We have only five from you, and 20 for them over there. We're going to publish our report, our grid, which [00:13:00] G2 is known for, basically the same two-by-two that all the analysts use. You don't want to be left out. Come work with us to source your users, let us know where they are, so they can come and leave a review. And that became part of that cycle of identifying people, because we had enough clout, we were showing up in their SEO results, so they couldn't ignore us. And we could point to their competitors at the same time: hey, we're working with this person.
So yeah, that became really part of it as we figured it out. Jeff: It's a classic network effect, right? Get big enough where now it makes sense to add more and more. You take this thing that you were doing somewhat manually, somewhat automatically, but definitely on your own, driving it yourself, and at some point you get big enough that the next level of incentives kicks in. It's not just the individuals making cash to leave a review; the companies are [00:14:00] incentivized to bring you the reviews themselves, and you don't have to worry about it anymore. You can offload the cost of the reviews to the companies, and they'll incent their users to do it. A great flywheel effect, but it all starts from doing that small set of things that's not really scalable, prioritizing, figuring out how to grow to a tipping point. That brought its own myriad of problems, I'm sure, that you had to address, but it's interesting to see how that was thought through: the cold start problem. Eric: Yeah, and at some point it did switch from, we had to generate the reviews and bring the supply, to demand-driven, as you're saying. Really, this was no longer needed at some point after I left G2. So it really completely switched, which is awesome, right? It was exactly as you said: kickstarting the content, overcoming the cold start problem, and then eventually you can turn that off, because you've found this flywheel that works and you no longer need the thing that kickstarted it. Jeff: I feel like it's rare that you see the cold start problem really deep and get to dive into how you [00:15:00] solved this very finite problem. You have Airbnb, who talk about building scripts that literally looked at Craigslist ads and automated the follow-up to those people to convince them to put stuff on Airbnb.
Here it's, you know, look at LinkedIn, who has the data, and first manually, then automatically, reach out to those people and drive them to do it, and then hit the point where it becomes its own marketplace, a free flywheel. Eric: At Cameo, it was finding a few YouTubers that had very engaged audiences. They were some of the people that really created some energy there, and that started to get noticed. Every company has a story, but they're not all talked about. Jeff: Right. Find that thing that is going to be so tasty for the user that they're going to be willing to go and do it, and that you can do in a repeatable manner. Maybe it's not always going to be the end goal, but it gets you there. You said it got G2 to a round; they raised money and continued to grow. It got Cameo to the next level of attention. So the cold start problem, I think, is a really [00:16:00] interesting one. But there's another thing that you've run into in your career here which I think is equally interesting. The cold start problem is not talked about in detail as much, but the lore is always there, right? There's the story of Airbnb, there are the stories of all these companies that went from zero to one, and we know them because they're now giant billion-dollar companies that are pieces of our lives every day. Maybe the thing that gets talked about a little less is companies that need to find a second thing. How do you add the next thing? You have one thing that works really well, you're selling something, or you have a motion that works and drives revenue, and you want to add the next thing; you want to go deeper or wider. And that seems to be the problem you ran into at PowerReviews. So maybe just give a little background on what PowerReviews is. I think it's less known than G2, potentially. Still reviews, though, still reviews.
Eric: Any brand or retailer's website, whether it's a Shopify e-commerce [00:17:00] site or a big retailer like Nordstrom, you see reviews there, right? Amazon really revolutionized this, and if you want to sell directly, you need reviews. They give you SEO content, so there's a nice through line there, and they also give social proof, which increases conversion to purchase. And so instead of handling that lifecycle yourself, you want to go with a vendor who's optimized for getting that content and getting as many reviews on your site as possible. We also handled moderation services as well. So a really interesting company. I joined them in 2015. What had happened was that there were two main vendors in the space in the U.S.: Bazaarvoice and PowerReviews. At some point in the late 2000s, Bazaarvoice acquired PowerReviews. But in the due diligence process, the U.S. government sued, calling it an antitrust [00:18:00] violation, because you had the number one player acquiring the number two player, even in a market like review software. And so for a couple of years, PowerReviews sat frozen. They had the customers that they had, but they weren't able to invest, as they were expecting to get acquired, and eventually the business had to be spun back out. This group in Chicago acquired those assets and the business. And so we came into this with very little knowledge of the codebase. We weren't part of the founding team that had gone through the product-market fit journey, but we had several million in ARR to start, and some favorable terms. That was the situation we found ourselves in. I have this theory that I've learned, which is that if you don't go through the pain of figuring out product-market fit yourself, what works, what the use cases are, it's very hard when you're sitting on, hey, I just [00:19:00] have this recurring revenue coming in.
How do we really figure out what to do next? When you haven't been through lean times and really had to viscerally learn lessons, it becomes really hard to figure out how to innovate and do what's next. Which is maybe why founder-led companies perform better than manager-led companies: they felt that pain and really know those lessons deep in their bones. Jeff: It's almost like the three-generation rule of wealth, right? Generation one creates it; in this case that's the founder's game, product-market fit. Generation two is maybe the first round of early employees who come in and see that early struggle. And then, in your case, you're coming in as generation three, post-spin-out. You have ARR, and now you've got to figure out the next steps, but you're divorced from, as you said, the struggle of the first. So you walked in, you have millions of dollars of ARR, what seems on its face a very favorable setup. And the goal is what, grow ARR? To add a second product? Because I know it ended up being that a [00:20:00] second product was the goal, but was that initially what you walked into and were brought on to do, or was that what you all discerned was the next step? Eric: Yeah, it was grow ARR. It was keep the current customers, keep the lights on. Again, we didn't know this codebase, so a lot of it was actually digging in and keeping the infrastructure running. This was a thing that was run by sharing files between different FTPs, which is crazy to think about. But yeah, it was: how do we make sure we keep this thing on and things don't break, but also grow the business at the same time? And really, we didn't know how to do that. It's one of those things I look back on in hindsight: we didn't really know how to ask the right questions about what the prospects were looking for.
We didn't have real segmentation and understanding of our ICP. At one point we wanted to do something that was more self-service. We had a competitor coming up that was [00:21:00] very Shopify-native and was selling to these e-commerce brands riding this wave of being able to do everything from Stripe and Shopify. And we didn't know how to compete with that. Okay, let's do something down-market. We had ideas and we implemented something, but we didn't talk to companies of this profile and really figure out what their needs are, how those needs differ, what the price point should be. We were just taking what we had and massaging it into something that looked self-service and e-commerce-native. So in hindsight, it was a lot of lessons about the wrong things to do, and about relying too much on assumptions and intuitions instead of getting out the door and really talking to these people, so that we could build something they wanted instead of what we thought they wanted. Jeff: Do you think the problem was [00:22:00] coming in not having built the original? Like you said, part of it is not having gone through the struggle, but it's also that you don't have the context of what was tried. Not even just what got done to get you there, but the thinking behind why those things were done. A lot of institutional knowledge was probably lost in that time. Eric: Oh yeah, there's close to none there. So it was that, and, you know, some divided focus, where we had to spend a good amount of our R&D budget overhauling the existing software and infrastructure. But with that other half of the budget, we didn't do a good enough job.
Like I said, just making sure that we were asking the right questions and running the right discovery processes to know what should come next, what would add value and generate higher ACVs, contract values, as a result. But it was also a really interesting time to be at that table, see what was working and what wasn't, and learn [00:23:00] the business. I'd say one of the biggest things I got to do there as a PM was to say: I'm not just going to be a PM; we have problems all over that we need to figure out. One example is our implementation process. We had issues getting companies implemented in a fast enough timeframe to then recognize the revenue, because you don't recognize the revenue on your books until the company is implemented. So how can we remove friction from that process? And doing it, saying, I will be the implementation engineer for a week or two, or I'll be the implementation engineer for these two companies that are coming in. I will implement them myself, I will learn the pain points, and then I will figure out where we can adjust. When I did those things, those were the best learning experiences, and I understood the customer pain the most. But when it was, what is the next revenue product, I was much worse at that, without [00:24:00] the right person to guide me and teach me how to do it. Jeff: You clearly, successfully accomplished a cold start problem over at G2 and found a way through that. It's amazing how, coming into a different situation like that, you can have people who are clearly talented, know what they're doing, and are accomplished, and you've gone on to really much bigger things since then. But I think you said it best when we talked a little bit earlier: unless you know how to succeed, there's really no path to a second product or a second life here.
And it seems this is one of those cases where just so much institutional knowledge was lost, and the things that got them there just weren't carried through, so there was a huge burden just to understand that piece again. And maybe, hindsight being 20/20, you could go back and say, oh, we should have done this differently, and gotten a different outcome. But in the moment, you come in thinking, we know what we're doing, I've done product before, let's move this forward. And it's, [00:25:00] I think, a much bigger hurdle to get over than people realize, coming into an environment like that. Eric: Yeah. And you need to figure out where you're going to source your learning from. G2 was a new company, but it had founders who knew what they were doing. For me, there was really nobody above me, and this was a similar situation, where there wasn't a seasoned leadership team to learn from. There is absolutely something to being very driven, reading what you can, and even networking with people. But there's also something to having those veterans, those experienced managers in the company, who know what they're doing, who you can learn from, and who make you better. And I think this was also a case of being a couple of years into my product career and not having learned enough, because I was constantly just teaching myself and learning from experience. You [00:26:00] need to learn from various sources, and that really helps you grow. Jeff: The other thing we glossed over real quick, and I just want to make sure I hit on this for five seconds, is one thing that people really underrepresent, or misunderstand, about going into almost any functional leadership role: yes, you have to be really good at the function, but you just brought up rev rec, right? Revenue recognition.
And it's funny how I found that in marketing: once I hit a point of being somewhat senior, as a director, managing budgets and being outcome-focused, and especially now, leading a whole department, understanding balance sheets and accounting is such a huge part of the job that no one prepares you for. I'm glad you brought that up. Understanding rev rec and expense recognition, just a few very baseline pieces, can really accelerate your career if applied at the right time, because you can come in and talk the language of the people who want to invest money, and you can potentially do things more efficiently, because you understand [00:27:00] how the money works, not just how the product works. Eric: Yeah, the CFO can be a real good ally, because they're looking at the impact that you're making directly. So they can be a real good ally for you on that leadership team. This is also why I'm married to an accountant. Jeff: That, and it makes tax season so much easier. Eric: It does. Yes, it does. Jeff: So I know we talked about the mentorship piece there, and maybe that's something I want to come back to; it's important to know. You hit SpotHero as your next jump, which is a great consumer-oriented Chicago company, and I don't think Chicago has enough of those, and you learned a lot there. But what I really want to focus on is something near and dear to my heart. My background is in demand generation, and I ran sales and marketing operations teams for a long time. Now that you're at Eppo, you really live and breathe this, but even going through Cameo, there's this world of: it's really hard to run a great A/B test. How do you run [00:28:00] really accurate, really good tests that have high fidelity? Because too often, I think, you can run something and get an answer; that doesn't necessarily mean it's the right answer.
And if you pursue things where you were not rigorous about how you gathered the data, that's actually potentially more dangerous than just taking a flyer, because you are pursuing something wrong, but confidently; you are overconfident in your potentially unfounded or wrong decisions. But you learned that the hard way and then came full circle, and now you're the man to listen to on running great A/B tests, thanks to your background at Eppo. But maybe we hit on Cameo, because you ran into an interesting story there around the exec team. Eric: A similar situation, where I came into Cameo to experiment and lead growth, but I'd never done growth before. So I was teaching myself in a number of ways here, and there were a lot of things I didn't know that I now know through that [00:29:00] experience. One of them, as you were talking about, Jeff, is how experiments need to be measured in a rigorous way. When you assign traffic to an experiment, you have to make sure it's 50/50 if you have two variants, because if you don't, you throw all the stats off and your results don't work. This is called an SRM error, or sample ratio mismatch. You don't need to know the terms, I certainly didn't at the time, but things have to be balanced; that makes sense. And so I was running a number of experiments. I bought a vendor, brought them in, and we were analyzing the experiments using the vendor's tools. So I'm sharing results: we've had some bad experiments, we've had some good experiments. And I'm in a meeting with leadership, going over one of our latest results, and the engineering director who I'm working with goes, actually, I think none of this data is correct. And my jaw [00:30:00] hits the floor. Jeff: What do you do there? Like, how do you recover? Eric: You don't recover. I'm a cartoon; I'm Scooby-Doo and my jaw hit the floor or something. And I'm just like, what do you mean? What are you talking about?
And then I'm Slacking him on the side, like, why are you doing this? Please stop. Come talk to me before you tell the CEO and the CTO and everybody that matters at this company that all our experiments are wrong. But, you know, the situation was set up for that to happen, and that was the disappointing part. It wasn't through anybody's willful negligence or ill will, mainly through ignorance: we were sending our event data to a vendor that was a black box. And when you start peeling it back, when you don't have visibility into whether these things are set up in the right way, whether they were implemented in code as expected, you can run into these situations where the data isn't correct. And we had to pause. We had to do deep dives into the [00:31:00] data: is this what we expected? Super hard to recover. The solution, at least in this part, was having a stronger data science team, with some leaders who knew what they were doing, come in and really take hold of that experimentation program and put their stamp of approval on it. In doing so, we had more transparency into how things worked. We moved some things away from that vendor, and that helped build up more trust, because we could then have the auditing you need to say that these things were correct. But it completely shoots your credibility to have that happen. When you come into experimentation, you see a lot of vendors in the space. I think everybody's trying to do a great job, but we need to make sure there is transparency, so people don't end up in these situations. Because without any trust in what [00:32:00] you're doing, you can't prove your impact. And as a PM, proving your impact is one of those key skills to get promoted, move up, grow, and have the career path you want to have.
Jeff: And part of it is making sure you have equal distribution of traffic. Or, and this is something I've always wondered, this might be a little nerdy, in the weeds, but can you have unequal distribution as long as you're measuring conversion rate and looking at statistical significance?

Eric: You can have unequal distribution if your test is set up to measure that. So I could do a test that's 30/70. Ideally you want equal, or you might do 30/30 with 40 held out from your analysis. But you're fine as long as the test that you're using to measure it is saying, okay, I know this is 30/70, and that is fine. The problem usually is that you set it up for 50/50 and it's actually 47/53. And even that, it matters what your sample size [00:33:00] is, what your margin is there, but even that can throw it off. You need to have trust in your systems, that they will tell you that, so that you do not end up in this place of lack of trust.

Jeff: What else do people need to look out for here? Can we quick hit on the things to look out for to make sure you're running a great A/B test? I assume equal traffic, or at least an understanding of the ratios there. Probably an offer that is big enough that it actually has a sizable enough impact to test.

Eric: Yeah, it's having sample size measurement and making sure that you're hitting statistical significance. There's another thing that I never knew here, which is that there are these different types of tests that you can run.
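Eric's point that an unequal split is fine as long as the analysis knows about it shows up directly in the standard two-proportion z-test: each arm's variance term is divided by that arm's own sample size, so a 30/70 allocation is handled correctly by construction. A minimal stdlib-only sketch, with made-up numbers:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided pooled z-test for a difference in conversion rates.

    Works for any split (50/50, 30/70, ...) because each arm's variance
    contribution uses that arm's own sample size. Returns (z, p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value
    return z, p_value

# A 30/70 split: 3,000 control users at 10% vs 7,000 treatment at 11.5%.
z, p = two_proportion_ztest(300, 3_000, 805, 7_000)    # z ≈ 2.2, p < 0.05
```

What breaks things is not the 30/70 ratio itself but analyzing a split that silently differs from what the tooling assumes.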
The kind of standard that we have across a number of the vendors in the space is what's called a sequential method, which allows you to F5, which means to refresh your web browser, and look at those results as they're coming in, in real time. That method allows you to do that, but when you're [00:34:00] doing that, you're trading off some of the time it takes to run the experiment, because you're being allowed to do what's called peeking at the experiment results. And there are actually a couple of different types. There's Bayesian, there's fixed horizon, there are different variants of the sequential method, and they have slightly different trade-offs and different efficiencies. That's something you might not know or understand the implications of. Bayesian is interesting because you give it some past data that you have, you say, I expect it to be this value, and that allows you to actually run experiments faster, with less sample. So there are different things that people don't know about. But really, the key, you asked me, what is running a good experiment? Having a good hypothesis is always the start. A tool can aid you in finding that, but it really comes down to you: I have a belief, based on quant data or qual data or some intersection of the two, that says if I make this [00:35:00] change, these users are going to do this thing. But what that thing is, and this is the last part about healthy experiments that I want to emphasize, it can be a click on your page, but hopefully it's really tied to what your North Star or business metric is, right? Because if I'm at Cameo, what I care about is creating more orders, because that is our North Star metric: more people purchasing these Cameo videos. And yeah, I can get them to click to the next page, but if it's not resulting in more orders and more revenue.
Ultimately, it's not having a material impact on the business. And so that's what we want in a healthy experiment: to be able to bring in those business metrics as well and understand how they're being impacted.

Jeff: Yeah. I remember years and years ago I was at a company where we ran a lot of webinars, and that was something we were really focused on, because we'd done a bunch of data analysis that showed it was one of the best indicators: people who did that were more likely to engage in a sales process with us. So we really [00:36:00] hyper-focused on driving that and had a lot of testing going on around, how do we get more people to sign up? We found a lot of tricks that drove up signups, but in the end, really what we cared about is we wanted people to come to the webinar, see the product, see how we showed the new demo and the use case, and ultimately go to opportunity. And a lot of the things we did to optimize registration didn't do those two follow-on things, because we just made it easier to sign up. Single click, we know who you are because we sent you an email, if you click on it we'll just auto-enroll you. Cool, we got five x the signups, but it did not move the needle on how many people actually showed up, or maybe slightly, because you removed friction. I do think there's this idea that some level of friction in a process is good, because it helps you weed out tire-kickers very early on.

Eric: Absolutely. I think of it as, you're pushing traffic around, right? If you're like a funnel on a website, you can go from, oh, I got them from the homepage to the product page, and that's happening at a higher rate, but then the product page to [00:37:00] checkout has now decreased. And so all you've done is push traffic around, and that's netted out to zero, effectively. And you're right, because what happened was you didn't qualify that traffic, you didn't give them enough understanding.
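The "pushing traffic around" failure mode is easy to demonstrate with arithmetic: a variant can win the first funnel step convincingly while the end-to-end rate, the number actually tied to the business metric, barely moves. The funnel numbers below are invented for illustration.

```python
def funnel_rates(visitors, product_page, checkout):
    """Per-step and end-to-end conversion rates for a 3-step funnel."""
    return {
        "home_to_product": product_page / visitors,
        "product_to_checkout": checkout / product_page,
        "end_to_end": checkout / visitors,
    }

control = funnel_rates(visitors=10_000, product_page=2_000, checkout=200)
variant = funnel_rates(visitors=10_000, product_page=3_000, checkout=201)

# Step one looks like a big win: 20% -> 30% of visitors reach the page.
# But product -> checkout fell from 10% to ~6.7%, so end-to-end
# conversion is flat: 2.00% vs 2.01%. Traffic was pushed, not converted.
```

This is why Eric's advice is to report the step-level metric alongside the end-to-end one rather than celebrating the first step in isolation.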
I was working with a customer that sells a multi-hundred-dollar product, and they were running into this exact situation, where they got more people to their product page. But it's a multi-hundred-dollar product. You need to give people an understanding of what it is, how it benefits them, the social proof. If you move through that too fast, of course they're going to drop out, right? And this also might be a multi-session journey, so are you doing things to hook them and get them to come back, so that they're more qualified? If you speed through all of that too fast, it's not actually going to do its job for the end goal you have. So you do want to measure what you're doing, are you getting more clicks? But you need to connect it to that outcome, or you're just pushing that traffic around. [00:38:00]

Jeff: So: balanced traffic, a strong hypothesis, good analysis, and good tracking. I guess another thing you need is to actually be able to track the things you want to track. If you don't have that understanding of your user journey and why people do things, sure, you can change your button color from red to green, and maybe it makes people slightly more likely to click because of some weird color theory thing, but you're not predictably moving the business forward, because you don't have that foundation of why people are engaging in the first place. And going back to, let's make it real world. There's a great example you have here from Cameo, where you were actually able to move the business forward, because you realized there were certain elements within the process where people had maybe too open of a form, and you made it more broken up and less of just a giant blank screen. Maybe you can walk us briefly through the idea there and how it worked. I think it's a great example.
Eric: So on Cameo, there are celebrities and influencers on the site, [00:39:00] and you can request a video from them. You can give them whatever details you want, they'll respond to it, and each of them will respond in a different way. And basically, once you're ready to make this decision, and some of these are multi-hundred-dollar decisions as well, there's just a box, just a text box for you to fill out: what would you like this person to say? And that's really intimidating, especially because most purchasers of Cameos at the time were first-time purchasers. How do you know what to put in there to get a good video back? And you could see this in both the qualitative and quantitative data. The qualitative: hey, let's walk through this, where do they get stuck as we walk through the user journey together? Or, there's a lot of, "I don't know what I'm going to get." And the quantitative: you just look at the clicks and where people exit the form. They were exiting after that field; they didn't know what to do. So I had the data I needed that said, this is a lot of [00:40:00] cognitive overhead for the user. How do we decrease that cognitive overhead? We had a bunch of ideas, and the one we really went with was, okay, let's template this out. If I want somebody to say happy birthday: okay, who are they saying happy birthday for? How old are they turning? What is the wish that you want this person to share for their birthday? Make it easy to fill out. And when we ran that, we found that it increased purchase conversion by a very material amount. And that was really what we needed. But we actually ran into this thing where it seemed too good to be true. Let's talk about data trust again with experimentation. You run into that. There's actually a law, it's called Twyman's law, where if something's too good to be true, it usually is.
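A Twyman's-law-style sanity check on a surprisingly large win is simply to re-run the test and compare the measured lifts. Below is a crude sketch of that comparison; the conversion counts and the 0.5-point tolerance are invented for illustration, not Cameo's actual numbers or process.

```python
def lift(conv_control, n_control, conv_treat, n_treat):
    """Absolute lift in conversion rate: treatment minus control."""
    return conv_treat / n_treat - conv_control / n_control

def replicates(lift_a, lift_b, tolerance=0.005):
    """Crude replication check: do two independent runs of the same
    test agree within `tolerance` (here, 0.5 percentage points)?"""
    return abs(lift_a - lift_b) <= tolerance

# Hypothetical first run and re-run of a templated-form test.
run1 = lift(1_000, 20_000, 1_300, 20_000)   # +1.50 points
run2 = lift(1_050, 21_000, 1_330, 21_000)   # ≈ +1.33 points
ok = replicates(run1, run2)                  # within tolerance -> trust it
```

A real replication analysis would also compare confidence intervals rather than a fixed tolerance, but the principle is the one Eric describes: a result you can reproduce is a result you can defend in front of leadership.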
And so there was some skepticism. So what we did was run it again, and we replicated the results within a few tenths of a percentage point. And that was it. That was the big win. And [00:41:00] that is, as you said, understanding the problem and the hypothesis. My goal was increasing conversion, and by understanding the problem pretty deeply, we were able to make a big impact. And that was a reason I got promoted and moved up at Cameo: because I could point at the results that I made and the impact that I had.

Jeff: I'll be honest, I looked at Cameo and almost pulled the trigger a few times. My wife and I are big Jeopardy fans, and we found out Matt Amodio, the longtime Jeopardy champion, was on it. I think for her birthday one year I was going to grab that, though I ended up doing something else. But I'll be honest, had I hit just a big open text field, I probably would have paused. That's a lot of friction. I work in marketing, my job is messaging, and I probably still would have been like... So it's a good example of understanding users and motivations there, and solving a problem. Another problem we have here, Eric, is we are quickly running out of time, and I don't want to steal your whole day, but we [00:42:00] have just pages of notes and stories that we never even got to talk about. So maybe we'll have to have you back on again at some point. I'd love to go into more of this stuff and talk more about Eppo and some of the other cool experiences you've had. So maybe we'll just have to leave it at that. But until then, where can people find you? If people want to reach out about Eppo, about you, find out more about some of the behind-the-scenes stuff here, is it LinkedIn? Is there another, better place?

Eric: Yeah, LinkedIn is great. I'm the head of product at Eppo, and we're at geteppo.com. You can also find me on Bluesky. I've had the same handle since I was in the third grade.
It's Eric, E-R-I-C, 3000. So if you want to find me there, you can find me with my third-grade handle.

Jeff: Nice, awesome. It's been awesome having you on. Like I said, we'll have to have you on again, because there's a lot we didn't cover. But thanks for coming on, man. This was a pleasure.

Eric: Yeah, thanks. This was awesome.