The following is a rough transcript which has not been revised by High Signal or the guest. Please check with us before using any quotations from this transcript. Thank you. === paras: [00:00:00] A data team's voice is essentially the customer's voice at scale. So instead of asking 10 customers qualitatively, which is super valuable to be clear, in a tech company where you have millions of customers you're not going to be able to literally talk to a million customers. And at that point, data becomes the voice of the customer at scale. hugo: That was Paras Doshi outlining a powerful mandate for modern data teams: to serve as the voice of the customer at scale. Paras leads data at Opendoor, the real estate technology platform. Previously, he spent five years at Amazon working across Alexa and brand analytics, helping shape data strategy at one of the world's most data-intensive companies. In this episode of High Signal, Duncan Gilchrist and I speak with Paras about his journey from Amazon's decentralized data world to building a centralized function at Opendoor, and the core principles that define impactful data leadership. Today we dive deep into the practical realities of data strategy, discussing how to earn a true seat at the [00:01:00] table, why Paras believes AI is creating the 100x individual contributor, and how it will fundamentally consolidate the future of data roles. This conversation provides a clear playbook for pragmatic leadership, unpacking why a builder mindset and the core traits of agency, autonomy, and adaptability are now critical for anyone looking to build a data function that's not just a service center, but an indispensable driver of business value. High Signal is brought to you by Delphina, the AI agent for data science and analytics. If you enjoy these conversations, please leave us a review, give us five stars, subscribe to the newsletter, and share it with your friends. Links are in the show notes. Let's jump in. Hey there, Paras, and welcome to the show.
paras: Super excited to be here. hugo: It's so great to have you here, and I'm really excited to jump in, particularly with your career path from all the time you spent and everything you did at Amazon [00:02:00] to leading data at Opendoor, and the wins you've had there. You spent five years at Amazon working across Alexa and brand analytics before moving to Opendoor to lead data. I'm just wondering what the biggest differences between those environments were and how that shaped your leadership style. paras: Yeah. First of all, super excited to be on the pod, big fan of the pod, and thanks for having me here. So, Amazon and Opendoor have a lot of similarities. If you go online and search for Opendoor, a lot of folks would say Opendoor is essentially the Amazon of real estate, right? And a lot of that culture actually flows through as well. Opendoor pretty much mirrored a lot of the goodness that was at Amazon: things like ownership, results matter, customer obsession. So when I transitioned from Amazon to Opendoor, it felt like a fairly natural transition, at least for me. I was able to take all the good parts and things that I learned at Amazon and carry them forward to Opendoor. But [00:03:00] one key thing, from a data person's perspective, that was different at Opendoor was that Opendoor has a centralized data org, whereas at Amazon different GMs get to build their own data teams and are very decentralized in their approach. It's a classic case of how Amazon runs: very decentralized, move as fast as possible, two-pizza teams, those kinds of culture pieces. Whereas at Opendoor we had a central data architecture, which was something I was looking to do as I grew in my career. And so that was a key differentiator. hugo: That makes a lot of sense. And I am interested.
In whether and how, at Amazon, a decentralized structure could help or limit your career growth, and how that experience influenced the way you think as a data leader about building career paths for everyone who you support and who works with you at Opendoor. paras: Yeah, it definitely [00:04:00] shaped my career path in a significant way, because at Amazon I had essentially maxed out the career ladder as a person in that job family. What that effectively meant was that I could transition into owning other functions, but me being me, I was having too much fun running data functions, so I was like, oh, actually, I would rather continue and find companies where there's a more central data structure. And luckily, Opendoor was going through a transition, and they hired me to navigate that move from a decentralized to a centralized data org structure. duncan: That definitely resonates, the loving data above all else. Curious, more generally, taking a step back: when you think about your own journey, how do you decide what's next? And especially now, as the leader of data at Opendoor, how do you think about the future of data leadership and where that world is going? paras: I think most leadership, whether that's engineering, product, design, or data, is transitioning [00:05:00] from functional leadership to more business and outcome leadership. What I mean by that is, I, for one, just happen to be a leader who loves data, and I'm accountable for creating results and outcomes that the broader organization I'm part of cares about. The advent of AI has accelerated that push, and more and more functions can do other functions' work. And so that is something that has been my North Star: how do I, my team, and collectively our organization drive outcomes for our customers?
How can we collectively drive success and put wins on the board? That's something that shapes my leadership style, and I work backwards from that to say, okay, what do I need to do to maximize impact for my team and the people I work with? It fundamentally boils down to this: if I maximize career growth for the folks on my team and take care of them, they'll grow, they'll do bigger things, [00:06:00] and I'll do whatever it takes to support them in that journey. And at the individual level, if I'm maximizing the impact of all the individuals on the team, collectively it just maximizes the impact of the broader organization. So that's how I think about leadership in general: it becomes an objective function of impact, and then just maximizing that and working backwards. duncan: I love that. And I think that the lens of not only thinking about what the best thing is for the customer, but also for the people, is really powerful, and maybe too often understated, I think, among executives. Maybe that takes us into talking a little more about fragmented analytics. When you joined Opendoor, there were 80-ish people doing analytics, fragmented across the business, and you saw that, I think, as a problem. Maybe you can talk a little more about that experience and what you saw and felt needed to be fixed there. paras: Yeah. So Opendoor had been a rocket ship and continues to be. [00:07:00] For context, I joined in 2022, and just a couple of years before that, Opendoor had gone public. The name of the game was: move fast, launch things, help the customer move. And as a function of that, all of the different functions were empowered to go hire the people they wanted. They were empowered to build whatever tech stack they needed to get the job done.
And because of that, different teams had essentially spun up their own data tech stacks. They had hired the people they thought would help them make whatever decisions they needed to make. And so the structure, when I joined, was essentially 80-odd people spread across multiple business teams, reporting into business folks, not necessarily tech leaders, and doing things that were specific to the groups they were part of. [00:08:00] And in week one, I found three issues. I asked something as basic as this: I pulled up some of our public quarterly reporting data, asked where it came from, and whether I could get internal sources for it. I asked five different groups and they gave me different numbers. Something as simple as how many homes did we acquire last month? I had five different numbers. Marketing gave me a number, finance gave me a number, product gave me a number. All of them were right for their definitions, but collectively they were useless, because we couldn't really make any decisions with them. And something that was in our public reporting needed that much work and sifting through; imagine how hard it would be for anything internal. It was very hard to make decisions. Second, I found the best folks were not necessarily maximized for the output they could be producing, because depending on the group they were part of, they would be assigned a project specific to their group, not necessarily what was [00:09:00] company-maximizing. Let's say you're in operations: you get to do things specific to operations, not necessarily the biggest decision the company is facing at that point in time. And third, we had six different BI tools, five different data quality monitoring tools, and I could keep going across the data stack; everyone had spun up their own stacks.
So not only did the data not talk across those stacks, but we were essentially overpaying for each of those stacks because of the lack of economies of scale. You can imagine each of those vendors was charging us at the top of their ticket price. And we didn't centralize everything overnight. We started with a couple of functions that were critical for Opendoor and said, okay, we're going to start by centralizing those, and prove out that this is a better model compared to decentralization. And that's what we did. We set out on a multi-year journey: people, getting the right people hired first; platform, putting a platform strategy in place; and process, creating the basic processes that helped us get to this stage. [00:10:00] So essentially across people, platform, and process. Yeah, that was essentially our journey. hugo: Super cool. And I'm wondering, I think we all have a sense, and our listeners will have a sense, of how valuable and actually essential to Opendoor's business data, ML, and what we now call AI are. But I'm wondering if we could step back and just get a sense of what the data function provides to Opendoor and why it's essential. paras: Absolutely. So Opendoor is a real estate business: we buy and sell homes algorithmically. What that fundamentally means is that we need to get the valuations right for the homes we are buying and selling, because if we get that wrong, we lose money. And to value a home, the underlying data associated with homes is massive. A $10 or $20 product on e-commerce has a few features, 10 or 15, maybe a few more. But we are talking about a product that costs hundreds of thousands of dollars.
And so we literally have [00:11:00] 600-plus features for each home that we collect across internal data sets and external data sets, and we're able to value the home as accurately as we are because it's business-critical for us to do that. My team essentially owns the underlying data assets and the analysis that go into the function that is the core of what Opendoor does, which is buying and selling homes algorithmically. So it is a key part of our business model. hugo: Fantastic. I am interested, Duncan, I know you are not the guest here, but how do those types of things relate to a lot of the work you did at Uber? duncan: Yeah, lots of similarities and differences, right? I think these are both types of businesses that are in some ways founded on the opportunity to leverage data to heavily optimize a marketplace, or an exchange of value. And interestingly, I think a lot of the challenges that Paras saw were also challenges we saw [00:12:00] at Uber. A lot of the early evolution of Uber was super decentralized across all functions, including data, which led to the creation of tons of inconsistencies and costly problems that gradually, over time, got rolled up. I was actually fortunate to be there at the time it was getting rolled up, and to be part of pieces of that. Everything Paras is describing is incredibly resonant, and it's interesting to see those parallels. hugo: So, Paras, you've said you want data to have a seat at the table. I'm wondering what that means in practice and how you go about measuring whether you've achieved it. paras: Let's say the CEO wants to know how we'll hit our quarterly revenue goals. Instead of asking for a report, they would first turn to the data team and say, help me create a bridge to hit those revenue goals: what initiatives, what sizing, et [00:13:00] cetera need to happen for us to hit those? The data team at that point becomes an arbiter of resources.
Look, they're unbiased. They don't have skin in the game. They're not part of the product team or operations team or marketing team or what have you. They can essentially be an unbiased, data-driven arbiter. They can tell them: hey, here are all the initiatives, here's what the sizing is, here's how to get there. So when that switch happens, from executives asking for reports to asking how we can hit the things we publicly said last quarter in our earnings that we would, that is essentially what a seat at the table means. I'm highly simplifying, but that's essentially what it means to me. duncan: How do you differentiate, though, between on the one hand being the arbiter, which is obviously tremendously valuable, and simultaneously driving the impact, being the driver of value? Those two things are a little bit at odds. Curious if you can unpack that for us. [00:14:00] paras: Yeah, it's an AND, not an OR. On one hand, the company has certain goals, and there are various functions that get together to achieve those goals. Those usually pull on data teams: someone will come and say, hey, we need to hit these goals, help us create a strategy to hit them. But data teams are also closest to the data, and the way I think about that is this: a data team's voice is essentially the customer's voice at scale. Instead of asking 10 customers qualitatively, which is super valuable to be clear, in a tech company where you have millions of customers you are not going to be able to literally talk to a million customers. At that point, data becomes the voice of the customer at scale, especially when you segment it. And then being able to take those insights and push them out is where the incremental value comes from. So that's a place where data teams push insights to others who are not necessarily asking for them; the insights are proactively [00:15:00] sent: hey, here's the part of the funnel that is broken or leaky, let's go fix that. Or: here's a set of customer segments adjacent to our core customers that we should be able to serve if we do X, Y, and Z, so let's spin up an experiment. And then they're part of the experimentation design, and we can roll it out, see what happens, and go from there. So it's push and pull, where the pull is what the business needs and what its goals are, and the push is where we have seen insights and proactively share them with the people who need them, so that we can land impact. duncan: I love "voice of the customer at scale." That's a powerful phrase, Paras. I'm curious, actually, maybe you can take us a level deeper on an example where your team's work directly shaped a strategic decision at Opendoor: what the work was and how it moved the needle. paras: Yeah, absolutely. One example I can think of: when I joined, there was a general sense that real estate is a high-consideration category, which means [00:16:00] customers take multiple months to decide. That means that on marketing, media, and branding, we need to run multi-month campaigns to be able to convert those customers. Turns out the data suggested that, while that's true, you also have a large population of customers that have already decided. Customers are smart, they're savvy; by the time they come to Opendoor, they're ready to sell or buy. And that meant a lot of our existing priors on how to set up campaigns did not yield the ROI that we thought they would. And so one of the things my team has done over the past year is help optimize millions of dollars of marketing spend.
We helped allocate it across different channels and different markets such that it gives us better ROI. Because fundamentally, if we can take a dollar and stretch it further, then we get more customers that [00:17:00] come through the funnel, which means we can spend even more. It's a virtuous cycle. And paid marketing is just one aspect. As a second example, my team essentially inspects the entire funnel on a daily basis, and we have automated tools that tell us this part of the funnel had a change, we should look at it. A lot of times that helps us detect engineering issues early, before they actually surface, because we can say, hey, let's head those off before they happen; let's do what we need to do to make sure the customer gets a stellar experience. So those are the kinds of examples that are strategic in nature, both on dollars and the allocation of dollars, and on the product experience and where we should be focusing our energy. hugo: Fantastic. Thank you for those examples and that deep dive. I am interested in cross-functional leadership, and how you really manage collaboration between teams as well. So I am [00:18:00] wondering how you navigate ownership of machine learning, more generally, between data science, ML, and engineering or ops teams, and get those teams to work together. paras: Yeah, absolutely. Just for context, the way Opendoor is structured, I run data science, analytics engineering, and data engineering, the data functions, and my peer runs ML engineering. The way I think of the divide and conquer for us is: the data teams own the what and the why, the ML teams own the how and the when, and both of us are accountable for the so-what, the business outcome, what fundamentally matters so that the company and the org succeed.
On the data side, the what and the why basically mean: what are the different models we should be running that would help the business, and why would they help the business? That's mostly the data teams. Once we've established that, we [00:19:00] collaborate with the ML team, and they own how they'll operationalize it and when they'll decide to scale it. Decisions like actual live tooling also sit with the ML team, depending on what the needs are. And so that's how we divide and conquer between the data team and the ML team. 90% of the time it's fairly cleanly mutually exclusive and collectively exhaustive, and there are maybe 10% of cases where it's blurry; those get sorted out by just getting in the room. A simple conversation goes a long way. hugo: Makes a lot of sense. And drilling a bit further down, and maybe getting slightly technical: last time we spoke, and correct me if I'm wrong, you mentioned that at Opendoor you've actually split batch, or offline, machine learning and live, customer-facing ML between these two groups. What was the rationale or motivation behind this, and what have you learned from it? paras: Yeah. A common fallacy is to throw the fanciest tech at a thing and [00:20:00] assume that's the best thing. I see it over and over again with something as simple as this whole real-time data pipeline, right? If a CEO comes to you and says, hey, I want the data more frequently, they might just mean: instead of 11:00 AM, I want the data delivered to my inbox at 8:00 AM. But a lot of engineering teams will interpret that as needing real time. So now let's spin up Kafka and all of those real-time pieces, and now we'll refresh data every 50 minutes. Guess what: the CEO is only looking at it once a day. All we needed was to change the batch from 11:00 AM to 8:00 AM. And so one of the fallacies we try to avoid, not saying we always can, is asking: is this really a tech thing that needs to happen, or are there workarounds where we can keep the tech as simple as possible and still get the job done? So there are batch models that we run that don't need to be real time. They require less tech, they're much easier to maintain, and they still get the job done. An example of that: at Opendoor, we send millions of mailers to prospective homeowners who could sell their home to us, [00:21:00] and we don't send those mailers in real time. That process happens on a schedule, so it does not need a real-time ML model. It's a batch model on a scheduled Databricks notebook, for example, that just kicks off, and when we need to send the file to our mail partner, we query those results. It's a batch system that doesn't need fancy ML, real time, all of that fun stuff. So we keep it as simple, as low-cost, and as easy to maintain as possible, so that we can take the time to do the fancy stuff and solve the hard parts. There are hard parts to solve with ML at Opendoor. An example: when you come to Opendoor and put your address in, you get an instant valuation; that is as real time as it gets. You'd rather take all of your smartest engineers and have them solve that problem, where the latency has to be the lowest, so we can get that information to you as fast as possible. That is not a batch process. That needs ML engineering; that needs folks who obsess over latency, scalability, [00:22:00] and all of those things. That way we divide and conquer in a way that is mapped to the business outcome and what needs to happen, not the fanciest tech we can apply to get as geeky as possible.
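The "just move the batch" fix Paras describes is a scheduling change, not a streaming rewrite. As a minimal sketch (the job name and times here are hypothetical, not Opendoor's actual configuration), delivering the CEO's report at 8:00 AM instead of 11:00 AM is one field in a cron-style schedule:

```python
from datetime import time

class DailyReportJob:
    """A once-a-day batch job; the only knob that changes is run_at.

    Job name and schedule are illustrative, not a real system's config.
    """
    def __init__(self, name: str, run_at: time):
        self.name = name
        self.run_at = run_at

    def cron(self) -> str:
        # cron fields: minute hour day-of-month month day-of-week
        return f"{self.run_at.minute} {self.run_at.hour} * * *"

# Before: the report lands at 11:00 AM.
print(DailyReportJob("exec_daily_report", time(11, 0)).cron())  # 0 11 * * *
# After: same job, same pipeline, earlier schedule. No Kafka required.
print(DailyReportJob("exec_daily_report", time(8, 0)).cron())   # 0 8 * * *
```

The design point is exactly the one in the conversation: if the consumer reads the output once a day, the right fix is the schedule, not the architecture.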
It's again going back to my leadership style, where the objective function is impact and you work backwards from that. If that needs a batch model, great: a batch model gets the job done. duncan: I'd love to talk more, actually, about your CEO example: figuring out what they really need and addressing that problem, versus the tech you want to build. Oftentimes, I think, data leaders struggle to know how to really work with cross-functional peers and the C-suite. So I'm curious if you can talk more about what you've learned about building trust and effective relationships with other executives at Opendoor, and any advice you might have for folks trying to do the same. paras: Yeah, I think [00:23:00] fundamentally, all data work boils down to what decision is being made. As long as you have context about the decision at hand, you can provide a better path to a solution. And to get to that stage, you have to have built trust, where the executive actually wants to share that with you. It won't happen the moment you walk into an org; they're not going to start giving you all the decisions they want to make. You have to earn that seat at the table, and over time that happens. But fundamentally: if you know all the big decisions being made at an organization, stack-rank them. Work on the highest-impact decision you can help with through data. Do that. Do the next one. Do the next one. Do it with enough people. Repeat. Stay consistent and disciplined, which is much easier said than done. Then it'll create a loop where people don't come to you with data requests; they come to you with, hey, here's a decision, and you become a thought partner.
You [00:24:00] elevate to all of those things that we talked about. duncan: Opendoor has had a data and ML leader involved almost from day one, which is pretty unique among companies, where oftentimes there isn't enough data at the start to really take data seriously, and so data maybe comes into play at series B or C, or much later in the company's journey. But it's really cool that Opendoor has had data at the center from the very beginning, and so I'd love to talk more about how that's shaped the company's DNA and what you see as the positive benefits of that. paras: Yeah, it's the classic Spider-Man thing: with more power comes more responsibility. At other companies, we would not be at the heart of the business model; here, we are. Which means if in the middle of the night there's a sev and we can't send any offers, that's all hands on deck. I've gotten enough 2:00 AM pages to know that we are part of the critical [00:25:00] path of the company. That cuts both ways, but fundamentally it allows us to have impact and help more of our customers buy and sell their homes and move. So it's super aligned with the company mission, and that is very energizing. I don't wake up questioning what the mission is, or why I'm doing the job I'm doing. It's super clear; now just go and execute. I've also worked at companies where data is a support function, and it was the classic: okay, where do I get my motivation? I don't know why I'm there; they need a data person, and I'm the designated data person. So yeah, over time I've learned to filter for companies that are not just data-driven but data-led, and to optimize for working at those companies. Opendoor is definitely one of those. duncan: How do you identify that? Could you say more about how you knew that going into Opendoor? paras: Yeah.
You can ask exactly that question. One of the co-founders of Opendoor was essentially an ML and data person. So you [00:26:00] look at the founding of the company and see: was the first data person the hundredth hire, the thousandth hire, the 50th hire, or the second hire? The sooner they hired one, the more data is in the DNA. That's one way to know. The second way is the business model. Different companies' origin stories may differ from what they currently do, but you can inspect the business model and quickly tell whether data is central to that business model or not. A third, more bottoms-up way is to just ask the current data folks at the company. I think most folks would be fairly objective: hey, what kinds of problems are you solving? Tell me about two or three projects you did. Then look at the projects they describe. If they just tell you about the dashboards they built, then you know that's not the company you want to join. But if they tell you about a decision that shaped the company strategy, okay, now that is a culture where data is at the heart of the company. And so those are various ways you can inspect whether it's the right fit [00:27:00] or not. hugo: There's so much insight there, Paras. Our podcast audience has a lot of data leaders who are now really thinking through how and when to incorporate generative AI capabilities. So I'm wondering: do you see generative AI as something that will fundamentally reshape how data teams operate and what they build, or more as an add-on to the fundamentals? paras: Oh, it'll be a hundred percent transformative. And it is very exponential, right?
They say that compounding is the eighth wonder of the world, and AI is going through that exponential curve, which means it is next to impossible for me to sit here and say what will happen a year from now. But look at what has happened over the past year: if that trend continues, it'll be very transformative, not just for the data function but for all functions in general. Something I'm seeing already: a product person is able to pull their own data and do their [00:28:00] own analysis, and they don't necessarily need a data person to help them. The capabilities have evolved to where table-stakes analysis doesn't need a human. So that means the data person is involved in more complex things that require more judgment. And the AI we see right now is the worst version of the AI we'll ever have, which means it'll only get better over time, which means you'd better make sure your skills and what you bring to the table also evolve over time. So in the data landscape, I see a few things happening. One: we have created so many different job titles in data. Last time I checked there were like 15 of them: data analyst, BI engineer, data scientist, data engineer, analytics engineer, and so on and so forth. All of that growth will probably collapse into two buckets. There'll be data builders, who build the foundations for humans and AI, and there'll be the data science persona, which is more strategic, pushing proactive insights for both humans and AI. [00:29:00] So we'll see a massive consolidation of job families. And more broadly, I think all of the different engineering and tech functions are evolving from "you are strong at the function you're in" to "you're fundamentally a builder," and there'll be a very clear delineation between builder orgs and business-accountability orgs.
And so I have fundamentally started to think of myself as a builder who happens to know data, a leader who just happens to know about data. One very tangible thing I've started to do with my teams is look at functions that could be transformed with AI. As an example, a couple of weeks ago we launched our AI voice bot that can do sales for Opendoor. We can't keep up with the lead volume that we have. Our human sales team is amazing, but they're human, and so they have a capacity, which means we deployed AI voice [00:30:00] agents to tackle the rest of the leads and make sure those customers knew about Opendoor and had someone to talk to. This is not something someone asked us to do, to be clear. We looked at the data and said, hey, half of our leads are not being proactively contacted, human capacity is maxed out; okay, let's spin up an AI voice agent and do that. That's the builder's mindset: we didn't wait for some other team to do it; we spun it up. So fundamentally, I think it boils down to this: teams that have agency, autonomy, and accountability will probably be better off. Take this example of the AI voice agent. We showed agency: no one asked us to do it; we said, hey, that's a problem, let's go fix it, let's build it. Accountability means that if it went wrong, my team is accountable and we will fix it. We did call a couple of folks that we shouldn't have, and we fixed the logic on [00:31:00] who we call and adapted the scripts, for example. And then, broadly, just stay adaptable to what's going on.
But yeah, to answer your question, it'll be a hundred percent super transformative, and it'll be very hard to know exactly what will happen a year from now. But as long as you stay adaptive and flexible to what's going on, show agency, and show autonomy, those things will go a long way. hugo: I think that combination of agency and autonomy and being adaptive is so important, particularly with 4.5 coming out recently and how we're all using that now. Duncan and I were just discussing all the exciting things they're doing. Delphina, since last week, is continuing to do exciting things too. I also love that you mentioned compounding as the eighth wonder of the world. I wonder if we can be cheeky, because we work in Python, and actually make it the zeroth wonder, as we tend to zero-index over here. Also, with respect to AI adoption and adaptability, I'm wondering how you think about creating a culture of experimentation. We all know this, but for a bit more context for listeners: [00:32:00] AI doesn't work all the time. The tools don't necessarily work all the time. Now, that doesn't mean pre-AI tools worked all the time, but due to non-determinism and stochasticity and these kinds of things, they are less reliable on average. So they're inherently more risky. So I'm wondering how you create a safe culture where people feel safe to experiment, are able to experiment and make mistakes, while delivering value. paras: Yeah, there are a couple of things. I think, broadly, the leadership at the company needs to be willing to take risks with AI. So one of the things that our CEO recently did is publicly claim that we default to AI, which means we have permission to use AI to try different things. Responsibly, of course, and legally, with all of those baseline things in place. But you fundamentally need a culture where the leadership is behind this.
I see that with Shopify and a bunch of other companies that have also publicly claimed that they are AI-first. Once that is taken care of and that's the default, if you can [00:33:00] work in an organization where that has happened, then it just boils down to agency and adaptability. Go try a bunch of things and be accountable for them. So we can do basic or fancy evaluations and simulations, depending on what the problem is, to see, okay, what happens if we do X. And the classic best practices from software don't go away with AI, right? So when you launch software, you do a soft launch: launch it to, say, 5% of your customers, see what happens, evaluate, and if nothing breaks, then you turn on the feature flag and it goes to all of the customers. It's something similar with AI. Those best practices from software carry over as is. We will still launch to a subset of users, see how it performs, be accountable for the outcomes, and then, when it works and passes all the basic tests we have put in place, we can scale it and go from there. So the basic best practices from software carry over into AI. The difference is that the velocity is now like 10x, so you can do more things [00:34:00] faster. So just keep shipping, and you learn. duncan: The change in velocity, I think, is really palpable when folks truly embrace AI and figure out how to use it effectively. You're a really prolific writer, with very frequent and deep blog posts that I've really enjoyed. And one of the things that piqued my interest was this notion of a super IC, and how AI has created an opportunity for certain folks to be 100x more productive than their peers. I would love to unpack that with you. Maybe you can share an example of a super IC on your team who exemplifies it, and how that works in practice at scale in a large organization like Opendoor. paras: Yeah, absolutely.
This notion of the 10x engineer has always existed, along with the 80/20 notion where 20% of the team drives 80% of the impact. So those things already existed. What AI did is 10x that, and 10 times 10 gives you a hundred. That's essentially where the 100x comes from. So [00:35:00] you take a 10x engineer, give them tools, and they now become a 100x engineer. And so, as a leader, your role of going and finding the 10x engineers is still true, but the payoff you get from finding a 10x engineer and pairing them with AI is now 100x. So you are now incentivized as a leader to go find those people, build your team around those people, and give them the agency, the autonomy, and the adaptability they need to go do things. Take this AI voice agent example that I mentioned. I just hacked it up one night and showed it to a couple of people, who said, hey, this is great. Then I pulled in one of the ICs on my team who is really good and said, hey, let's partner together. I'm going to tackle the next thing, but you now own it. And off he went. Another person on my team is really good with funnel analysis and able to take context from multiple product managers. So she is able to single-handedly work with 10 different product managers on 15 different tests that are running. And she's able to dedupe them and say, hey, this test conflicts with that [00:36:00] test, and we should be doing it this way, this is how it's sequenced. Someone who is able to look at multiple tests cross-functionally, guide the sequencing, call the winners in the right way, essentially guide the PMs, look at the funnel, and proactively push insights. And so I've taken those folks and protected their time, and I've said, hey, these are the top three things you'll do. Don't worry about the long tail of questions that you get. paras: They'll either get answered or not answered. It's completely fine.
Someone else will take it. But you focus on this priority. And because I know that they are really good at what they do, as long as I protect them and give them that kind of focus and an objective function, off they go. So the objective function for the person doing funnel analytics was: reduce abandonment in the funnel. Now look at the entire funnel and give insights to as many people as you possibly can. The objective function for the AI voice agent work was: go scale the voice agent so that every single customer gets a call, whether that's from a human or AI, it doesn't matter. [00:37:00] Run with that until it gets to a hundred percent coverage. So you give them an objective function, give them focus, and clear everything else from their plate. They're 10x already; with AI, they become 100x. So those are a couple of things I do as a leader. And then, you know, essentially be a shield for them so they can do their best work. duncan: And so it's almost like, if the old Pareto rule was that 20% of the effort creates 80% of the output, now it's 1% of the effort creating 99% of the output, or something like that. And creating that environment where they have, as you said, the air cover, where they're shielded from other stuff so they can actually deliver the thing that is so impactful, yields the greatest gain. paras: Yep, exactly. hugo: So I'm wondering, looking five years out, what advice you'd give to data leaders who want the data function to be indispensable to the business. And I'm wondering whether it would help to decouple this into thinking through organizations like the ones you've worked for and in, where data is actually part of the DNA, [00:38:00] and then maybe incumbent organizations, or other organizations, that are trying to bring in data functions as well. paras: Yeah, I don't know about five years, but I'll at least tell you what I'm doing right now.
Fundamentally, I would advise anyone to go work for companies that are data-driven. First and foremost, you can't really change the culture of a company if they're not data-driven. paras: It's essentially swimming against the current. If you decide you can go change that, it's like going back to your ex and thinking you can fix that. There's a reason why it didn't work out. It probably won't work out again. In rare cases it does work out, but for the most part it doesn't. So just stop trying to fight against the current. hugo: I feel the same, man. paras: So you don't fight against the current; you swim with it. Especially in the AI world, the data space is even more valuable than before. So there's a really good tailwind that we have, so make the best of it. That's first and foremost. And once you've found that, [00:39:00] if you have it, that's great, that's awesome. Then you can do three things that differentiate you. One, don't wait for people to tell you things. You should already have a bunch of insights that you go share with people. That's showing agency. Second, be autonomous. What that means is: go be able to code, do things so that you can independently deliver value for the business, whether you're an IC or a manager or a leader. It doesn't matter. No one really told me, when I centralized some functions at Opendoor, hey, would you be the person to centralize this? I just felt like, yeah, I'm just going to go do it. So that was the autonomy piece of it. And last but not least, be very adaptable. There are so many changes happening. Data tools are changing all the time. Even before AI, a bunch of those things were highly fragmented anyway. If you have ever looked at a data vendor map, there are hundreds of vendors on that map. And as hard as that is, AI makes it even harder. It's of course a helpful tool as well.
It helps you accelerate your [00:40:00] productivity. But be adaptable: try different things, get new signals, and act on them as you get them. The way I like to think about it is that you are a chess player who essentially maximizes your options. Take the moves that allow you to make your next moves in the best possible way, and avoid decisions that will pigeonhole you. At Amazon, we used to talk about one-way and two-way doors. One-way door decisions you take and can't come back from; two-way door decisions you take and can come back from. So buy yourself optionality and stay adaptable. With AI, lots of things will change. I don't know about five years, but for the immediate few months, this is what I'm maximizing for: agency, autonomy, and adaptability. hugo: Beautiful. It's time to wrap up. I just wanted to say, Duncan mentioned that you write a lot of fantastic blog posts. They're on insightextractor.com, and you've got over 600 blog posts and over [00:41:00] 2 million visitors. So congrats on all the work you've done there, and thank you. I'm encouraging everyone to check it out as well. paras: Yeah, awesome. I'm very easy to find online, so feel free to connect with me, and yeah, thanks for having me on the pod. hugo: A real pleasure. Thank you, Paras. Thanks so much for listening to High Signal, brought to you by Delphina. If you enjoyed this episode, don't forget to sign up for our newsletter, follow us on YouTube, and share the podcast with your friends and colleagues. Like and subscribe on YouTube, and give us five stars and a review on iTunes and Spotify. This will help us bring you more of the conversations you love. All the links are in the show notes. We'll catch you next time.
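Editor's note: the soft-launch practice Paras describes around [00:33:00] (ship to, say, 5% of customers behind a feature flag, evaluate, and only then widen to everyone) can be sketched in a few lines. This is a minimal illustrative sketch, not Opendoor's actual system; the function names (`bucket`, `is_enabled`) and the bucket-by-hash scheme are assumptions chosen for the example.

```python
import hashlib

def bucket(user_id: str, buckets: int = 100) -> int:
    """Deterministically map a user ID to a bucket in [0, buckets)."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % buckets

def is_enabled(user_id: str, rollout_pct: int) -> bool:
    """Gate a feature to roughly rollout_pct percent of users.

    The same user always lands in the same bucket, so widening the
    rollout (5 -> 25 -> 100) only ever adds users, never flips anyone off.
    """
    return bucket(user_id) < rollout_pct

# Soft launch at 5%: only users whose hash lands in buckets 0-4 see the feature.
users = [f"user-{i}" for i in range(1000)]
cohort = [u for u in users if is_enabled(u, 5)]
# Evaluate outcomes for `cohort`; if nothing breaks, raise rollout_pct toward 100.
```

The design choice worth noting: hashing the user ID, rather than sampling randomly per request, keeps each user's experience stable across sessions while the flag widens, which is what makes the "evaluate, then turn it up" loop clean.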