Kevin: Excel was probably the tool of choice for 30 years for citizen developers. And, uh, the problem with that is it lives in silos. We wanted to create a system that still encouraged the citizen developer to be able to capture the data that they felt was most important, along with the balance of capturing the information that the company feels is important. Announcer: You're listening to Augmented Ops, where manufacturing meets innovation. We highlight the transformative ideas and technologies shaping the front lines of operations, helping you stay ahead of the curve in the rapidly evolving world of industrial tech. Your host for this episode is Eric Mirandette, Chief Business Officer of Tulip, the frontline operations platform. Erik: Welcome back to Augmented Ops. Today, we are joined by Kevin Kidd, Director of Software and Analytics for RFK Racing. Welcome to the show, Kevin. Kevin: Thanks for having me on the show. It's going to be a highlight of certainly my week, maybe even one of the highlights of the year. So thanks for having me. But, uh, Kevin Kidd, as mentioned, Director of Software and Analytics here at RFK Racing. I have been in professional motorsports my entire professional career. Graduated college from Virginia Tech with a bachelor's degree in mechanical engineering. I like to tell people I graduated on a Saturday afternoon and the very next day, Sunday, I was at Charlotte Motor Speedway racing in the Coca-Cola 600. Jumped straight in and haven't looked back since. In my time in professional motorsports, I've done a whole bunch of different jobs, went from the ground floor up, so to speak, and I've been fortunate to, uh, do mechanic jobs. That was kind of my first job getting into the sport straight out of college: mechanic slash engineering. That transitioned into full engineering after a couple of years. Spent a good chunk of time doing race engineering on Cup Series vehicles.
That transitioned into crew chiefing at the Xfinity Series level. And then from there I rolled back into the Cup Series as a team manager, and then ultimately competition director, and served a little time as technical director. So those roles were more about overseeing the entire operation, sort of soup to nuts, on the, uh, on the competition side. Today, my role is software and analytics, and largely my role is about driving technology change within the organization on both the competition and the operational side. It's been a lot of fun. I've been doing this now for a little over a year and a half, and I think we've made a lot of progress, and I'm looking forward to continuing to make some more progress on that. Erik: Yeah, you've got one of these dream jobs, frankly. We're technology nerds and we're manufacturing nerds, right? And you live at the intersection of both of these worlds at the extreme, where you get to go out and you compete. And I think what's so interesting about what you do, and I've had the opportunity to be on site with you, and I've always kind of enjoyed racing, but what I didn't appreciate is how much not just the engineering, but the actual manufacturing process of that car influences the outcomes of the race. All of the components that go into each of these cars are pretty much prescribed. You have very tight tolerances around all of these components, but every week that car comes off the truck, you take it all the way down, every part goes back on the shelf, and you rebuild a new car every week. Multiple new cars every single week you guys race. Can you just explain to me a little bit about what that process is like and how the manufacturing process influences the car that gets out there on race day? Kevin: Yeah, absolutely. You know, there's a stack-up of things that ultimately lead to performance on Sunday.
There's certainly the driver, who is instrumental in everything that we do, not just in his ability to turn the steering wheel and step on the gas pedal and the brake pedal, but well beyond that, just the communication and how he articulates the challenges that he's dealing with on the racetrack. Beyond that, you know, we have a race car. And so we're different than a lot of traditional sports. We'll take basketball as a good example. You take a basketball, you throw it on the court, the hoops are all ten feet off the ground, and the ball is the same, and you know you can count on that week in and week out, game after game. So those teams are really largely focused on player performance and strategy as a team. In racing, we have all those same player-performance and team-strategy things that we have to focus on. We also have to focus on the basketball, and the hoops, and all the equipment that goes into our sport, namely the car. The car is a huge part of performance, and when you look at how a car comes together in modern NASCAR, I would say about 90 percent of the car for this next-gen car that we're racing today is a purchased part. You know, it's a big transition from where we used to be. Prior to this generation of car, I would say 90 percent of the car we actually manufactured in house. But through rules and strategy changes by NASCAR, we are now in this new realm where we're purchasing a large portion of the car, and the chassis that we race is going to be the same chassis that every single team in the field races. So when you look at how to construct a car that performs better than the folks that you're racing against, it comes down to the details: details of construction, of maximizing tolerances, within the rules, of course. But within those rules, we have windows that we can work within, and you're trying to maximize every bit of that so that you can get the most downforce.
The most amount of mechanical grip, you get the balance right on the vehicle, all these parameters that really drive performance on Sunday. Erik: Well, and this is one of the things that I had no appreciation for prior to, you know, spending time on site with you guys. You said 90 percent of the car is purchased parts. What that means is that you're not manufacturing these components. You're not saying, how can we engineer this specific component to give us a competitive advantage? Everybody gets the same part from the same set of suppliers. And the engineering challenge for you is all in the manufacturing process, for the most part. And the rules, you know, one of the things that shocked me was how tight these rules are. You know, this panel can have a tolerance up to, you know, one and a half millimeters. This panel that sits on top of it can have a tolerance of up to two millimeters. And you guys have to control all of these tolerances through the assembly process, mostly, to stack these tolerances in a way that gives you a competitive advantage for the track that you're going to be racing that day. And Talladega is a little bit different than Daytona, and, you know, if you're expecting rain, or the heat profile, you know, you're capturing such granular data on race day that you're saying, okay, what are the subtle configuration changes? Which means you need to have complete traceability. Well, one, you need to have unbelievable tolerance specification for each of these subcomponents, for each of these components that are going together. Two, you need to have the ability to stack these with such precision. And for you guys, you know, I spent a lot of time on production shop floors with manufacturers, and getting it right means winning or losing. But winning or losing means: is the part defective or not? Are we going to have to do rework? Did we meet our target for the day?
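The tolerance stacking Erik describes can be illustrated with the two standard stack-up calculations used in mechanical assembly: the worst-case stack (every part at its limit at once) and the statistical root-sum-square stack. This is a minimal sketch, not RFK's actual method; the 1.5 mm and 2 mm panel tolerances are just the illustrative numbers from the conversation.

```python
import math

def worst_case_stack(tolerances):
    """Worst-case tolerance stack: assumes every part sits at its limit
    simultaneously, so the contributions simply add up."""
    return sum(tolerances)

def rss_stack(tolerances):
    """Statistical (root-sum-square) stack: assumes each part varies
    independently, giving a tighter, more realistic estimate."""
    return math.sqrt(sum(t ** 2 for t in tolerances))

# Hypothetical panel tolerances in millimeters (illustrative only).
panels = [1.5, 2.0]
print(worst_case_stack(panels))          # 3.5
print(round(rss_stack(panels), 2))       # 2.5
```

The gap between the two numbers is exactly the "window" Kevin mentions: a team that controls where each part actually lands inside its tolerance band can bias the final stack toward whichever end helps at a given track.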
For you guys, winning or losing is, quite literally, you're either first, you know, I'm not going to say you're either first or you're last. I won't say that. I don't know how that's received in the NASCAR world. But my point is, you're either going to win or you're going to lose. And, you know, I'm not taking away from Brad or the driver or anything like that, but a big part of that is you've got to put a car out there that's ready to win. And that's 100 percent the assembly process and the process controls you have in the operation. Kevin: It really is. You know, we don't have the luxury any longer in our sport of practicing. What I mean by that is, when we get to the racetrack, we come off the truck, we put a car on the racetrack, we might get like a 15-minute practice session, and then we jump straight into qualifying and go race. Years ago, we would get hours of practice leading into an event. So you had a little more room for error. You could come off the truck, and if you didn't quite hit it right, you knew you had enough time to get yourself sorted out and ultimately be okay for Sunday. Now there's no room for error. You have to be right, right off the truck. And so what that means, how that manifests itself all the way back through our assembly process, it really starts with this concept of having a digital twin or a simulation model that is really, really accurate and good. And so if I drop somebody off in the shop and just tell them, hey, you've got a week to figure things out, it's not going to take them too long to understand that everything we do is largely feeding into building the best vehicle model that we possibly can. Backing out of that, what does that imply for our assembly shop? We have all these parts, quite literally hundreds of parts, that come together to build this race car. We have to use a tremendous amount of metrology all the way through the build process. In certain cases, we're measuring things two, three, four times, depending on the process.
We're taking this data and it's informing our multibody vehicle simulation model. As we're building this car and building this model, they're kind of in tandem; they're getting built at the same time. The other thing that's driving a lot of our assembly processes is just known performance parameters. So what do I mean by that? We'll take aerodynamics, for example. It's probably the best example of this. We go to the wind tunnel, we are constantly iterating on things, trying to find more downforce or less drag, maybe a little bit better balance with the aero, all these things that ultimately make a faster race car. And when you learn something out of that process, inherently you want to come back and put it into your cars. To do that, sometimes it quite literally means, hey, this panel, instead of mounting it in position A, we're going to mount it in position B, which happens to be only a ten-thousandths-of-an-inch translation. Those are the kinds of things that we chase, all in the name of speed. So, we chase those known performance parameters, we're building this vehicle model, and in the end, when a car finally comes together, you do this correlation exercise to make sure that the fully assembled, measured car correlates back to your vehicle simulation model. And if you have that kind of correlation at the shop, then you feel pretty good that you've at least captured the car, and you're ready to go do your simulation efforts. Erik: So we're talking about stacking tolerances, we're talking about rigorous data capture and analytics, both on the track, in the wind tunnel, and through the production operation. Let's talk a little bit about how you do this. You know, like we discussed previously, every week, when I was there, the truck pulled up, the cars got rolled off, and, you know, I saw the car that raced the day before. And your team's out there and they start ripping this thing apart.
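The correlation exercise Kevin describes, checking that the measured, fully assembled car agrees with the vehicle model, boils down conceptually to comparing each measured parameter against the model's prediction within some tolerance. This is a hypothetical sketch; the parameter names, values, and the 1 percent threshold are all invented for illustration.

```python
def correlates(measured, model, tol_pct=1.0):
    """Return the parameters where the measured car and the simulation
    model disagree by more than tol_pct percent; an empty list means
    the car correlates and is ready for simulation work."""
    mismatches = []
    for name, value in measured.items():
        predicted = model[name]
        err_pct = abs(value - predicted) / abs(predicted) * 100
        if err_pct > tol_pct:
            mismatches.append((name, round(err_pct, 2)))
    return mismatches

# Hypothetical parameters: downforce (lbf), front weight bias (%), ride height (in).
measured = {"downforce": 2710.0, "front_bias": 51.2, "ride_height": 2.01}
model = {"downforce": 2700.0, "front_bias": 51.0, "ride_height": 2.00}
print(correlates(measured, model))  # [] -> within 1%, good correlation
```

If the list comes back non-empty, the build (or the model) gets revisited before the simulation results are trusted, which matches the "captured the car" gate Kevin describes.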
Later in the day, I look out and there's just a chassis sitting down there. Everybody went home. There's like no car left anymore. And all the components get stripped down and inspected. You said, you know, you're taking multiple inspections. They go back on the shelf. And it's not like you have one part there; you have many different configuration options here that all get indexed. Let's talk a little bit about, practically speaking, how do you do this? And I think this sort of naturally leads into the role of technology, right? On the measurement side, on the sensor side, on the tracking side. Can you talk a little bit about how technology plays a role in your ability to field the winning car? Kevin: Yeah, absolutely. Just maybe to give the audience a little bit of perspective of what that process looks like. You kind of touched on the end of the line, the disassembly part. When those parts come off the car, they're immediately deactivated from service. You know, this is kind of like the first step of our technology system. We have systems in place such that as soon as the checkered flag drops, we know every single part that's on both cars that raced. We kick off some Lambda functions and get some processes running, and in the end, all those parts get deactivated from our system. What does that mean? That means that any of those parts on those cars would not be able to be physically put back on a future car without going through some sort of inspection or verification process. That goes for everything, soup to nuts, the whole car. So that level of rigor forces our technicians to go through these processes of verification and quality control. In some cases, that is a very simple visual inspection: hey, here's the part. Yep. Looks okay. Off it goes. In other cases, it's much more rigorous, involving 3D scanning. We do CT scanning inspections now.
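The deactivate-then-reverify rule Kevin outlines is essentially a small state machine: a raced part drops out of service automatically and can only return after an inspection step. Here is a minimal sketch of that guardrail; the class and function names are hypothetical, not RFK's actual system.

```python
from enum import Enum

class Status(Enum):
    ACTIVE = "active"
    DEACTIVATED = "deactivated"  # came off a raced car; needs inspection

class Part:
    def __init__(self, part_id):
        self.part_id = part_id
        self.status = Status.ACTIVE

def deactivate_raced_parts(parts):
    """Checkered flag drops: every part on the raced cars is pulled from service."""
    for part in parts:
        part.status = Status.DEACTIVATED

def install_on_car(part):
    """Guardrail: a deactivated part physically cannot be issued to a future build."""
    if part.status is not Status.ACTIVE:
        raise ValueError(f"{part.part_id} requires inspection before reuse")

def pass_inspection(part):
    """Verification (visual check, 3D scan, CT scan...) returns the part to service."""
    part.status = Status.ACTIVE
```

The point of encoding it this way is that the rigor is enforced by the system rather than by memory: the install step simply refuses a part that skipped verification.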
You name it, we've got all kinds of different processes for all these parts. Once those parts are verified that, hey, they're good to go, they're ready to race, the work orders get checked off. So another piece of the technology: we've got a work order system, kind of an in-house MES, for lack of a better way of saying it, that my team has developed, and we're currently developing it. It's sort of ongoing. This is an area where Tulip has really kind of jumped in and helped us. We've been able to create this system that allows a very efficient traceability program with all these parts. Why is traceability important? Well, first of all, you don't want to have bad parts that end up on race cars. Bad could be defined in a whole host of different ways. But second of all, it's super critical to understand the full life cycle of a part. In the event that you go all the way through the loop and you're back at the end, but this time instead of the car coming to our shop, it actually goes to NASCAR for inspection. If you're sending a car to NASCAR for inspection and for whatever reason they're questioning a part or they don't like something, you need to have this traceability. Hey, here's the condition of the part when it came in the door. Here are all the things that have occurred with this part since we've had it. Here are the races it raced at, here are the mileages it's accumulated through that time, here are the rebuilds, the processes. You need to have this not only in just textual information, but also a lot of photography, a lot of scan data that comes out of this. This is where our technology stack really goes to work. It's collecting all this information, making it easy for the operators and the technicians to generate this information and to put it into a system that can house it all.
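The life-cycle record Kevin is describing, condition at intake, every race, accumulated mileage, rebuilds, plus links to photos and scan data, maps naturally onto an append-only event log per part. This is a rough sketch under that assumption; the field names and queries are illustrative, not Relay's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class PartHistory:
    """Full life-cycle record for one part: every race, mileage, rebuild,
    and references to photos/scan data, appended in order as events."""
    part_id: str
    events: list = field(default_factory=list)

    def log(self, kind, detail):
        """Append one event, e.g. log('race', {'track': 'Daytona', 'miles': 500})."""
        self.events.append({"kind": kind, "detail": detail})

    def total_miles(self):
        """Accumulated race mileage across the part's whole life."""
        return sum(e["detail"].get("miles", 0)
                   for e in self.events if e["kind"] == "race")

    def search(self, kind):
        """Quick lookup -- e.g. pull every rebuild record minutes before qualifying."""
        return [e for e in self.events if e["kind"] == kind]
```

Because the log is append-only, the "here's everything that has occurred with this part" question NASCAR might ask is just a read of the event list, in order, with no reconstruction needed.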
And then on the back side of that, having some sort of a system that allows us to quickly search, quickly find the information we need in the moment we need it. Because you quite literally, and this is no exaggeration, you could be arguing about something, some feature or some part on the car, five minutes before qualifying, and having that information handy and being able to kind of prove your point is super critical to making sure that you can go out on the racetrack and actually qualify. That doesn't happen very often. I mean, I've seen that happen two or three times in my career, but those are the edge cases that you remember as you're building out a system. And we want to make sure that this system is robust enough to get ourselves in a good spot when situations like that occur. Erik: It's kind of analogous. I mean, if I take this into the med device or life sciences space, it's the equivalent of, hey, the FDA just knocked on your door and you need to prove that you've produced these parts according to spec. In your case, this is NASCAR saying, hey, let me take a look at that car that you just raced and make sure that it conforms to all of the rules that we put in place. But it's the same burden of proof. Well, analogous anyway; I don't know if I want to say it's quite the same, it's a little different. And if I think about this in a traditional sort of non-regulated environment, uh, this is the equivalent of, hey, look, I'm producing this big thing and it involves a bunch of subcomponents, and you know what, I've got an issue, a recall, you know. We know that one part is faulty and it's dangerous or something like that. I want to know: where does that recall start and where does that recall stop? And if I don't know precisely where it starts and stops, it's going to get a whole lot more expensive a whole lot more quickly. Now, ideally, you're avoiding this altogether because you have the appropriate procedures and testing in place.
And by the way, we can help with that sort of thing. But let's talk a little bit about you. So you said you have a homegrown MES, and this is an area where Tulip is helping out quite a bit. And when I was down with you guys, you were just getting started. You know, I know you guys had some dashboards up. I know you had quite a bit of traceability components built out, but to be honest, I haven't even seen, you know, what you guys are doing today. And I don't want to turn this into a Tulip commercial, to be clear, but like, what are the different types of technologies? And then I think another thing that you and I have talked a lot about is not just the individual technologies that solve individual problems, but rather this ecosystem of technologies that are all working and sharing information together in order to solve the problems, because none of these things can really be addressed in isolation, or very few of them can be. So maybe you could walk us through the different technologies that come into play at the different points and how they all work together. Kevin: Yeah, so this now really gets into my day to day and what we do with our team. The technology stack is a huge part of it. I'm going to beat up on our company a little bit, just so everybody can understand the place we're coming from. If you go back not long ago, 18 months, 24 months ago, and I hate to say it, even in some cases still to this day, because we haven't fully eradicated it: this company was largely driven by Microsoft Excel, and I don't think that's probably very different from how a lot of companies are who are making their transition technologically. Erik: Kevin, I think there's a lot of people listening right now that are saying, yep. And even if you don't want to, even if you have the shiny MES and all this stuff, you've still got Excel on your shop floor. Why?
Because the people who are running the operation day to day, who have problems, guess what, they can open it up, they can do whatever the heck they want, they can solve the problem, move on with their lives. They don't need to call a system integrator, they don't need to call IT, right? I don't know, I've been in a lot of shop floors, and I haven't seen one yet where I don't see some version of this, at least initially. Kevin: Yeah, and this comes back to the thing that I hear on the show all the time of the citizen developer, right? Excel was probably the tool of choice for 30 years for citizen developers. And, uh, the problem with that is it lives in silos. You know, Excel exists on one spreadsheet over here and another spreadsheet over there. And getting all that data to come together to be able to actually drive meaningful change within the organization is tough when you're in that environment. So, I kind of made it a mission to eradicate Excel, so to speak. We wanted to create a system that still encouraged the citizen developer, that allowed our technicians to be able to capture the data that they felt was most important, along with the balance of capturing the information that the company feels is important, but then to be able to drive all that into this connected environment where systems are suddenly talking to each other and, um, the process is much more integrated. And so, you know, specifically, what am I talking about? If you look through our build process, we have a lot of metrology from one of our other tech partners, Barrow Technologies. Those guys have outfitted us with CMM arms and scan arms, and we have 3D surface scanning equipment generating all this metrology about our parts. And this information drives the simulations. But it also drives a lot of our quality control process.
The simulation process, by the nature of the way that works, was fairly integrated already, going from metrology to simulation. But on the quality control side, it was not integrated at all. Things were living on spreadsheets, on servers, or on hard drives spread around the whole campus, and if you, uh, had one of your quality control guys leave and somebody replaced him, you know, there was a significant ramp-up to get people up to speed. Erik: Well, never mind the fact that you're five minutes out from qualifying, you said, and you say, hey, did we check that part? And somebody says, yeah, yeah, it's on so-and-so's Excel spreadsheet on his hard drive. And you call him up, and I imagine that's probably not the best feeling. Kevin: No, it's not. So what we've driven towards is just this, you know, connected ecosystem, going all the way back to the purchasing side of things at the ERP level. You know, we are in the middle of transitioning ERPs, heading to another tech partner in CISPRO, and again, not trying to make this a commercial for anybody specifically, but I think it's important to talk about the technology partners who are involved in all this. At the purchasing part of this, we want to capture all the relevant information about that purchase, about the information that was exchanged to us from the vendor. A lot of times technical information is exchanged at that point. Again, that was just living kind of on a hard drive in a vacuum, but we wanted to grab all that. Then as it kind of makes its way through the system, it goes to quality control next for its initial inspection and sort of check-in, where we put it through our own rigor and testing, generating more data. Again, we want to get all that data out of Excel, into databases, and kind of push it down the line. And that's really where our homegrown MES starts. We call it Relay. Relay starts at that step.
It kind of pushes the part on down the line at the point it's passed quality control. It goes on to the sub-assembly groups. The sub-assembly groups then have the chore of taking build sheets from the engineers and determining, okay, what parts do I want to put on a car to meet this build sheet? This is where they are referring to all this information that's been collected, both on incoming new parts as well as all the recycled parts that have come back from a racetrack. They're going through the process of determining, okay, what parts are mileaged out? What parts need to go off for service? What parts are still active? You know, they have to juggle all this information to be able to come up with parts to put a build together. We're a lot different in that regard than maybe traditional manufacturing. You know, we don't take raw material in, shoot a part out, and never see it again. We have parts that come in the door, and they'll recycle through the system many, many times before they ever get kicked out. And when they're kicked out, we never see them again. So it's that managing of recycling, uh, that, quite frankly, we couldn't find a commercial product that really handled very well. It was something that we needed to build ourselves. It's a very niche thing for motorsports. And so we've built the system to handle that. When our guys put a build sheet together and ultimately put a car together, that information then tracks to the racetrack, where we're collecting more information. Literally, every time the car crosses the start-finish line, we're logging another lap in the database, which in turn yields another mile and a half, or another two miles, or another half mile, depending on the track size. And you're constantly creating these data points on all those parts, so that when the checkered flag drops, we go all the way back to the start of this conversation where everything's deactivated and, uh, we come back through.
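The per-lap mileage accumulation Kevin describes, one database entry per start-finish crossing, adding the track's length to every part on the car, can be sketched like this. The track lengths and the mileage-limit idea are illustrative assumptions, not RFK's real figures or schema.

```python
# Hypothetical track lengths in miles, keyed by track name.
TRACK_MILES = {"Talladega": 2.66, "Daytona": 2.5, "Bristol": 0.533}

def log_lap(part_mileage, parts_on_car, track):
    """Each start-finish crossing adds one lap's distance to every part on the car."""
    lap_miles = TRACK_MILES[track]
    for part_id in parts_on_car:
        part_mileage[part_id] = part_mileage.get(part_id, 0.0) + lap_miles

def mileaged_out(part_mileage, part_id, limit_miles):
    """A part past its service limit must come out of the active pool
    for rebuild or retirement before the next build sheet is filled."""
    return part_mileage.get(part_id, 0.0) >= limit_miles
```

With mileage accumulated this way per part rather than per car, the sub-assembly groups' question, "what's mileaged out, what needs service, what's still active", becomes a simple query against the same data the lap logger writes.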
Erik: You've got complex process requirements, and you need to coordinate centrally, to an extent, what information is captured and how this process is controlled. But you're also talking about citizen development as a core concept here, where you say, look, people are solving the problems one way or another; I want to give them a common platform where they can do this. I'm curious, how do you think about reconciling that tension? When do you say, look, this is what we need to do, I don't care if you think left or if you think right, we've got to do this, versus, you know, folks coming to you saying, hey, thanks for this set of tools, here are the things that are important to me? How do you manage that? Kevin: There's a balance that you have to walk. I don't know that there's one prescribed way. I would imagine every organization's a little different on this, but for us internally, I think there's a general recognition that there are certain processes and certain strategies that we invoke that are just required. And, um, there's really little debate over those things. And so you build the systems around those processes to take care of it. Where I think citizen development really comes in is that there's a little bit of freedom, so to speak, in any process. You know, you can say, hey, we need to capture this information at this time in this manner, and you can be very rigid about it. And then you just have a robot working for you, and you're not really encouraging creativity or continuous improvement. That guy's just coming in, clocking in, doing his job, and going home. That's not really what professional motorsports is about. That's not what RFK is about. We're very much CI/CD-minded. We kind of have this philosophy of trust is assumed, not earned.
When we hire people, on the front side of employment we do our level best to make sure that we've vetted them out and that we've got the very best people for the positions that we can, and they come in on day one with trust. And the only way that trust is lost is if something negative were to take place which eroded that confidence. Otherwise, trust is assumed. And so, with that assumption of trust, it opens up this whole world of citizen development. You can trust your people, that they've got kind of the right frame of mind. They know where the general direction of the company is headed. They know, in their specific role, the things that they're looking to accomplish, along with the company goals. And you kind of push that continuous improvement. All the while, yes, we have managers in place, and we have database managers in place, and, you know, we've got all kinds of things scattered throughout the company. We're kind of watching over this, and we do it with guardrails, right? You kind of watch, and you want people to feel like they have the freedom of Excel, yet the data is all connected together and working together in a cohesive manner, and when it gets off the rails a little bit, you jump in and you fix the problem. But I'm very much a believer that, within reason, and this is kind of a balance we're striking, within reason, trust your people, hire them right, give them the, uh, the rope to go make you better, and don't put a thumb on them. Erik: Well, let me ask you then. NASCAR's been on a journey, going from competitive advantage, at least on the car that gets put on the track come Sunday, being derivative of the engineering decisions that are made through the fab process. And that's shifted, uh, with this most recent next-gen car, to where the actual manufacturing operation is where you get your competitive advantage.
And we talked a little bit about how you run your operation and the role of technology in making sure that you're putting the best car out there every Sunday. It sounds like you've been on quite a journey. I think it's been a pretty good season. I think we can be pretty happy with the results so far, but I know we're not done. And so my next question is, you know, look, I know you guys have made a ton of progress. I also know that you're speeding up, not slowing down. I don't mean to be too punny here, but I can only help myself so much. But I know you guys are in no way slowing down. What's next for you guys? What are your next set of priorities? And what are you doing to take it to the next level, to be even better prepared for tomorrow? Kevin: Yeah, so if you look at the evolution of our technology, we've been on this journey really since the beginning of '23. 2023 was largely a focus on competition tools. So think simulation, race strategy, the things that we put in the hands of our race engineers and crew chiefs to go do their jobs week in and week out. And we made a lot of progress through '23 on that. That work never stops, so we're continually working on those things. But we did shift priorities a little bit for '24, in that we knew our operations technology needed to really improve. And so here we are, eight months into '24. I think we've made a lot of progress, and a lot of what I would call the blocking and tackling of technology has been established within the group, with the operations guys. But when you look at what's ahead and what we are hoping to do down the road: right now, I would say we've achieved a state where we're in a much better position of collecting information, collecting data, and presenting that data where it can be used. I want to do more with the data. I want to really start driving operational decision making, operational change, with the data.
I think there's a lot we can do around optimization of build schedules. You know, when do you run certain parts, and at what track? There's a tremendous amount that we can do there. There are some financial elements that I think we can work towards to try to optimize, again, trying to reduce the costs of operating cars over a season. Now that we're starting to collect data in a better format, and it's not siloed but all together, that is really our next step. And I think we're going to start in on some of that in '25. Erik: Well, it's exciting. I sure appreciate you joining us for the podcast today. I know our audience is going to enjoy it. It's a different perspective, maybe not as different as they would have thought; it turns out racing cars and running a top-tier manufacturing operation are, in this case, one and the same, it would seem. But again, really appreciate you taking the time to speak with us today. Kevin: Likewise, Eric. Announcer: Thank you for listening to the Augmented Ops podcast from Tulip Interfaces. We hope you found this week's episode informative and inspiring. You can find the show on LinkedIn and YouTube or at tulip.co/podcast. If you enjoyed this episode, please leave us a rating or review on iTunes or wherever you listen to your podcasts. Until next time.