Why NPS and CSAT Hurt More Than They Help

“How likely is it that you would recommend our company to a friend or colleague?” You’re not alone if these 15 words are seared into your brain. Why? Because everyone from your local auto repair shop to countless websites asks you the exact same question.

The culprit? The Net Promoter Score, which Fred Reichheld of Bain & Company introduced in 2003. Since its introduction more than 20 years ago, NPS and similar customer satisfaction (CSAT) scores have become common methodologies for companies in many industries to identify their happiest customers.

In this episode of AI Knowhow, Courtney, David, and Mohan explain why it’s time for NPS and CSAT scores to be supplemented by more sophisticated ways of gauging customer sentiment and gaining client intelligence.

The idea behind these surveys is that if companies can identify what pleases their raving fans, they can do more of the same. It’s a great idea that had its day in the sun, and it can still be a valuable source of input for companies going forward. But the team believes that, as currently constituted, these types of surveys can actually do more harm than good.

In unpacking why it’s time for more sophisticated methods of gauging customer satisfaction, the team starts with identifying some of the inherent problems with these surveys. NPS and CSAT scores can become a crutch that is relied on far too heavily to run the operations of a business, which wasn’t their original intent.

The data these surveys provide is often stale, not nearly as real-time as businesses need it to be in order to act on it. The data is also anecdotal, and it gives an incomplete picture of true customer sentiment, since far more stakeholders have ties to your organization than you’re actually getting feedback from.

As David recounts in one notable example, surveys may be delivered in a manner that overinflates the scores that come back, and the responses provide far too narrow a window into true customer sentiment.

David and Mohan say that now is the time for companies to evaluate and implement smarter ways to gauge customer sentiment. For B2B services companies, the best place to start is to tap into their unstructured data sources like email, Slack, and video calls to get a sense of where they stand with customers. AI is ideally suited to undertake this task, which is simply impossible for a human—or an army of humans—to do on their own.
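To make that concrete, here’s a minimal sketch of what AI-assisted sentiment scoring over that kind of unstructured data could look like. Everything here is illustrative rather than prescriptive: the message export format is hypothetical, and the off-the-shelf classifier is just a stand-in for whatever model a real client-intelligence pipeline would use.

```python
# Minimal sketch: scoring client sentiment from unstructured messages.
# Assumes messages have already been exported (e.g., from email or Slack)
# into a list of dicts; the model choice is illustrative, not prescriptive.
from collections import defaultdict
from transformers import pipeline  # pip install transformers

# Hypothetical export: one dict per message, tagged with the client it belongs to.
messages = [
    {"client": "Acme Co", "text": "Thanks, the latest release fixed our issue!"},
    {"client": "Acme Co", "text": "We're still waiting on the report you promised."},
    {"client": "Globex", "text": "The onboarding has been frustratingly slow."},
]

# Off-the-shelf sentiment classifier; any comparable model would do.
classifier = pipeline("sentiment-analysis",
                      model="distilbert-base-uncased-finetuned-sst-2-english")

scores = defaultdict(list)
for msg in messages:
    result = classifier(msg["text"])[0]
    # Map POSITIVE/NEGATIVE labels onto a signed score in [-1, 1].
    signed = result["score"] if result["label"] == "POSITIVE" else -result["score"]
    scores[msg["client"]].append(signed)

# A rolling average per client is a crude but continuously updatable signal.
for client, vals in scores.items():
    print(f"{client}: {sum(vals) / len(vals):+.2f} across {len(vals)} messages")
```

The specific model matters less than the shape of the workflow: a signal like this can be recomputed continuously from communication exhaust instead of waiting months for survey responses.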

B2C and product companies likely already have all the data they need in their product analytics. They can look at the data to see if customers are returning, actively using their product, completing purchases, where in the funnel they’re dropping off, etc.
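As a rough sketch of what mining that product data can look like, here’s a toy funnel computation over a hypothetical event log (the schema is invented for illustration; a real analytics export will differ):

```python
# Toy sketch: where do users drop off in the funnel?
# The event-log schema (user_id, event, ts) is hypothetical.
import pandas as pd

events = pd.DataFrame([
    {"user_id": 1, "event": "visit",       "ts": "2024-05-01"},
    {"user_id": 1, "event": "add_to_cart", "ts": "2024-05-01"},
    {"user_id": 1, "event": "purchase",    "ts": "2024-05-01"},
    {"user_id": 2, "event": "visit",       "ts": "2024-05-02"},
    {"user_id": 2, "event": "add_to_cart", "ts": "2024-05-02"},
    {"user_id": 3, "event": "visit",       "ts": "2024-05-03"},
])

# Count unique users reaching each stage, then conversion between stages.
stages = ["visit", "add_to_cart", "purchase"]
reached = {s: events.loc[events["event"] == s, "user_id"].nunique() for s in stages}
for prev, curr in zip(stages, stages[1:]):
    rate = reached[curr] / reached[prev] if reached[prev] else 0.0
    print(f"{prev} -> {curr}: {reached[curr]}/{reached[prev]} users ({rate:.0%})")
```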

For this week’s interview segment, Doug Meiser of AMEND Consulting joins Pete Buer to discuss how AI and data science can be used to drive operational excellence. AMEND is a management consulting firm focused on middle-market manufacturing that also helps companies in private equity, construction, real estate, healthcare, and financial services tap into the power of AI to drive process improvement, profitability, customer satisfaction, and growth.

Doug emphasizes focusing on the science side of data science by identifying, documenting, and iterating on an organization’s routine practices to understand the problems your customers are asking you to solve at a deep, fundamental level. Only once you do that can you craft the kinds of solutions that AI is perfect for.

Watch or listen to the full episode via the embeds below!

Watch the Episode

Watch the full episode below, and be sure to subscribe to our YouTube channel.

Listen to the Episode

You can tune in to the full episode via the Spotify embed below, and you can find AI Knowhow on Apple Podcasts and anywhere else you get your podcasts.

Show Notes & Related Links

This transcript was created using AI tools and is not a verbatim transcript of the episode. Please forgive any spelling and grammar errors that may be included. 

Courtney: On a scale of one to 10, how likely is it that you would recommend this podcast to a colleague or a friend? Thank you so much for your rating of 10. You’re probably familiar with that question because we’ve all been asked it a lot. It’s called the Net Promoter Score, NPS. It was created in 2003. That’s more than 20 years ago, by the way.

Today we’re gonna tell you why that score may be hurting you more than helping you. 

Hi, I’m Courtney Baker, and this is AI Knowhow from Knownwell, helping you reimagine your business in the AI era. As always, I’m joined by Knownwell CEO David DeWolf, Chief Product Officer Mohan Rao, and Chief Strategy Officer Pete Buer. 

We’ll also have a discussion with Doug Meiser of AMEND about how AI can be used to drive operational excellence.

Courtney: But first, get on those safari hats and bust out your binoculars because it’s time to bring back AI in the wild.

Courtney: Pete Buer joins us for another AI in the Wild. Hey, Pete.

Pete: Hey Courtney, how are you?

Courtney: I’m doing good. So spring is upon us, and that brings the time-honored tradition of graduation. One university in New York made headlines recently when an AI-powered robot spoke at its commencement ceremony. Pete, what can you tell us about this one, and what’s the takeaway?

Pete: Okay. For a little bit of context, this took place at D’Youville University in Buffalo, New York. An AI robot named Sophia took a few questions from the student body president and shared some, shall we say, quote unquote life advice, um, based on an amalgamation of other commencement addresses. Here’s how it sounded.

Sophia: What are the most shared words of advice that commencement speakers give to [00:02:00] college graduates? 

Sophia: As you embark on this new chapter of your lives, I offer you the following inspirational advice that is common at all graduation ceremonies: embrace lifelong learning, be adaptable, pursue your passions, take risks, foster meaningful connections, make a positive impact, and believe in yourself.

Pete: So on the one hand, this is a great encapsulation of what this generation of grads is going to experience, what we’re all gonna experience: how to live side by side with machines that can do, at the end of the day, a lot of the work of humans.

And also an appreciation for the challenge, not just to coexist, but to integrate and optimize the combined contribution of humans and machines. On the other hand, and maybe to that second point, this may also be a great encapsulation of how to get it wrong. It strikes me that a commencement address is kind of the sort of event where the human experience is the one [00:03:00] that matters most, and it kind of can’t be replaced by a machine.

By the way, 2,500 people on campus who felt the same way signed a petition trying to head it off at the pass. And how much value was there, really, in a generic consolidation of every commencement address that ever was for this group of attendees? I think there’s a big-picture risk worth all of us thinking about with AI: that it drives us to some vast, average, vanilla middle ground where original thought used to exist. Now, then again, if you’re on the PR team for the Kansas City Chiefs, a nice, controlled commencement address might be looking pretty good to you right now. No doubt the event will have been memorable for those who attended.

I guess I just wonder if it will be memorable for the teaching lesson, the messaging, the controversy, or [00:04:00] the gimmick of AI.

Courtney: Pete, I gotta tell you, this one really irks me. Um, when I found out that we were gonna be talking about this, I immediately was like, you know, the problem is we want the technology to elevate humanity. And the problem with this situation is, it’s not elevating humanity. It’s just replacing something that somebody most likely would’ve really enjoyed doing, and the audience would’ve enjoyed hearing about the life experiences of a really great commencement speaker.

Pete: Right.

Courtney: not only that, Pete, but Sophia, I gotta say, since she’s not a human, I feel fine throwing this barb, but she looks creepy. She looks creepy. Okay.

Pete: Yeah. Right. I don’t get the feeling this was a commencement address that people left, you know, wiping their eyes on the way out the door, right? Which is, I think, what you want to feel.

Courtney: Yeah, so I just think it’s helpful. Again, I feel like this is an [00:05:00] example of where the technology is being used not to elevate humanity, but just to replace something that we actually really love doing. Um, and that’s where the technology feels like it’s taking a role that I think, as a society, we need to say, like, nope, that’s not what we’re gonna do.

Pete, thank you as always.

Pete: Thank you. Take care. 

Courtney: What if I told you that a 20-plus-year-old methodology for measuring customer satisfaction was still being used to drive how you think about your customers? Would you be surprised? Probably not. I talked with David and Mohan recently about how new ways of measuring customer satisfaction are in the works as we progress into the AI era.

Courtney: Today I wanna pick a fight.

Um, and this time you two, I’m [00:06:00] not trying to pick a fight between you two.

David: She’s gonna fight with us.

Courtney: No, no, not even with me today. I wanna pick a fight with NPS scores and CSAT scores. And you know, if you listened to our last episode about hype versus hope, I think this might help people take a second look at something we’re just so used to. We use these tools to find out what’s happening with our clients or with our customers, but, not to lead the witness, they’re actually not really great. We’re just so used to them. And so today, I think this is an area of hope where AI could really help us change how we think about understanding our customers. So I would love for you two to help break down, you know, how those are being used [00:07:00] right now. Most of our audience is gonna be very familiar with them, and why they are broken.

David: Well, I mean, I’ll just start. Um, I would sum it up. Let’s take NPS as an example. Um, Net Promoter Score is a helpful tool. It’s not an operational tool. Um, it doesn’t actually help you drive, execute, and hold people accountable for actual activities within a business, and unfortunately, void of anything else, we’ve adopted it as one.

Now why is it not an operational tool? Uh, first and foremost, the vast majority of the time it’s stale. Um, it is a survey that goes out. You cannot ask your customers a question every single day and have real-time data on it. So typically, at the very least, it’s three months stale; usually it’s six or nine months stale in most organizations I am aware of.

Um, and so that’s problem number one. Problem number two is it’s anecdotal. Um, it is simply what somebody says at some point in time. Um, and that can be highly influenced by all [00:08:00] sorts of things, um, not the least of which is simply the person’s personality, right? How do you compare one person’s answer? How many times have you heard somebody say, I just don’t give out tens? Yet how many times have you seen somebody else give out a 10? What does it mean? I don’t know. Right? So there’s no standard. It’s anecdotal; it’s not actual data. So I like to think of it as a compass. A compass is helpful. It tells me kind of the direction I need to head, but it’s not a roadmap. It’s not telling me what turns I have to take, what steps I have to take, how to move forward.

And I think that’s the fundamental problem with it: we’re using it as an operational tool, and it’s not.

Courtney: I do think there are some e-commerce business use cases where, you know, like if you’re on Gap’s website and they’re asking you, that to me is a little different than what we’re talking about. Uh, for our audience, we’re more specifically talking about professional services.

David: That doesn’t mean we throw it away, right? Like, compasses are still used. Maybe not the kind that you had when you were in scouting as a kid, [00:09:00] right, manually in the woods. But we use compass technology without a doubt. And like we said, it’s embedded in modern cars, uh, telling you you’re headed east. It’s not that it’s irrelevant.

It’s that it’s not an operational tool. People are using it for things it wasn’t designed for. And so we can use it, but stop relying on it to actually drive the activities of your business, right? We need something more modern with higher levels of fidelity. In all areas of life, we have added more and more technologies to help us execute better. Let’s do that here. Let’s make smarter decisions, drive execution, and hold people accountable with things that were designed for that, as opposed to something that wasn’t. To your point, it works for other businesses. Even for whatever business you’re running, it’s relevant as a measure of brand sentiment, right? As an example.

Mohan: Yeah, I think there is, uh, limited value, right? The example that you gave [00:10:00] about e-commerce makes sense. Sometimes when you go to the grocery store, they’ll have the frowny face and the happy face and ask, how was the experience? Um, you know, it’s valuable because that is happening in the context of the transaction, right? As you are buying on an e-commerce site, you’re being asked for it. That makes sense. What makes zero sense is sending a survey six weeks after, for it to kind of get passed on to somebody else to fill out, where no context is captured, and getting a number back. That is just dumb data.

David: Well, and not only that, I mean, I’ve got a real story here. So this is funny; you just brought up a memory. Um, several months back, uh, one of my kids, um, had a little accident with the car, and I had to get some body repair done. And, uh, checking out of this body shop and getting my keys back, the receptionist literally said to me, a [00:11:00] survey is coming, and my compensation and the reviews that we get and all these things, this heartfelt sob story, depend on me getting a 10. Will you please make sure that you answer 10? And I thought, you’ve gotta be kidding me. This is not valid whatsoever. Um, it’s not helpful. And that’s what I mean by it’s anecdotal, right? Like, if I’m just a good guy and I don’t really care about the operations of that body shop, but for some reason my heartstrings were pulled the right way, that lady’s getting a 10, even if it was a three. Right? That’s what we’re dealing with here.

Courtney: And you’re talking about a body shop, but I think that still applies in professional services. The three of us were actually with a host of executives from professional service companies, and they talked about this very specific thing: account managers know when to send these surveys. They’re sharp people. They know when they’ve just launched a project that went really well, and that’s when they’re gonna [00:12:00] send the survey. Um, even to that point, these executives are spending all of their time trying to figure out how to structure these surveys so that they get better data, but they know it’s not good and it’s incomplete. I mean, I don’t know the exact rates on how often you even get these completed within a professional service firm.

David: Well, the other thing that I would add to that is, I hear anecdotes all the time about organizations who, even when they have a good rhythm around the NPS score or the CSAT surveys that go out, even when they mandate a certain rhythm and don’t let the account managers control it, even then, they’re still surprised by churn of their clients.

Courtney: Yeah. 

David: They still miss opportunities. It’s not helping them build a stronger commercial relationship, a healthier commercial relationship. That has nothing to do with somebody’s anecdotal sentiment; that [00:13:00] has to do with what is actually happening on the ground. Is there a perception of service quality that is worthy of the relationship continuing? Is there a healthy, multifaceted B2B relationship between the parties on either side? Right? Do we respond in a positive way to events, um, that may otherwise disrupt? Uh, those are the types of things that matter, not somebody’s opinion of whether I’d refer you or not.

Mohan: And in a digital world, the best way to do this is to make these metrics come out as a byproduct of customers using your service, right? That’s just the bottom line. For example, software businesses don’t necessarily depend on these things anymore because their fundamental truth comes from application usage. Uh, do you love using it or not? How many people log in every day? How long do they stay? Same with e-commerce sites as well. You can send all the surveys, but the truth is in the data [00:14:00] of app usage, and you can see where that’s going. And in the new digital AI world, we’ll be able to do this for every business, right? The satisfaction has to come from people just using it, uh, more and more deeply. And how do you measure it? How do you build metrics around it? And how do you build an operational framework for that?
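To make Mohan’s point concrete, here’s a toy sketch of satisfaction signals that fall out of usage data itself, no survey required. The session table is hypothetical; any product’s telemetry could feed the same two questions, how many people show up and how long they stay:

```python
# Toy sketch: usage as the satisfaction signal. Session data is hypothetical.
import pandas as pd

sessions = pd.DataFrame([
    {"user_id": 1, "day": "2024-05-01", "minutes": 34},
    {"user_id": 1, "day": "2024-05-02", "minutes": 41},
    {"user_id": 2, "day": "2024-05-01", "minutes": 3},
    {"user_id": 3, "day": "2024-05-02", "minutes": 18},
    {"user_id": 3, "day": "2024-05-03", "minutes": 22},
])

# How many people log in every day, and how long do they stay?
daily = sessions.groupby("day").agg(active_users=("user_id", "nunique"),
                                    avg_minutes=("minutes", "mean"))
print(daily)

# Per-user engagement: days active and average session length,
# trackable continuously instead of surveyed six weeks after the fact.
per_user = sessions.groupby("user_id").agg(days_active=("day", "nunique"),
                                           avg_minutes=("minutes", "mean"))
print(per_user)
```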

Courtney: Okay, so we’ve outlined a pretty big problem here. I want to just open the gate on how AI is going to help us solve this, uh, move past this probably-not-perfect tool that we have been using, and, like, get to the root problem. How can AI help us solve that?

Mohan: I think the way to think of it is every company is gonna be a digital company in the future. If you’re a manufacturing company, you’ve got your processes that are essentially running in a digital fashion. Um, you know, the metrics that are coming out of the machinery are digital. The [00:15:00] sensors are there. In retail, there is, uh, usage data in terms of retail analytics. Can you create a satisfaction score that is directly a byproduct of the usage of your services and goods? That’s where sort of the future is in terms of true measurement of customer sentiment.

David: The other thing that I would add is, I think, um, the vast majority of organizations have tons more information than they’re even looking at, already sitting as exhaust of their business. So yes, we can instrument, um, you know, our manufacturing machines and do things, and we can push for more data points, but let’s not forget all of the information that already exists, right? So we’ve seen this move in recent history from structured data to unstructured data. So we’re starting to manage more, but we still have gobs and gobs and gobs of emails that we send back and forth. We still have [00:16:00] video conferences and telephone calls that we can now get transcripts for, and we can process this data. Um, we still have Slack messages that fling back and forth like crazy, distracting people all day long. Um, like, tons of information. 85 percent plus of the information that exists within an enterprise is in these formats, and we’re not even looking at it yet.

It includes so much rich information that we can tap into, and we can convert that, to your point, into real-time data. While the service is being provided, we can look at those clues, and because it’s communication, it’s just happening; it is the real-time reaction and perception that the client is having as the service is being delivered.

And so we can look at those clues, look at those breadcrumbs, and then leverage AI to do what it’s great at: connecting the dots and finding signal in noise. Right? And that is [00:17:00] where I think we unlock human potential, by really honing in on the true operating metrics and insights that now give us fuel to act. Because this is happening in real time, I as a leader can go out and do what I’m great at, as opposed to stumbling around with a survey, trying to figure out how do I actually operationalize this.

Mohan: That’s such a good point. Um, you know, leveraging the investments you’ve already made, right? Everybody’s got email; everybody’s got a calendar. Most companies have CRM systems. Most companies have ERPs. These are all digital investments you’ve already made, and just mining the data in them is a great first step.

Courtney: And I think this will also solve for, and I think you two have probably both experienced this, when you have had these surveys and you feel confident in the commercial relationship continuing because you’ve had all of these great scores, and then all of a sudden [00:18:00] that client leaves and you’re like, hold on. Uh, you gave us all tens on all these surveys. I mean, we just know it’s broken. There’s not enough there to have confidence about whether or not a relationship is gonna continue.

David: Yeah, I remember when I was running 3Pillar, uh, we actually had a client that left. I can’t remember how many data points we had from this organization, but it was all nines and tens, and they loved us. They literally terminated the contract and then made a referral to another customer the very next day. Right? And it was like the funniest thing in the world. It was like the proof point that NPS doesn’t matter. They can love you, they can want to refer you, and yet not have a need for your services anymore. Right? And this was in a business where our average tenure serving a client was five years. It’s not like we were in some, you know, project-based business where things just come and go. I think that’s the point, and you just hit the nail on the head. Um, we need a new model, Courtney.

Courtney: David, Mohan, [00:19:00] uh, thank you for taking on the behemoth of the CSAT and the NPS. Good day. I’m just kidding. Good fight.

Earlier this year, we gathered some of the smartest and brightest executives from professional service firms to talk about what’s actually happening with client retention in those firms today. It was such a robust conversation that we wanted to make it available to more of you.

So if you work in a professional service company, we want to give you access. You can watch it right now at knownwell.com/executive-roundtable. We hope you enjoy it.

Pete: So [00:20:00] Doug, uh, welcome. We’re so glad to have you on the show.

Doug: Thanks, Pete.

Pete: Just to give our listeners a little context for the conversation that’s about to flow, um, could you share some background on AMEND Consulting and your role?

Doug: Yeah, sure. AMEND was founded in 2005 as a management consulting firm focused on middle-market manufacturing. Craig and David focused from the very beginning on people, processes, and systems. In 2016, they introduced, uh, BI and quickly evolved into data science. We now work across OpEx, FP&A, BI, technology, and data science, and we’ve grown from manufacturing into private equity, construction, real estate, healthcare, financials, just to name a few. My role at AMEND is to oversee the custom AI and data science practices.

Pete: Well, as you know, AI is our focus, and on this episode the, um, topic is driving operational excellence with AI. Um, [00:21:00] I’m hoping you might be able to share a couple examples of work you’ve done with customers to that end.

Doug: Yeah, sure. Um, and it’s such an awesome point. Inside of data science, there are two important points, as we get started in this, that people tend to leave out, right? First and foremost, are you working on the right problem? Way too often, we find clients call us in and very quickly the data points to something they’re not considering. And this has gone on for years, right? Eliyahu Goldratt’s book The Goal is a staple at AMEND and should be a staple for everybody inside of our field. I can’t tell you how many times we’ve walked into a manufacturing plant or a warehouse, and there have been continuous improvement teams working for years to get greater throughput, and they’re not working on the bottleneck in the area. Uh, so we have multiple different simulations we’ve done to show very quickly: this is your bottleneck; this is how to improve it.

The second thing that’s super important is custodial control, a term coined at FedEx. Did you [00:22:00] do what the system asked you to do? And if not, are we strong enough from a data perspective to see who is better, the system or the human? When you have those two, it’s an awesome feedback loop where the performance starts to drive the next set of behavior, the next set of model changes. And so there are lots and lots of examples we have of making sure you have custodial control built into a system. You walk into a Kroger store, um, there are three yellow balls at checkout. Every store in the company has them. Those three yellow balls are based on infrared systems over the entrances, over the exits, over checkout, measuring wait time and predicting how many people are coming in. At the end of the day, the store manager gets a day-after report and sees what the queue time was in real time. They know how many lanes were open versus what the system was telling ’em to open. Right, so custodial control goes all the way down the management capabilities. You walk into the general office and you can see, in a real-time queue, how the stores performed yesterday across the entire country. It’s an amazing instance of [00:23:00] driving, like you said, operational behavior. It doesn’t happen with AI by itself, right? AI has to work with the people and the processes. AI by itself delivers absolutely zero value.
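As a rough illustration of the custodial-control idea, here’s a minimal sketch comparing what a scheduling system recommended against what operators actually did. The columns and the adherence metric are hypothetical, not AMEND’s actual implementation:

```python
# Rough sketch of "custodial control": did operators do what the system
# recommended, and who performed better? All column names are hypothetical.
import pandas as pd

log = pd.DataFrame([
    # One row per order: what the system scheduled vs. what the operator ran.
    {"order": "A100", "system_slot": 1, "actual_slot": 1, "on_time": True},
    {"order": "A101", "system_slot": 2, "actual_slot": 4, "on_time": False},
    {"order": "A102", "system_slot": 3, "actual_slot": 3, "on_time": True},
    {"order": "A103", "system_slot": 4, "actual_slot": 2, "on_time": True},
])

log["followed_system"] = log["system_slot"] == log["actual_slot"]

# Adherence: how often the operator did what the system asked.
print(f"Adherence: {log['followed_system'].mean():.0%}")

# Performance split: on-time rate when the schedule was followed vs. overridden.
print(log.groupby("followed_system")["on_time"].mean())
```

The gap between those two on-time rates is exactly the feedback loop Doug describes: it tells you whether to trust the system more, or to fix the model.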

Pete: Okay, cool. So these are the two, um, often forgotten points. Um, are there other stories? I guess you’ve shared one with Kroger, but are there other stories where customers have kind of gotten it right, uh, in deploying AI?

Doug: Oh, absolutely. So one of the typical projects we get called in a lot on is scheduling and capacity work. And I remember, after one of our go-lives, the customer called us up and said, for four weeks now, we’re up on production by eight and a half percent. And what was magical for him was he had been going through and expediting orders, expediting orders, and we walked through a scenario and just kind of showed, hey, if you trust this system, you are actually gonna reduce your lead times and not have to expedite at all. [00:24:00] And so we were stacking orders so much smarter. There were millions and millions of combinations. A human can’t consider all of ’em, but the machine could now stack everything, and once he convinced the operators to start using it and expediting less, he was able to deliver both better on-time performance and much improved throughput through his machines. And it’s very typical inside the space. But then what we were able to do after the fact, on the custodial control front, was ask: was the operator following the schedule the system was creating?

Pete: Okay. 

Doug: were they expediting? Was it a good expedited order or not? 

Pete: Cool. 

Doug: got many, many examples of early warning systems, people trying to be more proactive, uh, and, and pulling a bunch of data from a bunch of different places and saying, what’s about to happen to your operations?

Pete: we have a, a focus at Knownwell on, um, professional services. Uh, are there examples from that space where you’ve helped to, improve, you know, the operational efficiency and effectiveness of their workflows?

Doug: Yeah, so [00:25:00] there are multiple different examples, um, inside of, like, technology organizations. That’s more on the how-the-technology-works side: getting into, you know, making sure you have code repositories, teaching ’em the basics of sharing code in a better way. And the two typical focus points are reusability and repeatability. You know, we call it data science, and the science term is super important. Science is based on, you know, repeatable practices, repeatable math. It’s amazing how often you find that things aren’t documented or archived well enough that you can consider them actually repeatable. If you don’t save off the data, the methodology, the code, at that point you can’t call it science anymore.

Pete: When I think about, um, project implementation, all of it’s hard, and we have evidence that suggests transformation efforts fail all the time. [00:26:00] But if I, like, massively oversimplify it to getting the problem and the solution set up in the first place versus implementing and driving to results, which half of that equation do you worry about the most for customers?

Doug: Yeah, the final step, and it seems the larger the company, the more people we have to get on board.

Pete: Sure.

Doug: We follow basically the CRISP-DM data science methodology, right, which is a super, super iterative process: making sure you have a really well-defined problem, going through and making sure the data is well understood, and always reevaluating your understanding of the problem, its impact on the data, and vice versa. By the time we get to the end, we should have been working with the key users we’re gonna be working with. And then implementation always keeps us up at night, right? Our job’s not done till we get across the finish line, and that finish line isn’t a product. The finish line is the product being used, driving results. So there’s always more opinions once you go to flip [00:27:00] that switch and get it turned on.

Pete: Do you have any particular secret sauce or best practice in making sure the solution actually gets used and benefits get captured?

Doug: So, Pete, I love this question. Um, part of what we try to do is make sure we have a standard process and a checklist for our data scientists about making sure the problem definition’s really well defined and the data’s really well understood.

They then do a science review to review all the literature in this space and stand on the shoulders of giants. We’re not the smartest people in the world on every single topic, so leverage them. 

Pete: Nice. 

Doug: Then we get into the modeling phase, and at that point you better know what implementation looks like. Too often we find data scientists build models without understanding implementation.

Pete: Hmm. 

Doug: If you don’t have the speeds and feeds in the right place, it becomes impossible to implement something that you think is a nice, great model. And then on the implementation side, everything we just discussed on custodial control: make sure you have measurements in place for how [00:28:00] the system’s performing versus how the operator’s performing. What’s really cool about that is it creates a roadmap for future opportunities. When you see that gap, it’s that classic waterfall chart that says: if you release this product, the operators can perform better; if you release this feature, you can get this much more performance.

Pete: And presumably you’re involving the operators from day one.

Doug: Absolutely. The later you get ’em involved in the process, the more likely you’re gonna be, um, blindsided. Um, and it’s just not a fun place to be late in that stage of a project.

Pete: Right. Having to convince them about something that’s happening to them as opposed to involving them in something that prospectively can help them. Right.

Doug: Yeah, I can’t tell you how many times, um, a data scientist goes onsite, and they realize they’ve been talking past each other. When they get their eyes on it, they realize they didn’t understand it to the level they needed to. And the best projects in this space are when everybody’s trying to make it better and seeing it the same way.

Pete: We get a lot of questions [00:29:00] from customers, usually in the business case shaping phase, about how to think about ROI. How do you set it up, describe it, calculate it, make it convincing? Um, surely you’ve been down this path many, many times. How do you think about it?

Doug: Early and often. At the end of the day, I think financial models are one of the tools data scientists don’t use enough. What I mean by that is, if you work side by side with the financial department, your finance team, your accounting team, and you look at the ROI and the assumptions built into it, it allows you to identify what the key risks are on the project. And so you learn to test those assumptions really quickly and figure out how to test them, then make sure on the back side that you’re getting whatever change it is to hit that exact ROI or IRR, however the finance team has actually measured it. And when you get all that magic, the teams you’re working with never wanna [00:30:00] let you go.
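A minimal sketch of the assumption-testing Doug describes might look like the following, with every number invented for illustration: vary one key assumption (say, how much of the projected benefit operators actually realize) and watch how the IRR moves.

```python
# Toy ROI sensitivity check; all figures are hypothetical.
import numpy_financial as npf  # pip install numpy-financial

upfront_cost = -250_000       # year-0 project cost
projected_benefit = 120_000   # claimed annual benefit, per the business case

for adoption in (0.5, 0.75, 1.0):  # fraction of the benefit actually realized
    cash_flows = [upfront_cost] + [projected_benefit * adoption] * 3
    print(f"adoption {adoption:.0%}: IRR = {npf.irr(cash_flows):.1%}")
```

If a 25-point swing in one assumption flips the IRR from attractive to negative, that assumption is the risk to test first, which is exactly the conversation to have with the finance team.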

Pete: Thank you. It’s been fascinating, so, uh, I’m appreciative to have you on the show.

Doug: Pete, thanks for hosting, and nice to meet you.

Courtney: Thanks as always for listening and watching, and since you’ve already given us a 10 rating on our NPS score, you might as well give us a five-star rating on your podcast player of choice or like us over on YouTube.

It really helps more people find the show. At the end of every episode, we like to ask one of our large language model friends to weigh in on the topic at hand. 

Hey, Claude, what’s happening? Courtney here. We’re sick and tired of the NPS score. What do you have for us on how we think about client intelligence?

[00:31:00] 

Courtney: and now you’re in the know. Thanks as always for listening. We’ll see you next week with more headlines, round table discussion and interviews with AI experts. 
