Rapid advancements in artificial intelligence have sparked a transformation in how many businesses operate, requiring companies to reimagine their decision-making frameworks, including where, when, and how humans should fit into the process. In this episode of AI Knowhow, Courtney Baker, David DeWolf, and Mohan Rao delve into a simple yet vital tweak to the classic decision-making model known as the OODA loop, originally developed by US Air Force Colonel John Boyd, that makes it a critical tool in the AI era.
Unpacking the OODA Loop
The OODA loop, which stands for Observe, Orient, Decide, and Act, has been a staple in strategic disciplines for decades. However, the complexities introduced by AI require an adaptation—enter the OODAI loop, where ‘Inspect’ becomes a pivotal addition. This evolved version of the framework adds a step to the loop that encourages retrospection and continuous improvement.
As AI and automation accelerate decision-making, it is crucial to step back and assess the business’ desired outcomes, the results AI has yielded, and any discrepancies between the two. By inspecting AI’s performance, organizations can refine and retrain models, ensuring decisions made by AI systems are continually optimized.
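To make the extra step concrete, here is a minimal sketch of what a single pass through an OODAI loop might look like in code. It is purely illustrative; the objects, method names, and the goal check are assumptions made for the example, not part of any specific product.

```python
# Illustrative OODAI cycle: Observe, Orient, Decide, Act, then Inspect.
# All objects (sensors, model, actuator, goal) are hypothetical placeholders.

def run_oodai_loop(sensors, model, actuator, goal, max_cycles=100):
    for _ in range(max_cycles):
        observations = sensors.collect()        # Observe: gather raw signals
        context = model.orient(observations)    # Orient: put signals into business context
        decision = model.decide(context)        # Decide: choose an action (AI- or human-made)
        outcome = actuator.act(decision)        # Act: execute the decision
        # Inspect: compare the outcome against the desired result and feed the gap
        # back into the model so the next pass through the loop is better informed.
        if not goal.is_met(outcome):
            model.retrain(observations, decision, outcome)
```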
The team uses self-driving cars, like those developed by Tesla, as a practical example of how the OODAI loop works in real-world scenarios. Tesla utilizes a continuous loop of observation, orientation, and decision-making paired with inspection to improve driving algorithms over time and develop safer self-driving technology.
Why Inspect is a Necessary Addition to the OODA Loop
The addition of ‘Inspect’ not only enhances decision quality but also builds confidence among business leaders by providing a clear, actionable framework for deploying AI technologies. As Courtney Baker notes, this structured approach is a powerful tool for navigating the fast-paced AI landscape where the OODA loop moves faster than ever before.
As AI technologies continue to evolve, frameworks like the OODAI loop can provide a vital roadmap for companies aiming to leverage AI’s full potential. By incorporating inspection into the decision-making cycle, organizations can achieve more accurate, effective, and insightful outcomes. Whether driving customer success or improving operational processes, the OODAI loop can be a valuable asset in the pursuit of intelligent innovation and automation.
For executives looking to harness AI effectively, the OODAI loop offers a structured approach to integrating AI into decision-making processes. It allows leaders to strategically determine where AI can add value and where human intervention is crucial.
Guest Interview: Embracing AI in Customer Success
In a compelling discussion with Michael Harnum, CEO of ESG Success, Nordlight CEO Pete Buer explores AI’s role in elevating customer success strategies. Michael describes how AI is often conflated with automation and emphasizes the intelligence aspect, suggesting ways to scale the customer success function using AI without losing the human touch.
Michael shares insights on deploying AI to streamline customer interactions, enhance decision-making, and optimize processes for serving a broader client base than is possible when relying strictly on human Customer Success Managers. “A hundred percent of the job maxes out at 40 customers,” Michael says. “That’s where we are today.” His team is focused on utilizing AI to help companies go from CSMs who can manage 40 customers at a time, at most, to upwards of 100 customers.
By focusing on the fundamentals and automating routine functions, businesses can maintain personalized relationships at scale, ultimately driving customer growth and satisfaction.
Watch the Episode
Watch the full episode below, and be sure to subscribe to our YouTube channel.
Listen to the Episode
You can tune in to the full episode via the Spotify embed below, and you can find AI Knowhow on Apple Podcasts and anywhere else you get your podcasts.
Show Notes & Related Links
- Watch a guided Knownwell demo
- Download a free copy of the OODAI loop
- Connect with Michael Harnum on LinkedIn
- Connect with David DeWolf on LinkedIn
- Connect with Mohan Rao on LinkedIn
- Connect with Courtney Baker on LinkedIn
- Connect with Pete Buer on LinkedIn
- Follow Knownwell on LinkedIn
Is your company equipped to make rapid-fire decisions?
Probably not; most aren’t.
So what if I told you a small tweak to a 75-year-old decision-making framework could help prepare you for the age of AI?
Interested?
Hi, I’m Courtney Baker, and this is AI Knowhow from Knownwell, helping you reimagine your business in the AI era.
As always, I’m joined by Knownwell’s CEO, David DeWolf, Chief Product and Technology Officer, Mohan Rao, and Chief Strategy Officer, Pete Buer.
We also have a discussion with Michael Harnum of ESG Success about how technology firms are using AI to elevate their customer success strategies.
But first, hop in the jeep and throw on those wraparound shades because it’s time for another edition of AI in the Wild.
Pete Buer joins us, as always, to break down some of the latest and greatest in AI.
Hey, Pete Buer, how are you?
I’m good, Courtney.
How are you doing?
Doing well.
This week’s story comes to us from Computerworld, and the headline reads, How Ernst & Young’s AI Platform is Radically Reshaping Operations.
Pete, what do you make of this one?
Well, I recommend the article on two dimensions.
First, for its breadth: it covers a lot of terrain, looks to the future, and offers some interesting prognostications. And second, for its depth.
It goes deep into the case study of work done by Ernst & Young to embrace AI in the day-to-day.
So, on the Ernst & Young story, EY spent a cool $1.4 billion to develop their own in-house generative AI platform.
They ran a pilot, quote-unquote, for 4,200 team members in 2023, and then rolled the platform out to nearly all 400,000 employees of the firm.
And today, they report more than 96% of team members are currently using the platform.
That’s pretty impressive.
And so, as to application, the platform is fueling both internal operational efficiency gains and external customer support plays.
And deliberately listed in that order. EY has a sort of clever name for its deployment approach, called Client Zero, where they test and perfect all applications on themselves first before extending them out to customers.
Seems like good practice.
Lots of other cool stats and discussion in the article, including some predictions regarding artificial general intelligence, so AGI and its threats and promises.
You kind of get the sense that the disruption that we’re experiencing today from generative AI is but a drop in the bucket long term.
Pete, that does feel like a heavy investment.
And probably for most of the executives listening to this, maybe more than they want to invest in an AI platform build out themselves.
But it is really interesting to see how some of these large businesses are choosing and when they’re choosing to develop their own products.
Yeah.
And it’s horses for courses, right?
Huge company, professional services, lots of opportunity to be more efficient and to deliver more value, tons of information, data to leverage.
And so that level of investment sort of makes sense in that context.
And every other company and leader listening will get to their own sweet spot as well.
Makes sense.
Pete, thank you as always.
Thank you, Courtney.
How can a small tweak to a decades old decision-making framework prepare your company to win in the age of AI?
I sat down with David DeWolf and Mohan Rao recently to talk about that very topic.
Mohan, David, I want to talk today about the OODA Loop, and don’t worry if you’re listening and you’re like, what is she talking about?
Don’t worry, we’re gonna give you an explanation in just a minute.
And how it’s evolving in this AI era.
So to start off, Mohan, when I think of an OODA Loop, I always think of you.
Can’t wait to hear the why behind that.
So Mohan, I would love for you to break down, first of all, what is an OODA Loop?
What is it?
What does it tell us?
You know, it really comes from the US Air Force and it’s broadly been used in business and sports and other competitive type environments.
You know, it stands for Observe, Orient, Decide and Act, right?
So it’s an iterative process of going through the OODA Loop.
You gather information in the Observe portion of it.
Then in the Orient portion, you analyze and put it into context because context matters as we know.
And then D stands for Decide, which is make a decision, and Act is implement, right?
There are other types of frameworks as well.
There’s the Deming’s Loop.
They’re all similar.
They’re iterative, right?
So you just can’t act and say, I’m done.
You’ve got to measure the results.
And then you’ve got to go back to observing.
And then you go to Orient, you go to Decide and Act and on and on and on it goes.
It’s a, it’s a loop.
That’s an OODA Loop.
So before we get into how this may be evolving in the AI era, can you, you or David, give an example of when or why this OODA Loop was needed?
What’s an example?
You know, it’s used in strategy, right?
So you have to make a very important decision.
Potentially, it’s about making a large strategic decision in your company.
So what you do is you gather all the information.
It sounds very logical when I say it this way.
You have to gather all the information, you have to analyze the information.
We all know somebody here loves frames.
So this is a frame and you have to put it into context.
That is the hardest part of all of this because decisions made without context are just dumb decisions.
You put it in the context, then once you have the information, you have the context, you have to make a decision, then you have to implement the decision, you have to measure what happened, and you complete the loop again.
I think in the business context, one of the areas that I’ve seen it used a lot is in crisis.
Crisis communications, for example, right?
It’s a great way to stop, slow down in the middle of feeling like you have to urgently do something and just not necessarily take a long time, but be deliberate about where you are in the process, right?
So processes like crisis communications are great ones that use concepts like this to just be deliberate under stress so that you can have improved decision making.
Yeah, that’s really helpful.
And I’m assuming if this was, Mohan, you said originally in the Air Force, was that right?
Came out of the US Air Force, yep.
That’s an OODA Loop and it’s been around for a really long time.
I mean, like a century probably, I don’t know.
Just make up facts, that’s fine.
I really think I looked it up at one point and it was a long time.
Okay, so that’s an OODA Loop.
Now, we are proposing in this AI-
75 years actually.
Oh, thank you.
Okay, so it’s been around for 75 years.
We are actually proposing in this next era, there’s a really important addition that needs to be made to the OODA Loop to make it a really viable frame to use in the AI era.
David, do you want to introduce the new addition to the OODA Loop and why it’s important?
Yeah, so as with all things AI, if you have an A, you have to have an I, right?
And so it’s critical in this era that we don’t just add I’s, but there’s a reason for it.
And our reason is really simple.
In this world of increased automation and reliance on the computers, we need to inspect.
We need to actually look at the end of this cycle and go back and reflect and do a retrospective to say what happened when we took that action, right?
And so it’s critical when you’re building platforms that rely on artificial intelligence that you remember the I and that you get to the end and you have this additional step that says, let’s inspect that final process.
Look at all four of those steps, digest what the action was and whether it was the appropriate action, so that we can then retrain models and iterate and improve over time.
Just to reiterate, the OODA Loop is now the OODAI Loop.
That’s right, exactly.
I almost said it with an accent, but I think I kept it.
It sounds very Australian if you say it very fast without laughing.
OODAI Loop.
OODAI, mate.
OK, so we’re proposing this addition to the OODA Loop.
And are y’all proposing this because some of the things happening in the traditional OODA Loop are now being done by AI, where previously this whole loop would have been done by a human?
Is the I being added because now some of these steps AI is going to handle for us?
Exactly.
I think there are two things happening in the AI era with the OODA Loop.
We’re going to reinvent the OODA Loop.
One, the speed at which the OODA Loop is spinning is much, much, much faster, right?
Because gathering information and putting it to context is more possible in faster speeds than ever before.
Then there’s a question of who is making the decision.
Is it the machine making the decision or is the machine helping the human make the decision?
Nevertheless, that is being aided by a machine or being made by the machine.
You need some sort of inspect, that’s the I, that is required to make sure either you’re inspecting after the decision has been made or if it is mission critical, maybe it is before the decision has been made.
One way or the other, the inspect portion of the loop is very important.
Maybe just to put it into a practical example, one of the areas that is leveraging AI to the greatest extent, and at this increased speed that Mohan is talking about, is self-driving cars.
Think about Tesla and the technology they’ve been building around self-driving.
No matter how far that’s gotten, what do they have to do?
Tesla, first and foremost, has all these sensors, cameras, radar, that observe its surroundings.
So that’s the O, the observe, to collect all this information about its environment.
Then it orients.
The AI takes this data, understands road conditions, traffic, nearby obstacles, and it adapts its driving model based on what it has observed.
It then decides: the AI processes it and makes a decision. Should I accelerate?
Should I brake?
Should I steer?
What does that look like?
Processing all of that context, and then the car actually executes it, right?
It autonomously takes these actions so that you don’t drift out of the left-hand lane of the highway, for example.
Well, in addition to that standard OODA Loop, data is continuously being processed by Tesla, by the engineers, by other autonomous systems, in order to improve driving algorithms and continually enhance them to get better and better.
And that’s why this self-driving technology is getting better over time, right?
It’s because of that additional inspection that doesn’t just say, oops, we acted and we crashed.
Oh, well, no, we need to take that into account and say, okay, now how do we update the models?
How do we do this OODAI Loop better next time around?
That additional inspect step can really, really help us to improve these autonomous systems.
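For listeners who think in code, a rough, hypothetical sketch of the cycle David just walked through might look like the snippet below. It is not Tesla's actual software; every object and method name is made up for illustration.

```python
# Hypothetical single "tick" of a self-driving OODAI cycle; names are illustrative.
def drive_one_tick(camera, radar, planner, car, inspection_log):
    # Observe: sensors collect information about the environment
    frame = {"camera": camera.read(), "radar": radar.read()}
    # Orient: interpret road conditions, traffic, and nearby obstacles
    scene = planner.interpret(frame)
    # Decide: accelerate, brake, or steer
    action = planner.choose_action(scene)
    # Act: the car executes the action autonomously
    car.execute(action)
    # Inspect (often offline): log everything so engineers and automated systems
    # can compare intended versus actual behavior and retrain the driving models.
    inspection_log.append({"frame": frame, "scene": scene, "action": action})
```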
David, let me ask you a question, right?
So in the OODAI Loop, where would you put the inspect?
In what scenarios, what circumstances would you do it before acting versus when would you do it after acting?
Like, how should the listeners think about where it comes in the loop?
You know, when I think about the inspect, I actually think about it after, but I think there’s an important insertion there, which is I don’t think these OODAI Loops have to be entirely autonomous.
I think the decision we should be talking about is where is the human in the loop?
What’s the appropriate point for the human to be in a loop?
And so instead of moving the inspection, what I would propose is, for example, in the decide step, maybe there are certain use cases, medical use cases come to mind, right?
Where the human should still be in the loop.
And until we have a degree of confidence, degree of trust, we don’t want to automate it.
And oh, by the way, in some situations, we may never want to automate it.
It is okay.
And I think this is a part of the helpfulness of this frame in the AI world, we can tease apart each one of these steps and assign it to a different actor.
And it can either be an autonomous actor, an agent, or it can be a human being.
And so being really smart about where is the human in the loop doesn’t change the loop, the I is always at the end, but it does change who is executing that step of the OODA Loop or the OODAI Loop.
And I think that’s really important.
All right, sometimes you will have AI inspecting and sometimes you’ll have humans inspecting.
But the same thing can be said for orienting and deciding and even acting.
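One simple way to picture what David is describing is an explicit assignment of each loop step to an actor. The mapping below is a hypothetical sketch; the use cases, step names, and actor labels are placeholders, not a prescribed configuration.

```python
# Hypothetical step-to-actor assignments: each step of the loop is deliberately
# given to either an AI agent or a human, and the split can differ by use case.
LOOP_ASSIGNMENTS = {
    "marketing_campaign": {
        "observe": "ai", "orient": "ai", "decide": "ai", "act": "ai", "inspect": "human",
    },
    "medical_diagnosis": {
        # Keep the human in the loop on Decide until there is enough trust to automate.
        "observe": "ai", "orient": "ai", "decide": "human", "act": "human", "inspect": "human",
    },
}

def actor_for(use_case: str, step: str) -> str:
    return LOOP_ASSIGNMENTS[use_case][step]
```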
Do you think the helpful thing for executives here is that having a framework like this gives a better sense of control over a new technology like this?
Being able to do exactly what you just said of determining, hey, is this autonomous?
Is this a person?
On all of those steps, at least for me as I hear that, it gives me more confidence as I deploy it in my organization.
Yeah, and I think it also gives you a frame for thinking about where will you deploy AI, right?
And you can look at all of these different steps and say, okay, I’m going to use it to observe.
Okay, I’m going to use it to orient.
I’m going to use it to decide, right?
All of those things are possible.
You don’t have to do it all together.
So yes, I think it gives you a frame to be more deliberate about how you’re deploying AI and thinking holistically about it.
Like autonomous vehicles, great, exciting, obviously the leading edge, but we don’t have to model everything after that, right?
We can be very deliberate and intentional.
The reason we love the OODAI framework is because it’s a decision-making framework, and AI is about making better decisions, faster decisions, right?
So that’s why we love the OODAI Loop, just in terms of framework for us to think about as we build our products.
And as David said, Observe and Orient is something that AI is really, really, really, really good at, right?
And then whether you decide to implement Decide and Act via AI or leave it to the humans for now, that’s fine.
You’re going to get a lot of benefits from just getting an observe-and-orient platform.
Well, David, Mohan, man, since I’ve been around you two, I’ve heard so much more about OODAI Loops, and I think this is a really powerful framework, especially when we add that I, the inspect, especially as we’re continuing in the AI era.
Again, if you’re listening and you’re feeling this fast moving technology, feeling the pressure of all of these different situations, hopefully this is a really helpful tool for you as well.
On that note, you can download a copy of the OODAI Loop on our website at knownwell.com/oodai, O-O-D-A-I.
David, Mohan, thank you as always.
It was fun.
Thanks Courtney, thanks David.
This episode goes live on 11/11.
So as my eight-year-old would say, when the clock strikes 11:11, make a wish.
And if you happen to wish for an AI platform that delivers real ROI in helping you keep and grow your existing customers, well, then congratulations.
Your wish has come true.
Go to knownwell.com/demo to get a look at Knownwell’s AI-powered platform for commercial intelligence.
We’d love for you to check it out.
Michael Harnum is the CEO of ESG Success, a company that delivers customer success as a service.
He sat down with Pete Buer recently to talk about the role AI is playing in shaping forward-thinking customer success organizations.
Michael, great to have you on the show.
Welcome.
Thanks, Pete.
Appreciate you having me.
Could you start us off with a little bit of background on ESG Success for listeners who might not be familiar?
Yeah, happy to.
Appreciate that chance.
So ESG, we do customer success as a service.
That is a term that we coined, that we trademarked, and it’s a service offering that we piloted into the marketplace five or six years ago.
It is a completely outsourced customer success option in the marketplace.
It can be everything, Pete, from you outsourcing the entire customer success function to us, including strategy, personnel, management, outcomes, all of it, all the way down to you needing help building playbooks to insert into a customer success platform.
There’s a lot of space in between those two poles.
We work with customers specifically to accelerate the development of their customer success strategies, so that they can execute them towards scale.
As you know, this podcast has a focus around AI, so as your teams are living inside other organizations, where is AI showing up?
What’s cool that’s going on?
The I in AI is often forgotten.
There’s not much intelligence put into it.
Basically, when we work with customers, we say: if you’re happy with your approach for your top 25 percent, and you’d like to extend some of that to the 75 percent, which are your smaller customers but may represent your biggest upside opportunity, then, back to the macro problem of not growing fast enough, let’s go look at that 75 percent, and let’s figure out how we can help them stay with you and give them a reason to grow.
Hopefully, I’m getting closer to directly answering your question.
Yeah, this is great.
The logic is very clear.
Keep going.
Yeah.
Then, when that sounds attractive to me and I want to go do that, now we’re talking about the how, which is really the heart of your question.
How do I go about doing this?
Because I can’t throw people at the 75 percent the way I did at my enterprise accounts.
Yes.
So the pros of that approach are I can now go to the 25 percent and I can pull out some of the things that are working.
What’s working?
Who’s growing?
Why are they growing?
What practices have you employed?
What tools are you using?
What technology do you already have in place?
What data do they have access to?
In that, now you can see I’m beginning to build the ingredients of a solution that I can take using other technology to a broader base of their customers.
If you take, for example, customer success is fueled by a customer success manager.
That’s the role that is interacting with your customers.
One way I think about scale is, you’re a customer success manager and you’re responsible for 40 accounts.
Okay, great.
Why is it just 40?
Why can’t you manage 100?
Right?
And the answer today, Pete, is, well, they can’t because 100% of the job maxes out at 40 customers.
Yep.
And it’s really good to know and benchmark and baseline.
That’s where we are today.
And so I encourage all customer success leaders, and very few take me up on this, to sit next to a customer success manager for a day.
Watch them work.
And I’ll give you one simple task that a customer success manager has to do, and then I’m going to tease apart what’s behind it, and we’ll begin to see some real opportunities to leverage technology towards efficiency.
So let’s say you’re a customer success manager, you’ve got 40 accounts, you’ve got some level of business review to do.
In some cases, it’s a monthly business review.
Other companies do quarterly business reviews.
But there is a practice inside of most successful customer success organizations that, Pete, I’d like to meet with you for an hour, and I would like to review your account, the data associated with your account, and the recommendations I have for you to maximize the value you’re getting out of the platform.
It’s that approach.
Okay, so now I’m going to go do that.
Well, now you’re sitting next to this person and you watch them.
So let’s build a list of what we would need to know to do that at an A plus level.
I would need to know the telemetry of the application.
What’s being used?
What’s not being used?
How does that compare to prior periods?
Are there features and functions that should be used that aren’t being used?
I would need that entire ecosystem around platform usage.
I would need from finance, billing history, pricing methodology, all of those things to make sure that they’re priced correctly.
I would need to know from the customer support organization or system, have they opened trouble tickets?
Are there open trouble tickets that I’m walking into?
Were they resolved in a timely manner?
Were there consistent themes in those customer trouble tickets that I need to be aware of because that can inform my recommendation?
Now I need to go to the learning management system and look at what training has Pete and his team taken.
Have they completed it?
Are there other things that could be taken?
Are there certifications available that might be valuable?
Are there new training offerings that we have that I could bring to bear to this relationship that could be valuable to Pete and his team?
And I’m going to take all of that and I’m only listing a handful of them.
There’s probably more. I would need to go into the CRM and see whether there are new organizational changes.
I might need to go into my customer success platform.
So a Totango, a ChurnZero, a Gainsight, and I need to pull: are they red, yellow, green, and why?
And all of that health index information.
That’s just for one business review, right?
And so I described the process that each CSM goes through as I’ve watched it, as hunting and pecking.
It’s a hunt-peck solution.
Now, I’ve got to go take all that time.
And what I described, I would say you’re measuring that in weeks, not days, unless I’m really well-trained and I have access to the system.
And I know my way around the learning management system.
And I know my way around the trouble-ticketing system.
Most people don’t.
And so I’m reliant on someone else or employing a report or whatever it is.
And I’ve got to synthesize that, okay?
Now, let’s say I’ve done the heavy lifting of that, Pete, and I have the data and I have it collected.
Now comes the I part of AI.
What do I do with it?
And what do I recommend in terms of efficiency and improvement?
And now I’ve got to come up with some set of recommendations based on what I see in the data.
And that’s the human element that comes into play here.
And then I also have to put it into a standard presentation template, right?
So you get the point, but what I just described, it sounds laborious for me to even outline it.
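To make the hunt-and-peck problem concrete, here is a hedged sketch of the aggregation step an automated business review might perform. The source systems and method names are placeholders for illustration, not a real integration.

```python
# Illustrative only: pull one record per system into a single review payload.
def build_business_review(account_id, telemetry, billing, support, lms, crm, cs_platform):
    return {
        "account": account_id,
        "usage": telemetry.usage_summary(account_id),     # features used vs. unused, trend vs. prior period
        "billing": billing.history(account_id),           # invoices and pricing methodology
        "support": support.ticket_summary(account_id),    # open/closed tickets and recurring themes
        "training": lms.completion_status(account_id),    # courses taken, certifications available
        "org_changes": crm.recent_changes(account_id),    # new stakeholders or structure changes
        "health": cs_platform.health_score(account_id),   # red / yellow / green and why
    }
```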
Well, and then it’s times whatever, 40 to 100 customers, times four times a year, times three.
Right.
And so what we’ve talked about, this is the objection or the obstacle, better said, to scale.
Inside of these companies.
Sure, I’d love them to. Some companies will just go say, you now handle 100 accounts.
And the staff complains in a hurry.
Because what we focus on with our customers is, how do we make that world I just described more accessible, more efficient, more automated, while also including the human element: there’s some relationship knowledge and intellect required to formulate that, to build a talk track that is effective for that customer and drives the outcome you want, which is that you want to keep them minimally, and you want them to grow ideally.
The numbers that you provided, recognizing they’re just representative, are you using a mix of technologies and process solutions and actually getting companies from 40 to 100 customers under management?
We are.
And so from a technology standpoint, the first important thing to understand is we are technology agnostic.
In most scenarios that we go into, the customer has enough technology.
The problem is digestion, not access.
And so one of the things we try to do is maximize the value of the platforms that you’ve already invested in.
Now, a lot of them don’t have an active AI tool that they’ve deployed in a smart way.
So there’s gaps that we have to fill and I can talk about that in a moment.
But oftentimes, we’re coming into a customer setting and helping them expand their scale.
I’ll give you a couple of examples of how we’ve done that.
With one of our customers, we created an automated process where their average customer spend was like $2,000 to $3,000.
So these aren’t accounts of the size that you’re probably going to assign a human to.
And yet, they were of high strategic value to this company.
This is a company that was going from a premise-based solution to a subscription-based solution in the cloud, right?
And so it’s going to represent lower revenue initially, but ultimately they want all of their customers to migrate to that service.
So huge strategic import, low revenue impact at the current point.
And so they wanted to set up foundational practices that would give them the best chance to achieve that strategic objective.
And so we, of course, didn’t recommend they hire an army of people, either they hire them or we hire them.
That didn’t make sense to the revenue and profit profile.
But we did set up an automated monthly business review document.
It was an email that went out to each customer, and you would get an email on a monthly basis that you didn’t ask for, that’s very informed, and it would give you a lot of the attributes that I just walked through.
Your product usage, your open and closed trouble tickets, training opportunities that were available to you, and some features that we noticed you’re not using, that other customers are.
If you would like to see examples of how they’re using it, click here.
And we always put a button.
If you need to talk to someone about this, click here to schedule.
That’s one example.
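As a rough sketch of what generating that kind of automated monthly email might look like, assuming a simple dictionary of the attributes Michael lists (all field names here are made up for illustration):

```python
# Illustrative only: render an aggregated review into a customer-facing email body.
def render_monthly_review_email(review: dict) -> str:
    unused = ", ".join(review.get("unused_features", [])) or "none we noticed"
    return (
        "Here is your monthly business review.\n"
        f"Product usage: {review.get('usage_summary', 'n/a')}\n"
        f"Support tickets: {review.get('open_tickets', 0)} open, {review.get('closed_tickets', 0)} closed\n"
        f"Training opportunities: {review.get('training_recommendations', 'n/a')}\n"
        f"Features other customers use that you are not using yet: {unused}\n\n"
        "Want to see examples of how they're using them? Click here.\n"
        "Need to talk to someone about this? Click here to schedule.\n"
    )
```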
The other one which is more advanced than that, it’s kind of the phase two of that, is we’ve actually created, in one of our accounts, a digital customer success manager that is an animated avatar, right?
So you would get the same thing.
You would get the email, click here for your quarterly business review.
And you click there, and there’s an avatar in the corner that is talking you through these same attributes and you’ve got opportunities to ask questions, you’ve got opportunities to schedule a review with a human if you would like that.
Find my CSM and schedule an appointment.
All of that is there.
But it’s about a 12 to 15 minute presentation that is made as human as we could make it.
But it does fit the criteria for customer success of being both proactive and informed.
If I’m a business leader and I’m looking at my customer success function and thinking there’s got to be a better way, where would you have me start?
I’ve done a lot of youth basketball coaching in my background, and I get a similar question from parents who think Johnny’s going to be a college star.
And the answer is, I just think there is power in this simplicity and power in the fundamentals.
And I’ll tell that parent, Johnny can’t dribble with his left hand, therefore he’s cut the court in half.
He can only play on one side of the court, and the defense knows that and he’s limited.
So, well, how do we fix that?
I’ll tell you what I did.
I got into a gym and I turned the lights out, and I stood in the corner and I pounded the ball with my left hand, and I brushed my teeth with my left hand, and I wrote with my left hand.
It’s that kind of stuff that seems kind of obvious, but today, even in youth basketball, there’s all of this “well, I need to hire a professional coach, I need to have one.” No, you need a ball and a flat surface, that’s it.
And so if I apply that same simplicity in thinking to customer success, too often we see customers trying to get to phase three and phase four of their customer success strategy without nailing phase one and phase two, right?
So we have a customer success maturity model that we will utilize as part of our discovery process that will help them identify where are you on this continuum?
And it can be painful, right?
Because they may think they’re further ahead than they actually are, but recognizing where you are and where you’re starting is really, really critical.
And so it’s things like, do you have a health score capability for your customer base?
Red, yellow, green.
And you can begin to get smart about how you deploy your strategies.
And then playbooks.
Okay, if it’s red, yellow, green, what then?
How then are we going to respond?
If Pete’s account is red, what activities does that trigger?
You can’t leave it to the CSM to figure out everything, because you’re going to get different actions and different outcomes.
This is the playbook.
When Pete goes from green to yellow, this is what happens.
Documenting those expectations, if you can do those three things, you now have the foundation for a customer success practice that you can build upon.
Too often, people rush past that and don’t get it right, and then they stumble in the future.
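A minimal sketch of that playbook idea, where a documented health-score transition triggers the same actions every time; the transitions and action names below are hypothetical examples, not a recommended template.

```python
# Hypothetical playbook: health-score transitions map to documented actions,
# so every CSM responds consistently instead of improvising.
PLAYBOOK = {
    ("green", "yellow"): ["schedule_check_in_call", "review_recent_support_tickets"],
    ("yellow", "red"): ["escalate_to_cs_leadership", "build_save_plan"],
    ("red", "yellow"): ["confirm_fix_with_customer", "resume_normal_cadence"],
}

def actions_for(previous: str, current: str) -> list[str]:
    return PLAYBOOK.get((previous, current), [])
```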
Awesome.
Thank you so much.
Best to you.
Yeah.
Thanks, Pete.
Thanks as always for listening and watching.
Don’t forget to give us a five-star rating on your podcast player of choice, and we’d really appreciate it if you can leave a review or share this episode on LinkedIn.
Be sure to tag Knownwell when you do so, so all of us here at Knownwell can repost to our networks.
At the end of every episode, we’d like to ask one of our AI friends to weigh in on the topic at hand.
So, hey, Perplexity, what’s happening?
This episode, we’re talking about why adding Inspect to the OODAI Loop can help companies win in the age of AI.
What do you think?
Hey there, adding Inspect to the OODAI Loop could indeed be a game changer for companies in the AI era.
By incorporating a dedicated phase for inspection and analysis of AI outputs and decisions, companies can ensure better quality control, reduce errors and ultimately make more informed and effective choices in their decision making processes.
And now you’re in the know.
Thanks as always for listening.
We’ll see you next week with more AI applications, discussion and experts.