- Only 5% of companies are seeing measurable value from their AI investments, according to a new report from Boston Consulting Group
- The big takeaway: operationalizing and compounding intelligence beats experimenting with tools and technology
- Find out how (and why) to move from pilots to performance in the next 90 days
If only 5% of companies are seeing measurable value from their AI investments, what are the other 95% doing? In short, pouring enormous amounts of time, energy, and resources into POCs and spinning their wheels.
While this is understandable for a technology as “new” and novel as AI, it’s time to start asking the question: what separates the few who are compounding AI’s benefits from the many that are still stuck in pilot mode?
The Real AI Divide: Operationalization, Not Access
Knownwell’s Chief Product & Technology Officer Mohan Rao says the real gap isn’t about access to the latest models. It’s about what companies do with them.
“Everybody has access to foundational models,” Mohan says. “But where we’re falling down across the board is how we operationalize intelligence.”
Companies that win with AI don’t just buy tools; they build AI operating models that embed intelligence directly into workflows. That’s when AI stops being a side project and starts becoming part of the company’s nervous system.
Leadership Makes or Breaks AI ROI
Boston Consulting Group’s study found that companies succeeding with AI are 12 times more likely to have C-suite executives deeply engaged in AI efforts. Courtney notes that this echoes Knownwell’s own client experience—when executives are hands-on, transformation sticks.
David ties it directly to change management. “This is classic change management,” he says. “These organizations have executive sponsors that understand the need to operationalize and are actively engaged in the transformation.”
Three Moves to Close the Gap in 90 Days
If you’re a business leader staring at that 5% statistic and wondering where to start, Mohan and David laid out a practical playbook:
- Identify 2–3 high-impact workflows. Go deep, not wide. Focus on the areas that directly move your business metrics.
- Set clear business outcomes. Define one measurable KPI that connects to your P&L.
- Drive it like a change initiative. Assign ownership, communicate the “why,” and measure adoption and impact.
In other words: shift your mindset from pilots and POCs to operationalization and disciplined execution.
ROI in Action: A Case Study from Coframe
This week’s expert interview takes the concept of applied AI out of theory and into practice. Josh Payne, CEO of Coframe, breaks down how his team is helping companies unlock real revenue growth by transforming how they experiment, learn, and optimize across their digital funnel.
Coframe, as Josh puts it, exists because of AI. Their entire model hinges on something every growth leader understands but very few have ever been able to achieve at scale: high-velocity experimentation with messaging and creative.
Instead of relying on designers, engineers, and long release cycles to test new ideas on a website, Coframe uses AI to generate, launch, and evaluate experiments at a pace that simply was not possible even two years ago. Many clients go from one or two tests per month to dozens. That speed creates a compounding advantage because the faster you test, the faster you learn, and the faster you improve.
Josh explains it simply: faster experiments mean faster insights, and faster insights mean higher conversion. Their clients get the benefit of a system that is always working, always adapting, and always pushing toward stronger performance. Whether model progress accelerates or slows, the ROI remains. Even if AI never advanced another inch, the ability to run tests twenty to fifty times faster than traditional teams would still be a structural advantage.
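To make that compounding claim concrete, here’s a rough back-of-the-envelope sketch. The win rate and per-win lift figures below are illustrative assumptions for the sake of the arithmetic, not Coframe’s actual numbers:

```python
# Illustrative only: compare cumulative conversion-rate lift at two testing speeds.
# Assumes each test has a 20% chance of producing a winner, and each winner
# lifts conversion by 2%. Both figures are hypothetical.
def cumulative_lift(tests_per_month, months=12, win_rate=0.20, lift_per_win=0.02):
    wins = tests_per_month * months * win_rate  # expected number of winning tests
    return (1 + lift_per_win) ** wins           # winning lifts compound multiplicatively

slow = cumulative_lift(tests_per_month=2)    # roughly a traditional team's pace
fast = cumulative_lift(tests_per_month=40)   # roughly the AI-assisted pace described

print(f"12-month multiplier at 2 tests/mo:  {slow:.2f}x")
print(f"12-month multiplier at 40 tests/mo: {fast:.2f}x")
```

Under these assumptions, the slow team ends the year with a roughly 10% cumulative lift while the fast team compounds to a multiple of its starting conversion rate, which is the structural advantage Josh is describing.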
For executives listening to this episode, Coframe represents the shift from AI as a novelty to AI as a revenue engine. It is a prime example of what it looks like when AI operationalization is done right. Not hype. Not hand waving. Just measurable performance improvements driven by rapid learning and intelligent iteration.
Betting on Results, Not Hype
The episode opens with Pete Buer’s breakdown of investor Michael Burry’s $1 billion bet against the AI boom, shorting Nvidia and Palantir. Pete’s take: the real bet isn’t on AI itself, it’s on whether businesses can use it to drive measurable results.
“Don’t measure AI maturity by its presence or adoption. Measure it by its impact. Are your processes more efficient, your people smarter, your clients sticking around longer?”
That, ultimately, is how the market will separate the winners from the rest.
The Bottom Line
If there’s one message to take away from this episode, it’s this: the AI revolution won’t reward early adopters. It will reward effective operators.
Those who move beyond pilots, embed intelligence into their workflows, and engage leadership at every level will find themselves on the right side of the widening AI value gap.
Watch the Episode
Watch the full episode below, and be sure to subscribe to our YouTube channel.
Listen to the Episode
You can tune in to the full episode via the Spotify embed below, and you can find AI Knowhow on Apple Podcasts and anywhere else you get your podcasts.
Show Notes
- Read the BCG study, The Widening AI Value Gap
- Connect with Josh Payne on LinkedIn
- Connect with David DeWolf on LinkedIn
- Connect with Mohan Rao on LinkedIn
- Connect with Courtney Baker on LinkedIn
- Connect with Pete Buer on LinkedIn
- Try Knownwell free for 30 days
- Schedule a guided Knownwell demo
- Follow Knownwell on LinkedIn
Courtney Baker: Here’s a stat that, well, it might sting a little. Only 5% of companies are seeing substantial measurable value from their AI investments. 5%. The other 95? They’re investing, experimenting, and spinning their wheels.
Let’s make sure the only wheels spinning in your world are the ones in your head. Hi, I’m Courtney Baker, and this is AI Knowhow from Knownwell, helping you reimagine your business in the AI era. As always, I’m joined by Knownwell CEO David DeWolf, Chief Product and Technology Officer Mohan Rao, and Nordlight CEO Pete Buer. We also have a discussion with Josh Payne, CEO of Coframe.
Speaker: Josh shares his insights on how you can make sure your company sticks the landing on the right side of the widening AI value gap. But first, Pete Buer joins us to break down some of the latest in AI news.
Courtney Baker: Pete, how are you?
Pete Buer: I’m good. Courtney, how are you?
Courtney Baker: I’m doing great. Here is a timely story that caught our team’s eye. Michael Burry, the investor who predicted the housing crash in 2008 and was portrayed by Christian Bale in The Big Short, just made a $1 billion bet against the AI boom.
Courtney Baker: He specifically shorted Nvidia and Palantir stock, both of which have soared to historic heights. Pete, another signal that the AI bubble is about to pop and why should it matter to our listeners?
Pete Buer: So given his pedigree, it’s kind of tough not to take notice when this guy bets against something. The interesting question for me, though, isn’t whether you bet for or against AI; it’s whether you bet for or against the AI-powered business models. [00:02:00] So a lot of companies right now are getting insane valuation boosts just for saying AI in their press releases, and we’ve seen that before and we know how it plays out in the long run.
So the takeaway for business leaders, I think, is pretty straightforward. Don’t measure AI maturity by its presence or even its adoption. Stay true to your business commitments and measure it by its impact. Are your processes more efficient? Are your people more productive and smarter? Are your deals closing faster?
Are clients sticking around longer? Burry’s story is a reminder that at the end of the day, the market rewards results. It’s always impressive to see the wow coming out of the backend, uh, of an AI tool, but where we need to see the wow is in the results with the audiences that we serve with those tools.
So to your question, uh, will the bubble burst? Will we see a correction, possibly? Sure, it happens. Will the market press on? Heck [00:03:00] yeah, of course it will.
Courtney Baker: Yeah.
Pete Buer: Do I have the smarts to win this bet on timing? Hell no. Of course I don’t.
Courtney Baker: I think that’s such great advice. In some ways, the bet doesn’t really matter for business leaders, you know? It is focusing on the results, and then asking how AI does or does not help us achieve those. So, such a good reminder not to get sucked up into the hype. Pete, thank you as always.
Pete Buer: Thank you, Courtney.
Speaker: If there is indeed a widening AI value gap, how can you make sure your company is on the right side of it? I sat down with David and Mohan recently to get their thoughts.
Courtney Baker: David, Mohan, welcome back. Really excited to talk to you today about the widening AI value gap and what to do about it. So, y’all ready? Good. Okay.
David DeWolf: I wanna know how you [00:04:00] define value gap. I can’t wait to hear.
Courtney Baker: Yeah, so here’s a number that, frankly, stopped me in my tracks. According to Boston Consulting Group’s latest global study, only 5% of companies are getting substantial measurable value from AI at scale.
Meanwhile, about 60% have invested heavily yet have little to show for it so far. That’s what BCG calls the widening AI value gap. So today, I’d love to unpack that study: why the gap exists, what future-built companies do differently, and what it takes to establish enterprise-scale advantages. David, Mohan, I, I like truly am excited for this episode. Why are some companies compounding AI’s benefits while everyone else is [00:05:00] still spinning up experiments?
Mohan Rao: I think the main gap, Courtney, isn’t about access to technology; it is about how organizations operationalize intelligence. Uh, right. So that is where this number comes from.
I think everybody has access to foundational models. People are building better and better and better UX on top, but where we are falling down across the board is how we operationalize intelligence. All right, so, so there are some principles that are very important here. You know, we think of AI as tools, right?
But they need to be AI operating models. It has to be embedded in the workflows. That’s when it becomes natural in a company to use, and not just that I go consult a tool. Um, so, uh, there, there are a couple other things as well, like agentic orchestration is gonna be very, very, very important as opposed to just chatbots.
And then the mistake a lot of companies make, um, especially when [00:06:00] you have 400-plus projects, is not realizing that it is better to go deep than to go wide. Like, transforming a few critical processes is important before you transform the whole company. Uh, from, from a, uh, across-the-board perspective, these are the types of things, I think, where we are in terms of why you are not getting the value from the investments: because we are not focused on operationalization.
David DeWolf: I think all of that, uh, is so true. I also think there’s a little bit of the answer in your question alone, Courtney, and, and even almost a reference back to the last few episodes that we’ve had. You talked about experiments. Well, experiments aren’t designed to drive a return and, and drive impact, right? They’re designed, the outcome is learning, right? I think we have gone too long in the experimentation phase and, uh, we haven’t seen organizations shift. I think we talked about that a few episodes ago, [00:07:00] where we had the conversation about that and, and buy versus build, which I think points back to this similarly. I think there are a lot of organizations that have just assumed, and, and candidly, because the market’s not there yet, it’s just arriving,
there aren’t enough applied AI applications that are actually solving use cases that are needed in businesses. And so organizations have built versus bought. Well, that’s not sustainable, right? There is a need for enterprise AI solutions that solve real functional problems, right? What are the pains? What are the problems?
And how do we build products that solve that? Everybody building it on their own is not an efficient, effective use of resources, and, and as an industry, we have to get past this phase, right? And so I think if you tie all of those things together, it’s, it’s a little bit of a dog’s breakfast of everything.
Like, I think just the dynamics of an immature market have [00:08:00] led to this value gap in a very, very real way. And what we need to see is the maturing of the market. We need to see enterprise platforms that are agentic by nature natively arriving. We’ve really only seen that in the developer community, tools and those types of things, right?
We need those business solutions to emerge, um, as an example, and then we need organizations to shift towards making those major purchases to solve their problem versus spending millions of dollars to try to build it on their own.
Courtney Baker: You know, something that I thought was interesting, and David, it kind of relates back to something you talked about a few episodes ago about top-down versus bottom-up with these AI initiatives. Um, but the study also found that C-suites are much more involved in those companies that are future-building and having success with AI [00:09:00] platforms in their businesses. It says that they are 12 times more likely to have a C-suite executive and leadership teams deeply engaged with AI than the laggards. And so it just shows how important it is to really have
David DeWolf: Hmm.
Courtney Baker: your organization focused on it.
David DeWolf: I think this affirms Mohan’s point about operationalizing the intelligence,
right? This is change management, right? What that screams to me is these organizations have executive sponsors that understand the need to operationalize and are actively engaged in the change management to drive transformation, right?
Because that’s yet another factor that you could put into the dog’s breakfast of a little bit of this and a little bit of that: buy a tool, don’t engage, don’t use it, don’t figure out how to embed it into the operations of your business and change your processes to match.
Courtney Baker: Well, and I think we can say, even with our own experience with Knownwell, when we have an executive [00:10:00] that’s really leaned in, been highly engaged in making sure that the change management happens well, that the transformation, um, is happening, it’s worked. It’s worked really well.
Mohan Rao: Absolutely. You know, that, that’s always been a true statement, but it’s especially true of AI systems, right? Because you’re talking about elevating judgment, and, um, people in large organizations already being concerned about some of the tools. So, so executive involvement is so important.
It’s about, uh, enablement; it’s about op, uh, operationalization into the workforce and into the workflows of their various systems. I think that’s what’s gonna make the difference.
Courtney Baker: Okay, so David, Mohan, I think maybe what would be helpful now is just to make this really practical for people, you know, for leaders that are sitting with this knowledge. Now that we have this AI value gap, [00:11:00] what should we do? What can we do in the next 90 days to start closing this gap?
Mohan Rao: I think the first thing to do is to, you know, take the resources and focus on two to three high-impact workflows. Um, if you can, just start there, as opposed to, because we started a couple of years ago with saying pilots and experimentation and all that, that era is over now. It’s about, um, having business impact in
two or three critical processes. Um, and then, as we just discussed, having the executive involvement is super important. Uh, right. So these are, these are two good places to start.
David DeWolf: Yeah. So let’s boil that down to, uh, you know, a checklist. Number one, identify the core outcome that you want to drive. What is the one outcome on your P&L that you wanna move the needle on? Right? Number two: [00:12:00] what is the KPI associated with that? That’s gonna give you an indication if it is, and the workflow associated with it.
That’s, that’s Mohan’s one or two or three workflows, right? What is it? And then go to the analysis, number three, of what is the solution that may solve that? Is there something to buy that is out there, where somebody spent deep, deep, deep thinking on solving this problem in a way that meets my business needs?
Or do I need to build it? And then who is going to drive it? When do I need it by? And it, it begins to be change management and project management and execution. But you gotta start with: what’s the outcome I’m trying to drive? And if it doesn’t show up on your P&L, you’re not thinking at the right altitude.
Courtney Baker: Really good stuff. David, Mohan, thank you as always.
David DeWolf: Thanks, Court.
Speaker: If there is a list of bad words in [00:13:00] business, “meetings” might just be one of them. The reality is it shouldn’t be. Meetings are actually a strategic asset. That’s why we’ve created a brand new workbook to help you make sure your meeting cadence is setting you up for success with your clients. Download this resource to help you run AI-informed meetings that spark alignment and action.
Download your copy at and turn every meeting into momentum.
Speaker: Josh Payne is the founder and CEO of Coframe, a company that’s redefining how websites convert visitors to customers. They’ve built an AI solution that actually learns, continuously optimizing websites to drive measurable revenue. In the discussion you just heard, we talked about what BCG calls the widening AI value gap, and Josh represents the other side of it: [00:14:00] leaders who are not just talking about AI potential, but actually realizing it.
Pete Buer: Josh, welcome to the show. We’re so delighted to have you.
Josh Payne: Thanks so much for having me, Pete. Good. Good to be here.
Pete Buer: If we could, can we, uh, start with a little bit of context on your role, uh, and your company, and where AI fits into your story?
Josh Payne: Yeah, absolutely. I am, uh, the founder and CEO of Coframe. We’re a company that, uh, exists because of AI, so it’s a pretty, pretty close connection there. Um, what we do is we help companies improve their website conversion rate and other things that are important to the business by running experiments and tests without them needing to pull in engineers and designers and all that to do that.
As far as how AI is involved: the reason we’re able to do this so quickly for clients, and speed is really important in experimentation, because the faster you can test things, the faster you can learn. Um, and the reason why we’re able to do it so fast is ultimately because AI makes it possible, um, [00:15:00] to get these tests out and on the site at a rate that is probably 20 to 50 times faster than what we would typically be able to do with the normal process and team.
So AI is basically what makes this company possible.
Pete Buer: The next question I think is already answered, but let me give it to you anyway. So, uh, thematically in this, um, episode, uh, we’re sort of speaking to the topic of realizing returns on investments. And as with most things nowadays, it’s cool to be skeptical.
Josh Payne: Mm-hmm.
Pete Buer: There’s a lot of talk about an AI bubble. A fellow, uh, Michael Burry from The Big Short, just made a large and public bet against it by shorting Nvidia and Palantir.
Josh Payne: Oh, yeah.
Pete Buer: Big picture, where are you on the whole ROI of AI: bet with or bet against?
Josh Payne: Yeah, that’s a great question. My job is to ensure that the customers that we’re working with, uh, get strong ROI from AI.
So what’s actually kind [00:16:00] of interesting about what we’re doing is, even if the model progress would stop, for, you know, forever, and we would be stuck with what we have today, there’s still very, very strong ROI in doing experimentation much more quickly, uh, which is what we’re able to enable. Like, for our business, we are certainly making the bet that AI is gonna continue to get better and, um, our process is gonna continue to improve and so on.
But even if it didn’t, um, I think it’s really important. Actually, that’s probably a very important mental framework for all new AI companies to have, which is that, like, yes, you’re making the bet, but also, today, are the unique economics, you know, in your favor? And so that, that’s the way that I think about it.
Pete Buer: And so let’s then, uh, focus instead on the customers that you work with. Do you find, um, there are predictable patterns in the kind of use cases, or in the ways that companies are adopting AI, that separate those who [00:17:00] will see ROI versus those who won’t?
Josh Payne: Oh yeah, that’s a good question. So, it’s funny enough, I feel like the more, um, the more blindly optimistic you are about AI, the less chance you’re gonna see ROI. Yeah. If you’re just saying, let’s add AI to your existing process and, like, you know, slap some chatbots into the way that our employees do work, uh, without really understanding what the workflows are, without understanding what the employees actually need and what they’re gonna acclimate to, that’s a recipe for disaster.
And that’s why we see, there’s actually that recent study, uh, that MIT did, which found that, you know, 95% plus of enterprise AI pilots fail. Yep. And there’s really good reason behind that. You know, you, you go and look at, um, what they’re rolling out, and it’s just so, um, it’s, it’s not very thoughtful. It’s not connected to real use cases, and it’s not, um, it’s not something where, like, you know, someone has really come in and deeply understood AI and also deeply understood the business and put those two and two together.
Usually they’ve deeply understood their [00:18:00] business, but they haven’t deeply understood AI. That’s the most typical case. Um, on the opposite side, whenever you have third-party vendors coming in, they’ve usually deeply understood AI but not deeply understood the business that they’re working with. So that’s also a disconnect, and that’s not good either.
The way that we try to approach it is we, we, we, of course, have a really deep understanding of, of AI and what it can and can’t do, and what it will and won’t be able to do, at least, uh, a decent degree out from now. Um, but we also, importantly, go in and try to deeply understand the customer’s business, uh, that we work with.
And, and, and it’ll be very hands-on. And so that’s, that’s why we’re able to see a much higher success rate in, in the, in the pilots and engagements that we run.
Pete Buer: Do you find that there’s a big education burden, uh, in working with clients? Like, you, you show up with the expertise and they’re counting on you for that. But they also have to be brought along, I suppose, to some degree.
Josh Payne: To some degree. [00:19:00] Um, it, that can depend on the, the shape of the product. For, for our product, you don’t really need to. I mean, maybe to sell the initial concepts, that’s true, but at least for us, uh, we, you know, this, this has been, this is a pattern that I’ve seen really be working quite well for, for new startups that are in this, in this era of AI, where it’s, it’s almost like they’re trying to abstract away the AI part of it as much as they can for their clients and just basically produce the outcome.
So the way that we work with clients, we’ll go in and we will, uh, say, hey, like, we’re gonna increase your conversion rate by running more tests and, and experiments on your website. You don’t have to worry about who does it. It could be AI, it could be humans, it could be a mix. Uh, all that should matter to you is that you’re getting increased conversion on your site.
And so we try to actually abstract it away. We don’t, we don’t have anyone learn a new tool. Now, that’s gonna be different for different products out there. There are actually a decent number of, of products, I mean, [00:20:00] maybe even most, potentially, um, where you do have to learn something new. You have to have a new way of doing things.
But when you don’t have that, then you can just move as fast as you want. The new model comes out, you instantly have it. You know, it’s our full-time job to, like, know how to use this technology properly and to use the latest and greatest, and that would be very unmanageable if you had to educate people and bring them along.
So, at least for our business, this has not been a factor, you know, that’s as, that’s as important as the education part.
Pete Buer: So you’re sort of, um, balancing sufficient understanding of the business with sufficient understanding of the AI. When I, yeah, like, massively oversimplify the process of solving problems, I guess it’s: define the problem in the first place, define the solution to solve the problem, and get people to use the solution.
Josh Payne: Yeah.
Pete Buer: In that chain of activity, uh, when you’re balancing those two things up front, where, where is the breakdown happening most often when customers aren’t finding their way to [00:21:00] ROI?
Josh Payne: Yeah. Let’s see. Is it on the problem or the solution side? Um, that’ll also depend on the business. For, for our business, it is almost always on the solution side, like, we haven’t found the right, you know, way of working with them and their processes and all that stuff. That can be an issue sometimes, because the problem, for us, is, is, is pretty much always there.
It’s like, you always wanna have a greater conversion rate on your website. That’s, like, a given. And so, yeah. Um, for other businesses, though, especially ones that are more at the frontier of the technology, and kind of maybe, maybe more advanced on that side, or, or that’s what they’re selling, they’re selling a lot of the sizzle.
If you’re selling a lot of the sizzle, then you’re gonna run into problems, with maybe you don’t have the problem clearly defined.
Pete Buer: And what I didn’t hear you say in all that was that the problem is with the downstream adoption, like, users actually taking advantage of the solution once it’s developed.
Josh Payne: Yeah. Yeah. At least for us. You know, in, in other cases, [00:22:00] where it’s like, we’re gonna roll out this chatbot, or maybe, like, AI search. You know, uh, a good example here would be Glean, if you’re familiar with that company. Like, they do enterprise AI search, where they can search across everything in, in their knowledge and so on.
That’s a really incredible use case. Um, but you do have to get people to adopt it and use it. Um, in our case, they just have to add a pixel or script onto their website and, you know, receive recommendations, or new, new, new tests that are ready to run, and say, yes, I wanna run it, and then we run it. Um, so there’s not a whole lot of adoption that’s needed there.
Pete Buer: Gotcha. Well, let’s talk, yeah, more about your case if we could. There is a term thrown around on this, uh, program periodically, uh, compounding intelligence, um,
Josh Payne: Yeah.
Pete Buer: systems that learn and, and build upon the learning sort of continuously.
Josh Payne: Yeah.
Pete Buer: Would I be right to understand that that’s sort of fundamental, uh, to your approach?
Josh Payne: Absolutely. Yeah. That, that’s another [00:23:00] word for it. I, I, I, I generally use the term, uh, self-improving systems, uh, for that. And I think it’s just such a powerful concept. It’s like an amazingly powerful concept, and it’s something that I, I personally hold quite dear. Uh, you know, I, I just am very, I, we’re very interested in this whole notion of self-improving systems. Um, I think it’s something that is gonna define this next iteration of, of AI in the world.
Pete Buer: Well, and just so we can finish attaching the concept to, um, an example, can, can you explain how self-improvement is working with, uh, your product?
Josh Payne: Absolutely. So, you know, when, when we come in and start to run experiments on a site, we’re going in relatively blind. You know, we, we try to ask: what are your brand guidelines? What are the previous tests you’ve run? What, uh, what is your analytical data? And so on. That helps. But when it really becomes powerful, when Coframe really becomes powerful, is when, given all the experiments [00:24:00] that we’ve run over the duration of our engagement, we start to generate new and novel ideas that are based on those learnings.
And when you’re able to do that, you’re kind of closing a loop, if you will, of: generate an idea, test it out, analyze the results, and then continue, uh, use those results to generate the next idea. That loop concept is really, really powerful, and it’s at the core of what Coframe is. Coframe is a self-improving system for businesses’ websites today, and not too long from now, it’ll be for paid media and for lifecycle marketing, and virtually every digital channel in marketing.
Pete Buer: If, again, if I’m a, a business leader, and our, our market is, uh, services, uh, companies, mid, mid-size services companies,
Josh Payne: Mm-hmm.
Pete Buer: and I’m thinking about areas of my business where, uh, application of this kind of technology is gonna make a big difference for me, where, where would I go looking?
Josh Payne: So if you’re a business leader, like, there’s, there’s a couple parts to every business. [00:25:00] There’s the core operations part of it. Um, in these, these core operations, like, self-improving systems can be useful. But I think the most important thing for, like, core back-office operations is really just, like, consistency, and having one workflow and doing it right every time.
And so accuracy is paramount. Like, you don’t wanna do a whole lot of experimentation with these parts of the business: legal, finance, accounting, business operations, and so on. Now, for front office, which is also something that almost every business has, it’s quite a bit different of a story.
So we’re talking sales, marketing, lead generation, and so on. This is where you want to be running lots and lots of experiments and testing a lot of things out. And so whether it’s through your sales calls or your outbound marketing campaigns and email, or if it’s through your website optimization and conversion optimization, uh, or your paid media, you know, running ads on various [00:26:00] platforms, you wanna be trying out a bunch of stuff.
And the true virtue of a really impactful AI rollout is not gonna be, like, it consistently does it every time. It’s gonna be, like, it’s able to test a lot of stuff out very quickly and just optimize and find, find the best thing. So, so you kinda have to look at it from a function-by-function perspective.
We operate in front office, and so experimentation and iteration and self-improvement is really important. It’s not as important for back office, where you just need to get the job done.
Pete Buer: Great. So, so front-office focus. And, and how about if you think about the world of front office running from, um, you know, generating, having impact in markets, generating leads, through to qualifying and, uh, getting into the sales channel and so forth, like, up and down that, um, funnel: is there a better or less good place to try experiments, or is it a company-by-company decision?
Josh Payne: I think every, every part of that funnel should, should be undergoing quite a bit of experimentation. [00:27:00] Yep. Um, where it’s easier and highest-ROI, probably, is at the very top of the funnel, because you’re able to learn faster. So initially, like, initially it’s gonna be a lot of top, very top-of-funnel experiments. Some of that is, is gonna be AI-driven. Some of it’s gonna need to be human-driven, because at the end of the day, AI can’t jump on a sales call today with, with a prospect and, like, walk them through the buying process and, like, be empathetic to them and so on, at least not yet. Mm. And for the, for the, for the more middle-funnel stuff, it’s very valuable to do it, but it’s also very difficult.
That’s kind of why we at Coframe are focused on it, is because it’s, it’s extremely ROI-positive to do experimentation in the middle of the funnel, but it requires a lot of resourcing and a lot of, uh, expertise. And so, that’s probably, for most businesses, that’s probably more of a partner-versus-build decision.
Pete Buer: Just to close this out, [00:28:00] if you were sitting over a cup of coffee, or a tall cold one, with a CEO in the services business, and they’re interested in hearing your pitch and some advice about how to think about the business, where to go, and what to do in what order. You’ve covered some of this already, but how would you sum it up in short form?
Josh Payne: Well, it’s difficult because every business is so different, but what it comes down to is understanding where the bottlenecks in your business are right now and, from those bottlenecks, where AI is actually useful.
Pete Buer: Yep.
Josh Payne: It’s difficult to assess this oneself without deep AI knowledge, so bringing in someone who knows how AI works, and where it will and won’t be useful, is quite important. With the combination of your business knowledge and the AI person’s AI knowledge, the sparks will fly.
At that [00:29:00] point it’s really about going in and taking a crack at the things with the biggest bottlenecks right now. Unfortunately, there’s no standard step one, two, three for how you roll out AI. You kind of need that person with some distinct knowledge.
And to an extent, these AI chatbots can be helpful thought partners with this as well.
So you could go to ChatGPT or Claude and say, hey, here’s my business. Something I’ve actually done that’s been quite valuable for me is trying to get as much context as possible loaded into these systems.
So on every call I have, we have a note taker, we get the transcript, and we’re able to drop that into the AI so it can take the proper actions. I just spent a couple of days setting up MCPs for our different tools, like Slack, Linear, and HubSpot. So I’m able to connect AI to various parts of my [00:30:00] data and have it do various things, without needing to set up a very bespoke workflow.
And that’s been very powerful. So getting in there and basically giving AI as much context as possible is kind of the right answer.
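The workflow Josh describes, transcripts in, context-rich prompts out, can be sketched roughly like this. This is a minimal illustration, not any specific tool: the function names and data shapes are assumptions, and a real setup would send the resulting prompt to a model API (or expose the underlying tools through MCP servers) rather than just building a string.

```python
# Illustrative sketch of "load as much context as possible":
# concatenate call transcripts into one labeled context block,
# then wrap a question around it for a chat model.
# All names here (build_context, build_prompt) are hypothetical.

def build_context(transcripts: list[dict]) -> str:
    """Concatenate call transcripts into one labeled context block."""
    sections = []
    for t in transcripts:
        sections.append(
            f"## Call with {t['prospect']} on {t['date']}\n{t['transcript']}"
        )
    return "\n\n".join(sections)

def build_prompt(question: str, transcripts: list[dict]) -> str:
    """Prepend the transcript context to the question for a chat model."""
    return (
        "You are an assistant with access to my recent sales calls.\n\n"
        f"{build_context(transcripts)}\n\n"
        f"Question: {question}"
    )

# Example data shaped like note-taker output (invented for illustration).
calls = [
    {"prospect": "Acme Co", "date": "2024-05-01",
     "transcript": "Discussed pricing tiers and onboarding timeline."},
    {"prospect": "Globex", "date": "2024-05-03",
     "transcript": "Asked about integrations and raised pricing concerns."},
]
prompt = build_prompt("Which prospects raised pricing concerns?", calls)
```

The point of the sketch is the design choice Josh lands on: rather than building a bespoke workflow per task, you centralize context (transcripts, CRM data, chat history) and let the model decide what to do with it.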
Pete Buer: Well, thank you. This was terrific. I was learning along the way as we were talking, so I appreciate you bringing me on the journey.
Josh Payne: Yeah, absolutely. Super fun to do it. Thanks.
Pete Buer: Awesome. Thank you, Josh. Take care.
Courtney Baker: Thanks as always for listening and watching. Don’t forget to give us a five-star rating on your podcast player of choice. As always, we’d really appreciate it if you can leave a review and/or share this episode on social media.
Speaker: At the end of every episode, we like to ask one of our AI friends to weigh in. So, hey, Perplexity. Bad news. I got rid of comment, but that’s beside the point. This week we’re talking about the [00:31:00] widening AI value gap. Why do you think only 5% of companies are getting it right?
Elizabeth: Most companies rush to adopt AI without aligning it with clear goals or systems that actually use the data well. The few that succeed usually started smaller, tested relentlessly, and embedded AI into real decision-making instead of treating it like a shiny add-on. And now you’re in the know. Thanks as always for listening. We’ll see you next week with more AI applications, discussions, and experts.