If you listen to conference talks, it sounds like every company is “transforming with AI.”
- Boards are asking about it.
- Exec teams are funding it.
- Teams are piloting it.
McKinsey’s latest State of AI 2025 survey backs that up:
- 88% of organizations now report regularly using AI in at least one business function.
But, Houston, we have a problem.
That same research says:
- Most organizations are still stuck in pilots or small experiments.
- Only about one-third have begun scaling AI across the enterprise.
- Just 39% report any EBIT impact at the enterprise level, even though many see small, use-case-level cost and revenue benefits.
A Wall Street Journal analysis of the same trend is even more blunt:
- 78% of companies use AI in at least one function.
- Yet most report less than 10% cost savings and under 5% revenue uplift from those efforts.
That’s the paradox:
AI is everywhere, and still barely visible on the income statement.
Let’s unpack why.
1. AI Has Gone Mainstream. Impact Hasn’t
First, the “everywhere” part isn’t hype:
- McKinsey: AI adoption jumped from ~50-60% of companies in 2019-2023 to ~72-88% using AI in at least one business function by early 2024-2025.
- Adoption is global: across regions, more than two-thirds of organizations report using AI somewhere in the business.
AI is now:
- In your CRM
- In your support stack
- In your productivity tools
- And increasingly in “AI assistant” layers across SaaS
But when you ask:
“What has this actually done for revenue growth, margin, or cost-to-serve?”
McKinsey’s answer is: the transition from pilots to scaled impact is still a “work in progress” for most organizations.
So you get a familiar pattern:
- AI in tools
- AI in slideware
- Almost no AI in the financial narrative
2. The Myth That “Model Quality” Is the Problem
When AI projects underperform, the reflex explanation is:
- “The model isn’t accurate enough yet.”
- “The tech will mature in 12-24 months.”
- “Once we upgrade to the next-gen model, we’ll see the value.”
McKinsey’s research says… not really.
Across multiple surveys, the top barriers to realizing AI value have been remarkably consistent:
- Lack of clear AI strategy
  - No coherent view of where AI should be used and why.
  - Fragmented, one-off projects with no portfolio logic.
- Talent and skills gaps
  - Shortage of people who can bridge business and AI (“translators”).
  - Underinvestment in upskilling the broader workforce, not just an AI pod.
- Weak risk management and governance
  - Unclear policies on data, usage, approvals, and logging.
  - Risk and compliance pulled in late as blockers, not early as design partners.
- Data readiness
  - Data scattered across silos, with unclear quality and ownership.
  - Minimal observability: no consistent logging, metrics, or monitoring.
Notice what’s missing from the top of the list:
- “Model quality”
McKinsey’s own framing: the main obstacles are strategy, talent, risk, and data (the operating model), not the sophistication of the algorithms.
In other words:
If your AI isn’t showing up in the P&L, odds are high it’s a leadership and workflow problem, not a GPU problem.
3. What “Everywhere but Nowhere” Looks Like Inside a Company
You can usually spot this pattern up close.
AI is used, but not owned
- Teams can say:
- “Yes, we’re using AI in sales, support, and marketing.”
- But there’s no:
- Single business owner accountable for a KPI.
- Clear value hypothesis (“this will move this number by this much”).
You hear:
- “We’re experimenting in a lot of areas.”
- “We’ve got pockets of excitement.”
You don’t hear:
- “AI is how we cut cost-to-serve by 15%.”
- “AI is why our NRR improved by 4 points.”
Every function has AI, but nobody rewired the work
- Sales still runs the same cadence, just with AI-drafted emails.
- Support still measures the same metrics, just with AI-suggested replies.
- Ops still uses the same approvals and handoffs, just with AI summaries.
So:
- Individuals feel a bit faster.
- The system still burns time and money at the same rate.
This is worth calling out explicitly: to capture material value, companies need to “rewire” workflows and operating models, not just plug AI into existing processes.
Finance sees cost, not contribution
- AI shows up in:
- Vendor invoices
- Cloud spend
- Capitalized development cost
…but it doesn’t show up as:
- A credible, quantified contributor to:
- Revenue
- Gross margin
- Opex efficiency
So to the CFO, AI is:
- A cost center
- A strategic “must-do”
- Not yet a line in the story of why this company is performing better
That’s “nowhere on the income statement” in practice.
4. The Real Barriers
If we boil the surveys and follow-on analysis down to what keeps AI off the P&L, it looks roughly like this:
1) No coherent AI strategy
- AI projects don’t stack into a portfolio; they’re scattered bets.
- There is no explicit answer to:
- “Which 3-5 parts of the business must be transformed with AI?”
- “Which parts will we ignore on purpose for now?”
2) Talent gaps and missing translators
- Strong engineers, but:
- Few people who can translate between P&L goals and AI capabilities.
- Little investment in teaching non-technical teams how to work with AI.
- Leaders underestimate the people side:
- One McKinsey workplace report found only 1% of companies consider themselves “mature” in AI deployment, even though nearly all are investing.
3) Risk and governance that freeze scale
- Policy vacuum:
- No consistent rules on where AI is allowed, how it’s reviewed, or how to escalate issues.
- Legal/compliance show up at the end:
- For many companies, that’s where use cases go to die, or get confined to low-impact areas.
4) Data that can’t carry real workloads
- AI pilots can cheat:
- Manual data exports
- Hand-curated datasets
- Shortcut integrations
- Production can’t:
- Needs clean-enough, governed, and observable data flows.
- Needs basic event logging and metrics that risk and audit can trust.
Put together, this describes a world where:
- Tools have diffused fast,
- But only a subset of firms have rewired enough of their strategy, talent, risk, and data to see revenue and cost impact at scale.
5. Quick Self-Check: Is AI Actually on Your Income Statement?
You don’t need a 50-page diagnostic to get a first signal. Ask yourself, for each AI initiative you’re proud of:
A. Could you defend its financial impact to your CFO?
- Can you say:
- “We reduced X by Y%”
- Or “We increased Z by Y%”
- With:
- A clear baseline
- A measured after-state
- A story that isolates the AI contribution?
If not, it’s probably “nice”, not “material.”
B. Does it touch a high-leverage workflow?
- Does this AI use case sit in:
- Revenue-critical flows?
- Cost-heavy operations?
- Risk-heavy decisions?
Or is it mostly:
- Internal convenience
- Edge scenarios
- “Innovation theater”
Low-leverage workflows lead to low-leverage economics.
C. Who owns it beyond IT?
- Is there a business owner who:
- Talks about this AI system in their plan to hit targets?
- Feels accountable if it fails to deliver?
Or is ownership:
- A committee
- A project team
- A CoE with no P&L
If no one outside IT owns it, it’s unlikely to become financially important.
D. Did work actually change, or just the tools?
- Are steps removed, approvals simplified, queues restructured?
- Are roles redefined with explicit:
- What AI does
- What humans do
- How performance is measured?
If the workflow is the same, but there’s just an AI button somewhere, expect marginal gains at best.
E. Could you turn it off without hurting the business?
- If you shut this AI system down tomorrow:
- Would revenue decline?
- Would costs rise?
- Would risk measurably increase?
If the honest answer is “we’d be fine,” then it’s not yet meaningful, no matter how good the demo looks.
6. Where The Scale Crew Fits In
The story McKinsey’s data tells is not “AI doesn’t work.”
It’s:
“AI is spreading faster than organizations are rewiring themselves to use it well.”
At The Scale Crew, we work with US startups, SMBs, and mid-market companies that:
- Are done with AI theater
- Feel the pressure to “do something with AI”
- Don’t want to spend the next 12-24 months adding AI everywhere except the income statement
We don’t show up with an answer already decided.
We start with questions like:
- Should AI even be involved in your top KPI right now?
- If yes, is there a cheaper “boost what you already have” path before “build”?
- If you did build or configure something, what would have to change in strategy, people, risk, and data for it to actually show up in your numbers?
That’s what our AI Readiness & Transformation Program is designed to do:
- Separate “we use AI” from “AI moved our business.”
- Help you decide where AI belongs, where it doesn’t, and what it would really take to avoid just adding more pilots to the pile.
If AI Is “Everywhere” in Your Org but Hard to Find in Your P&L
We’ll help you see:
- Whether AI belongs near that KPI at all
- Whether your real blockers are strategy, talent, risk, or data
- And what would need to change for your AI story to be visible on the income statement, not just in the slide deck.

