Your AI Strategy Is Collecting Dust
Most AI strategies fail before implementation starts. Not because the ideas are wrong - because the strategy was built for a board deck, not for the people doing the work. Here's how to build one that actually gets executed.
TL;DR: 80% of AI projects fail. Not because AI doesn’t work - because strategies are built for board decks, not for the people doing the work. The fix: start with operations, pick one pilot, give ownership to business leads (not IT), and plan for 90 days instead of 3 years. Sometimes the best AI strategy is fixing your data first.
Most AI strategies are slide decks.
They get presented to the board. Everyone nods. Someone says “this is exciting.” Then the deck goes into a shared drive and nothing happens for six months until someone asks “whatever happened with our AI strategy?”
I’ve seen this pattern at pharma companies, agencies, and scale-ups. The strategy itself isn’t usually wrong - the ideas are fine. The problem is how it was built and who it was built for.
The Numbers Don’t Lie
Before we get into why, let’s look at the scale of the problem:
- 80% of AI projects fail to deliver intended business value, according to the RAND Corporation - double the failure rate of traditional IT projects
- 88% of AI pilots never reach production. Only 1 in 8 prototypes becomes an operational capability
- 42% of companies scrapped most of their AI initiatives in 2025, up from 17% the year before (S&P Global)
- BCG found that 74% of companies have yet to show any tangible value from their AI investments. Only about 5% create substantial value at scale
Global enterprises invested $684 billion in AI in 2025. Apply that 80% failure rate and over $547 billion of that spend failed to deliver. That’s not a technology problem - it’s a strategy and execution problem.
The Slide Deck Pattern
Here’s how it typically goes:
- Leadership decides the company needs an AI strategy
- Someone (internal or external) spends weeks researching “the AI landscape”
- A 40-page deck appears with quadrants, maturity models, and a 3-year roadmap
- The board approves it in principle
- Nobody knows what to do on Monday morning
- Six months later, nothing has changed
The deck answers the wrong question. It answers “what could AI do for us?” when the real question is “what should we do first, and who’s going to do it?”
Why Strategies Collect Dust
They start with technology, not problems. The strategy leads with “AI can do X, Y, Z” instead of “here’s what’s costing us time and money today.” Technology-first strategies generate excitement but no urgency. Problem-first strategies generate both. McKinsey’s 2025 State of AI survey found that organizations that redesigned workflows before selecting tools were twice as likely to report significant financial returns.
They’re built for executives, not operators. The people who read the strategy aren’t the people who’ll execute it. The warehouse manager, the HR coordinator, the finance analyst - they never saw the deck. They find out about “the AI initiative” through a vague email three months later. A Gallup poll found that only 15% of US employees report their workplace has communicated a clear AI strategy.
They try to do everything. Twenty opportunities across eight departments, all marked as “high priority.” When everything is a priority, nothing is. The team spreads thin, nothing ships, and after two quarters of no visible progress, the whole thing loses momentum.
Nobody owns anything. The strategy says “implement AI-powered document processing in Q3.” Who’s doing that? IT? Operations? The consultant who wrote the strategy? Without a named person who cares about the outcome, it’s just words. Projects with sustained CEO involvement achieve 68% success rates versus 11% for those that lose executive sponsorship.
The timeline is fiction. A 3-year AI roadmap is creative writing. The tools, capabilities, and costs will look completely different in 18 months. Planning three years out gives a false sense of control while delaying the only thing that matters: starting.
What Actually Works
BCG’s research across 1,250 companies boils it down to a ratio: AI success is 10% algorithms, 20% data and technology, and 70% people, processes, and cultural transformation. Most strategies spend 90% of their pages on the 10% that matters least.
Here’s what the research and my own experience point to:
Start with operations, not technology
Don’t ask “what can AI do?” Ask “what takes too long, costs too much, or breaks too often?”
Spend time with the people who do the work. Shadow them. Watch where they copy-paste between systems, where they wait for approvals, where they recreate documents that already exist somewhere. That’s where AI actually helps - not in some abstract “transform the business” way, but in a “this specific task takes 4 hours and should take 20 minutes” way.
McKinsey’s 2025 survey confirms this: the roughly 6% of organizations seeing real bottom-line impact from AI were nearly three times more likely to have fundamentally redesigned their workflows around AI - rather than bolting AI onto existing processes.
Pick one pilot, not five
One working example changes more minds than five plans. Pick the opportunity that scores highest on a simple matrix: business impact, feasibility, speed to value. Then do that one well.
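To make the matrix concrete, here is a minimal sketch of that scoring exercise in Python. The criteria weights, the 1-5 scale, and the example opportunities are all illustrative assumptions, not a prescribed methodology - the point is simply that ranking forces a single winner.

```python
# Hypothetical pilot candidates, each scored 1-5 on the three criteria
# from the matrix: business impact, feasibility, speed to value.
OPPORTUNITIES = [
    ("Invoice data extraction", 4, 5, 5),
    ("AI customer-service bot", 5, 2, 2),
    ("Contract clause review",  3, 4, 3),
]

# Assumed weights - adjust to your own priorities.
WEIGHTS = {"impact": 0.4, "feasibility": 0.3, "speed": 0.3}

def score(impact, feasibility, speed):
    """Weighted score across the three criteria."""
    return (WEIGHTS["impact"] * impact
            + WEIGHTS["feasibility"] * feasibility
            + WEIGHTS["speed"] * speed)

# Rank candidates; the top one is the pilot, the rest are explicit "not yet".
ranked = sorted(OPPORTUNITIES, key=lambda o: score(*o[1:]), reverse=True)
for name, *criteria in ranked:
    print(f"{score(*criteria):.1f}  {name}")
```

Notice that the flashiest idea (the customer-service bot) loses to the boring one that's feasible and fast. That's the matrix doing its job.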
The pilot serves two purposes. First, it delivers actual value. Second - and this is more important - it teaches the organization what “using AI” actually looks like. It demystifies it. Once people see a real process improved, the next five initiatives sell themselves.
The research backs this up: organizations that demand clear success metrics before approving a project see a 2.4x improvement in outcomes. Those that conduct formal data readiness assessments see a 2.6x improvement. Starting small and measured beats starting big and vague.
Make business owners the AI owners
The HR director owns the HR AI initiative. The finance lead owns the finance automation. Not IT. Not a central “AI team.” Not an external consultant.
Why? Because the person who owns the process understands the nuances - the exceptions, the edge cases, the reasons things work the way they do. And because they’re the one who’ll maintain it after the initial excitement fades. If they don’t care enough to own it, the initiative isn’t worth doing.
IT supports. The consultant builds. But the business owner decides, prioritizes, and lives with the result.
Plan for 90 days, not 3 years
Nobody knows what AI will look like in 3 years. The models are getting cheaper and more capable every quarter. The tools available today weren’t available six months ago.
Plan for 90 days:
- Month 1: Foundation work (data access, permissions, tool selection) + pilot kickoff
- Month 2: Pilot execution and measurement
- Month 3: Results review, learnings, decision on what’s next
Then do another 90 days. This compounds faster than a 3-year plan because you’re building on real results, not projections.
Kill ideas explicitly
Every strategy session generates more ideas than you can execute. Most strategies handle this by putting everything on a “Phase 2” list that never happens.
Instead, explicitly decide what you’re NOT doing and why. “We considered AI-powered customer service but parked it because our ticket volume doesn’t justify the investment yet” is a real decision. It prevents the “but what about…” conversations that derail focus for the next six months.
Write it down. A list of things you said no to is one of the most valuable parts of any strategy.
The Anti-Patterns
If you’re evaluating an AI strategy (yours or one a consultant is pitching you), watch for these:
“AI maturity model” - A quadrant showing you’re at level 1 and need to reach level 4. These are frameworks for selling consulting, not for building anything. Your maturity level is irrelevant. What matters is: do you have a problem, can AI help, and are you ready to try?
“AI Center of Excellence” - A new team whose job is to “drive AI adoption.” This centralizes responsibility away from the people who actually do the work. It creates a bottleneck and a political layer. AI initiatives should live in the business functions, supported by shared technical resources.
“3-year transformation roadmap” - A detailed plan for a world that won’t exist. Use it for board-level direction, but don’t pretend anyone will follow it past month 4. Gartner predicts over 40% of agentic AI projects will be canceled by 2027 due to escalating costs, unclear value, or inadequate controls. The roadmap won’t survive contact with reality.
“We need to train everyone on AI first” - Training without a concrete use case is entertainment, not education. Train people on the specific tool they’ll use for the specific task it’s automating. Just-in-time, not just-in-case.
No mention of data - If the strategy doesn’t address whether the company’s data is clean, accessible, and governed, it’s incomplete. Gartner predicts that 60% of AI projects will be abandoned by the end of 2026 due to lack of AI-ready data. And poor data quality costs organizations an average of $12.9 million per year.
The Honest Assessment
An AI strategy doesn’t have to mean launching pilots across every department and sending the whole company on AI courses. It might mean something much more practical:
Phase 1 might be using AI to fix your data. If your product data is scattered across 8 systems, 25% outdated, and nobody agrees on which source is correct - that’s your AI strategy. Use AI-powered tools to identify duplicates, flag inconsistencies, map data flows, and build the governed foundation that makes everything else possible. AI is remarkably good at the tedious work of data profiling, classification, and quality monitoring. Doing this first means your future AI initiatives are built on solid ground instead of sand.
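To show what "use AI-powered tools to identify duplicates and flag inconsistencies" looks like at the smallest scale, here is a sketch of cross-system data profiling. The field names, sample records, and two-system setup are invented for illustration - real profiling runs across your actual exports - but the core move is the same: group records by a shared key and surface where sources disagree.

```python
from collections import defaultdict

# Imagine these rows were pulled from two of your product systems.
records = [
    {"sku": "A-100", "name": "Widget", "price": "9.99",  "source": "erp"},
    {"sku": "A-100", "name": "Widget", "price": "12.50", "source": "shop"},
    {"sku": "B-200", "name": "Gadget", "price": "4.00",  "source": "erp"},
]

def profile(rows):
    """Group rows by SKU and report fields that disagree between sources."""
    by_sku = defaultdict(list)
    for row in rows:
        by_sku[row["sku"]].append(row)

    conflicts = {}
    for sku, group in by_sku.items():
        for field in ("name", "price"):
            values = {r[field] for r in group}
            if len(values) > 1:  # same SKU, different answers
                conflicts.setdefault(sku, {})[field] = sorted(values)
    return conflicts

print(profile(records))
```

Every conflict this surfaces is a decision for a business owner: which source is authoritative? That list of decisions, not the script, is the deliverable.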
Phase 1 might be process documentation. Many companies don’t actually know how their own processes work end-to-end. Use AI to help map, document, and identify bottlenecks in existing workflows. You can’t automate what you don’t understand.
Phase 1 might be “let people experiment.” For a 30-person company, the right AI strategy might genuinely be: give everyone access to ChatGPT or Claude, see what sticks, then formalize the things that work. Not every organization needs a formal strategy engagement.
The point is: a good AI strategy meets you where you are. If your data is a mess, cleaning it up IS the strategy. If your processes are undocumented, mapping them IS the strategy. The worst thing you can do is skip the foundation work and jump straight to flashy pilots that fail because the basics aren’t in place.
The World Economic Forum puts it bluntly: fewer than one in five organizations report high maturity in any aspect of data readiness. For most companies, the honest first step is getting their house in order - and AI itself can be the tool that helps them do it.
The best outcome of a strategy engagement is sometimes “here’s what you need to fix first, and here’s how AI can help you fix it.” That’s not a failure - it’s saving you from spending money on something that would fail.
And here’s the thing: if you’re using AI to get your data in order, map your processes, and identify opportunities - you’re already executing on your AI strategy. You just skipped the slide deck.
Sources
Research and statistics referenced in this article:
- RAND Corporation - AI Project Failure Rate (2025) - 80.3% overall failure rate
- MIT - The GenAI Divide: State of AI in Business 2025 - 95% of GenAI pilots failing
- S&P Global (2025) - 42% of companies scrapped most AI initiatives
- BCG - The Widening AI Value Gap (Sept 2025) - 10-20-70 principle, 74% showing no tangible value
- McKinsey - State of AI 2025 - Workflow redesign strongest predictor of impact
- Gartner - Lack of AI-Ready Data (2025) - 60% of projects abandoned due to data readiness
- World Economic Forum - Data Readiness as Strategic Imperative (2026) - Fewer than 1 in 5 report high data maturity
- Gallup - AI Strategy Communication - Only 15% of employees aware of AI strategy
If you’re starting from zero and want to build an AI strategy that actually gets implemented - not a deck that gets filed - get in touch. I run structured discovery and workshops that end with a concrete action plan, named owners, and a 90-day roadmap. No quadrants, no maturity models, no 50-page PDFs.