U.S. businesses have spent between $35 billion and $40 billion on AI initiatives. The return? MIT ran the numbers. Zero. For 95% of companies.
This isn't just an American problem. In the UK, 70% of businesses said they were unsure whether AI is delivering its full potential. In Australia, 93% of organisations cannot effectively measure AI's ROI. In Canada, only 2% of companies are already seeing a return on investment from their generative AI investments.
MIT's latest research reveals something unexpected: the technology works fine. The missing piece? Leadership communication and engagement strategies that create the cultural foundation AI needs to deliver results.
After 20 years working across business, government, and education sectors, I've repeatedly observed the same pattern undermine major initiatives. Leadership announces something transformative. Teams aren't engaged early enough, so it feels handed down. Communication is clear at the beginning, whilst a project team drives implementation. Then that communication dries up.
People are left managing something they had no input into and often don't believe is fit for purpose.
With AI, there's an additional complexity: legitimate fear.
Many organisations initially feared and banned AI. As companies now introduce it strategically, employees understandably worry about job security.
89% of workers express concern about AI's impact on their job security. 43% know someone who has lost a job because of AI. This isn't theoretical anxiety.
Here's what most organisations miss: AI doesn't fix existing problems. It magnifies them.
McKinsey found that 51% of companies cite poor data quality as their primary AI barrier. Canadian organisations echo this challenge, with 49% identifying data quality and availability as the most common obstacle when moving from AI pilots to full launch. But that statistic masks the deeper issue.
Most organisations have struggled with data for years — how to gather it, protect it, manage it, and use it effectively. When AI is introduced on top of already weak systems, the technology reveals the gaps and limitations that may have been manageable before.
30% of total enterprise time gets spent on non-value-add tasks because of poor data quality. 82% of respondents spend one or more days per week resolving data quality issues.
This isn't a data problem. It's a communication problem.
Legacy systems and weak data governance create poor data quality. Without disciplined data management, organisations often develop shadow ecosystems of multi-tab spreadsheets, maintained by well-intentioned subject matter experts, that feed critical processes.
This pattern emerges when teams aren't engaged early and communication channels weaken over time.
People are understandably concerned that AI will take jobs. Across industries, company culture reflects a significant transition period: post-pandemic, and into an era of rapid technological change.
The opportunity for leaders is to emphasise that AI only works alongside skilled professionals who remain in the driver's seat.
Conveying that reality in a way people trust requires strong leadership communication.
54% of employees say their employer is only "somewhat transparent" about AI adoption plans. Whilst 56% receive role-specific AI training, only 38% feel fully supported in adapting to AI-driven workplace changes.
Organisational psychologist Brian Smith notes that "pervasive fear typically indicates a lack of open communication and the presence of unrealistic expectations."
Gartner's research confirms this: "Employee concerns are not fear of the technology itself, but fear about how their company will use the new technology."
When frontline people are engaged early about data challenges, they can help find solutions and improve processes.
The communication opportunity: frontline staff often aren't aware of how data is used across the organisation. Creating communication flows that connect these dots unlocks invaluable insights.
Yes, data management is costly. But it's significantly less costly than AI that doesn't deliver.
When AI is deployed on top of unresolved system issues, organisations face extensive remediation efforts. The share of companies abandoning the majority of their AI initiatives has skyrocketed from 17% in 2024 to 42% in 2025.
In Australia, enterprise-wide AI transformation remains the exception rather than the norm, creating a mismatch between the loud global story about AI transformation and the quieter reality inside firms of "experiments, pilots and a lot of waiting around for real productivity gains to show up."
MIT's research points to flawed enterprise integration as the core issue—not the quality of AI models. The problem is the "learning gap" for both tools and organisations.
Executives often blame regulation or model performance. Generic tools like ChatGPT stall in enterprise use because they don't learn from or adapt to workflows.
The solution is strategic preparation, not additional technology.
McKinsey data shows that workflow redesign—not just tool deployment—has the biggest effect on organisations' ability to see EBIT impact from AI.
This means embedding strong communication practices across teams before deploying AI. It means creating feedback loops that identify issues before they escalate into costly problems.
Engage early and continuously. Don't hand down AI initiatives. Involve teams in identifying problems and designing solutions from day one.
Create transparent communication flows. Frontline staff need to understand how their work connects to data use down the line. Leadership needs to hear frontline insights about what's actually broken.
Address fear directly. With 89% of workers concerned about job security, leaders have an opportunity to communicate clearly that AI amplifies human capability rather than replacing it. Demonstrating how people remain in control builds trust and engagement.
Establish continuous feedback loops. The CARE framework (Communication, Accountability, Recognition, Engagement) provides ongoing sentiment tracking and coaching based on real team feedback—not one-time training that disappears.
Empower managers at all levels. AI success isn't just a C-suite responsibility. In Australia, 64% of organisations have not provided any AI training, and 200,000+ managers across Australia lack dedicated communications teams. Democratising communication support means better AI outcomes.
McKinsey's research showed the number one factor correlated with bottom-line AI impact is CEO accountability.
Yet 46% of organisations lack structured ROI frameworks even as 70% place AI at the heart of their business strategy.
In Canada, more than half (57%) of business leaders said one of their biggest challenges in implementing AI is understanding how to capture value from the technology, with less than four in ten saying their organisation has a clear plan on how to extract value from generative AI.
Measurement requires clear communication. Returns on technology investment require strong cultural foundations.
Harvard Business Review research shows that 70% of employees feel more engaged when management consistently updates and communicates company strategies. MIT research confirms that employee disengagement costs a median-size S&P 500 company between $228 million and $355 million annually in lost productivity.
Better communication equals better business outcomes. It's that simple.
Organisations that see AI returns share common characteristics: they redesigned workflows, engaged teams early, maintained transparent communication, and created cultural foundations before deploying technology.
They treated AI as a partner requiring human expertise, not a replacement for it.
They understood that proactive engagement across teams prevents issues from escalating.
The $35 billion to $40 billion in AI investment doesn't have to sit with the 95% seeing zero return. The solution starts with communication, not code.
Every team member deserves to be heard, to feel valued and cared for. AI can give managers superpowers—but only when built on a foundation of strong communication culture.
The question isn't whether the AI is sophisticated enough. The question is whether the communication strategy supporting it is equally robust.