AI Adoption: Why a Weak Organisational Foundation Guarantees Project Failure

Jordan Kelly • November 8, 2025

Unreadiness: The Ultimate Political and Financial Liability

The urge to grab the latest AI fix for chronic government sluggishness is huge. But AI adoption fails a lot.

The problem is rarely the tech itself; it's the agency's lack of procurement maturity - the basic readiness to actually deploy and sustain Artificial Intelligence once the contract is signed.


Buying complex AI solutions when foundational readiness is weak is a near-guaranteed way to waste taxpayer money. In the currently tight economies of almost every Western nation, the public is increasingly sensitive to the wastage of public funds.


Project failure stands to severely damage public and stakeholder confidence in an agency, or even in the government as a whole, if the application and the issue are big enough.


The Four Pillars of AI Maturity


An agency's AI maturity is defined by its readiness across four key pillars.

It's incumbent upon procurement professionals and other relevant members of an evaluation committee to assess these pillars as non-negotiable operational prerequisites, the "readiness factors" for any vendor partnership.


1.  Strategic Alignment: Solving the Right Problem


Many organisations fall into the trap of purchasing AI for AI's sake. But AI is a tool, not a strategy.


  • The Solution Trap

    Success begins with defining a clear, measurable public service outcome.

    Procurement teams must challenge vendors to define the exact public benefit that the AI solution will deliver, as well as to demonstrate a thorough risk analysis and mitigation strategy.

    Both the vendor and the agency must ensure the projected expenditure targets a systemic problem, not just a superficial upgrade.


  • The Operational Mandate

    Alignment must combine top-down policy goals with bottom-up operational insights to ensure the solution actually works "on the ground" for the taxpayer.

2.  Technical and Human Literacy: Managing the Skills Gap


AI tools don't replace human experts; they demand them.


  • The Talent Crisis

    According to the 2024 Salesforce Public Sector Survey of global government AI professionals, 60 percent report that a lack of relevant skills hampers their ability to apply AI solutions.


    Maturity means staff must have the literacy to operate the AI and, even more importantly, the critical thinking capacity to interpret its output and spot flawed decisions.

  • The Operational Mandate

    Agencies must budget for continuous upskilling.

    Investment in AI must be immediately matched with a dedicated and fully funded human capital strategy to train staff on the new tools and the data science principles that underpin them.

3.  Data Governance: Beyond the One-Time Clean-Up


The fundamental principle is simple: "Garbage In, Garbage Out" (GIGO).

A 2024 global survey of data professionals by Precisely and Drexel University reported that an estimated 70 percent of organisations struggle to trust the data they rely on.


Maturity means setting up continuous data governance policies, not just a single, initial cleaning event.

  • The Operational Mandate

    Agencies must track data lineage (where data comes from and its integrity) and auditability; a minimal illustrative sketch of what that can look like in practice follows at the end of this pillar.

    Without clean, reliable data, the AI simply automates GIGO faster, making the agency less efficient and compounding the cost of failure.
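
To make lineage and auditability a little more concrete, the following is a minimal Python sketch, for illustration only, of a recurring data quality check that records where a dataset came from and logs the outcome of every validation run as an audit trail. Every dataset, system and field name in it is hypothetical; a real agency pipeline would be considerably more involved.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    dataset: str                                  # hypothetical dataset name
    source_system: str                            # upstream system the extract came from
    extracted_at: str                             # when the extract was taken
    checks: list = field(default_factory=list)    # audit trail of validation runs

def run_quality_checks(record: LineageRecord, rows: list) -> bool:
    """Run simple, repeatable checks and append the outcome to the audit trail."""
    total = len(rows)
    missing_ids = sum(1 for r in rows if not r.get("citizen_id"))
    unique_ids = {r.get("citizen_id") for r in rows if r.get("citizen_id")}
    duplicate_ids = (total - missing_ids) - len(unique_ids)
    passed = total > 0 and missing_ids == 0 and duplicate_ids == 0

    record.checks.append({
        "run_at": datetime.now(timezone.utc).isoformat(),
        "rows": total,
        "missing_ids": missing_ids,
        "duplicate_ids": duplicate_ids,
        "passed": passed,
    })
    return passed

# Example usage with hypothetical names; in practice this runs on every refresh,
# not once at project kick-off.
record = LineageRecord(
    dataset="benefit_claims_2025",
    source_system="legacy_case_management",
    extracted_at="2025-11-01T02:00:00Z",
)
rows = [{"citizen_id": "A001"}, {"citizen_id": "A002"}]
print(run_quality_checks(record, rows))   # True
print(record.checks[-1])                  # full audit entry for this run

The essential point is the scheduling: a check like this runs continuously, on every refresh of the data, which is precisely what separates ongoing governance from a one-time clean-up.
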

4.  Legacy Infrastructure: The Integration Tax


Legacy systems are the silent killer of AI projects.


  • The Hidden Cost

    The U.S. Government Accountability Office (GAO) has repeatedly highlighted in audits of major Federal departments that up to 80 percent of IT budgets go towards maintaining aging legacy systems.


    Every new AI tool must connect to this aging core, incurring a high "integration tax"; a brief illustrative sketch of that integration layer follows at the end of this pillar.

  • The Operational Mandate

    Maturity requires a rigorous audit of existing legacy IT systems to predict the integration cost before contracting.

    If the new AI solution adds complexity or creates new data silos, the "efficiency" promised by the vendor will be eaten alive by the integration cost.
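
For illustration only, the sketch below shows the kind of adapter code that typically ends up sitting between a new AI service and an aging legacy system. All of the names are hypothetical; the point is that every legacy quirk handled in this layer - odd headers, padded identifiers, stray whitespace - is recurring integration work that the agency, not the vendor, pays to build and maintain.

import csv
import io

class LegacyCaseSystemAdapter:
    """Wraps a hypothetical legacy system's nightly CSV export in a clean interface."""

    def __init__(self, export_text: str):
        # In practice this text would be pulled from the legacy host itself.
        self._export_text = export_text

    def fetch_cases(self) -> list:
        # Translate the legacy export's quirks (fixed headers, zero-padded IDs,
        # trailing spaces) into the structure the new AI tool expects. Every quirk
        # handled here is ongoing maintenance, not a one-off task.
        reader = csv.DictReader(io.StringIO(self._export_text))
        return [
            {
                "case_id": row["CASE_NO"].strip().lstrip("0"),
                "status": row["STAT_CD"].strip().lower(),
            }
            for row in reader
        ]

# Example usage with a fabricated export:
legacy_export = "CASE_NO,STAT_CD\n000123,OPEN \n000124,CLSD\n"
print(LegacyCaseSystemAdapter(legacy_export).fetch_cases())
# [{'case_id': '123', 'status': 'open'}, {'case_id': '124', 'status': 'clsd'}]
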


A Phased, Operational Approach


For vendors chasing high-value government contracts, the pitch must move past simple technological capability.


The modern strategic vendor partners with the agency to diagnose its maturity level before proposing a solution.


Agencies must adopt a phased, operational approach; initial spending should focus on preparing the foundation (data, skills, systems). Only when this foundational readiness is confirmed should the full technological solution be deployed.


This maturity assessment must become a mandatory pre-condition in the Artificial Intelligence procurement process, shifting the focus from buying tech to building sustainable, intelligent operational capacity that truly delivers for the taxpayer.
