From 9% Problem to 900% Return
In an article for Digital Economy Dispatches, Alan Brown highlights a “9% problem” - only a small minority of organisations are converting AI experiments into scaled impact, despite billions spent on digital and data initiatives. In UK local government, that looks like pilots that never move beyond a corner of the organisation, fragmented data foundations, and frontline teams who experience AI as yet another system to work around rather than with. The UK’s AI problem is not a lack of tools. It is a lack of real, measurable outcomes in real services.
Hey Geraldine shows what happens when that gap is closed.
Hey Geraldine, built with Peterborough City Council's adult social care team, sits firmly in that 9% - the minority of AI initiatives that deliver disproportionate value against modest investment. In a landscape saturated with generic copilots and platforms, it is rare not because the technology is exotic, but because the outcomes are.

What Makes Hey Geraldine Unusual
Most AI tools arrive as “solutions” in search of problems. Hey Geraldine started with a very specific, human one: a single expert, Geraldine Jinks, holding 30 years of tacit knowledge and being overwhelmed by constant requests from colleagues.
Three things make the tool stand out:
Hyper-specific purpose. It is a personalised AI assistant for adult social care staff, answering practice questions and referral queries 24/7 in Geraldine’s voice, not a generic chatbot bolted onto a website.
Built in weeks, not years. Peterborough, Datnexa and partners took it from concept to live use in around six weeks, using pragmatic AWS-native components rather than chasing “cutting-edge” complexity.
Governance by design. DPIAs, DPAs and information security processes were completed before testing; data is encrypted, remains within clearly defined boundaries, and is not used to train external models.
In other words, Hey Geraldine is rare precisely because it combines frontline relevance, speed, and serious governance in a single deployment - something many organisations still treat as an impossible trade-off.
Outcomes That Actually Move the Needle
For UK public services, the test is not whether an AI tool is clever; it is whether services run better, staff experience improves, and money is used more intelligently. On that test, Hey Geraldine is an outlier.
15 minutes saved per conversation. During the initial six-week trial, the assistant handled around 1,200 staff queries, saving roughly 15 minutes per conversation - more than 300 hours of practitioner time returned to direct resident support.
Referral backlog cleared. By answering common questions instantly and consistently, Hey Geraldine enabled the adult social care team to clear a long‑standing referral backlog, unblocking hospital discharges and speeding up support for residents.
£250k+ cost avoidance in year one. Time savings, reduced reliance on agency staff and more efficient decision‑making combined to deliver over £250,000 in cost avoidance in the first year of operation.
900% ROI. Against the modest cost of design, build and operation, the council realised roughly a 900% return on investment - a level of payback that places Hey Geraldine in the very top tier of public sector digital initiatives.
These are not proxy metrics or “engagement scores”; they are operational and financial outcomes that directly address the fiscal and workforce pressures described in national assessments of AI’s potential economic impact.
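The headline figures above are easy to sanity-check. The sketch below reproduces the arithmetic using the query volume, time saving and cost avoidance reported in this article; the project cost is a hypothetical assumption introduced only to illustrate the ROI formula, since the article does not publish the actual build and run cost.

```python
# Back-of-envelope check of the article's headline figures.
# Query volume, minutes saved and cost avoidance come from the article;
# the project cost is an ASSUMED illustrative figure, not a published one.

queries = 1_200          # staff queries handled during the six-week trial
minutes_per_query = 15   # time saved per conversation

hours_saved = queries * minutes_per_query / 60
print(f"Practitioner hours returned: {hours_saved:.0f}")  # → 300

cost_avoidance = 250_000  # first-year cost avoidance (£), from the article
project_cost = 25_000     # ASSUMED total design/build/run cost (£)

# ROI expressed as net benefit over cost, as a percentage.
roi_percent = (cost_avoidance - project_cost) / project_cost * 100
print(f"Return on investment: {roi_percent:.0f}%")  # → 900
```

Note that a 900% ROI on these numbers implies total benefits of roughly ten times the project cost; the point is not the precise cost assumption but that the reported savings dwarf any plausible build budget for a six-week deployment.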
Why Tools Like This Are Still So Rare
If AI can do this in one adult social care team, why is Hey Geraldine still the exception rather than the norm?
Several patterns emerge when you contrast it with typical AI programmes across the UK:
Capability outpacing institutional readiness. Many organisations have access to powerful models, but lack the governance, data quality, and delivery discipline to deploy them safely at scale.
Foundations not fit for purpose. Mid‑market and public bodies often try to “add AI” on top of fragmented systems and unclear processes, turning AI into expensive experimentation rather than a driver of real value.
Hype over hard problems. Leaders are pushed toward eye‑catching pilots rather than the unglamorous bottlenecks - like referral backlogs - where AI can quietly release capacity and improve outcomes.
Hey Geraldine breaks this pattern. It demonstrates that you do not need a “moonshot” to get to 900% ROI; you need one unambiguous problem, trusted domain experts and a delivery model that treats governance and user adoption as first‑order design constraints.
What This Signals for UK AI Readiness
The 9% problem is not inevitable. Projects like Hey Geraldine show a different path for UK public services and mid-market organisations that want AI that actually works:
Start with a measurable operational constraint, not a technology mandate.
Build with the people who own the work today, capturing not just what they know but how they think and speak.
Use cloud‑native, well‑governed components so information security and compliance are solved once, not re‑negotiated on every pilot.
Judge success in hard terms: minutes saved, backlogs cleared, avoidable costs removed, returns quantified.
As the UK grapples with low productivity growth and mounting service pressures, AI readiness will increasingly be defined not by the number of models deployed, but by the number of services that can point to Hey Geraldine‑style results. That is the bar Datnexa believes should become normal - and the reason tools like Hey Geraldine still stand out as rare, for now.