AI, Social Prescribing and a new kind of front door


On Social Prescribing Day, Datnexa and Peterborough City Council shared an early look at a new AI‑powered social prescribing chatbot – a small, web‑based window into a much bigger shift in how residents find non‑clinical support in their communities.


Social prescribing is already doing heavy lifting in prevention: helping people live well and independently, tackling the social and economic factors behind poor health, and taking pressure off clinical services. But across any local area, the reality is a messy, fragmented “corpus” of support – council services, VCSE offers, community groups, apps, digital tools – that is hard to see as a whole and harder still to navigate consistently.


Our starting point in Peterborough was simple: there is more support out there than most people realise, including many services that are commissioned but under‑used. The problem is not just provision; it’s discovery. Residents, carers and professionals all struggle to see what’s available, what’s current and what’s right for a specific situation at a specific moment.



Meeting people where they are


The chatbot you see in the video is our answer to that navigation challenge.

Rather than expecting someone to show up saying “I need social prescribing”, the agent is designed to recognise and respond to the everyday ways people actually talk about their lives: feeling lonely, worried about money, caring for a partner, unsure how to cope after a fall.


The conversation is deliberately natural. People can type in their own words; the agent responds in plain English, asks clarifying questions and gently steers towards practical options. We’ve designed it using a strengths‑based coaching approach, so the tone is supportive and encouraging rather than transactional or clinical.


Behind the scenes, the agent is drawing on a curated knowledge base that brings together content from Peterborough City Council and trusted partners. It reads that content, applies local weighting and priorities, and then surfaces activities and resources that best match what the person has described. At the end of the conversation, it can summarise the plan and package it up – for now as a downloadable PDF, in future through channels like SMS or email.
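To make that flow concrete, here is a minimal sketch of a single turn in Python. The resource entries, the plain keyword matcher and the plan text are all illustrative stand-ins – the real service uses a much richer curated knowledge base and retrieval logic:

```python
from dataclasses import dataclass

# Illustrative entries standing in for the curated knowledge base.
@dataclass
class Resource:
    name: str
    topics: set
    details: str

KNOWLEDGE_BASE = [
    Resource("Community Befriending Service", {"lonely", "isolation"},
             "Weekly volunteer visits and phone calls."),
    Resource("Money Advice Drop-in", {"money", "debt", "benefits"},
             "Free sessions at the Central Library, Tuesdays."),
    Resource("Carers' Support Group", {"caring", "carer"},
             "Peer support for people looking after a partner or relative."),
]

def match_resources(message: str) -> list:
    """Surface resources whose topics appear in the person's own words."""
    words = set(message.lower().split())
    return [r for r in KNOWLEDGE_BASE if r.topics & words]

def summarise_plan(matches: list) -> str:
    """Package the conversation's outcome as plain text (PDF, SMS or email later)."""
    lines = ["Your support plan:"]
    for r in matches:
        lines.append(f"- {r.name}: {r.details}")
    return "\n".join(lines)

matches = match_resources("I've been feeling lonely and worried about money")
print(summarise_plan(matches))
```

The point of the sketch is the shape, not the matching: someone describes their situation in everyday words, and the agent turns that into a small, packaged plan.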


At the moment, the chatbot lives in a simple web interface. That’s an interim choice, not the destination. The underlying “brains” can be surfaced in multiple places – on council and VCSE websites, via WhatsApp for professionals, or integrated into tools like Microsoft Teams to help staff draft replies to residents.
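One agent, many front doors: the separation can be sketched as a single “brain” behind thin channel adapters. The adapter names and formatting rules below are hypothetical, just to show the shape:

```python
from typing import Callable

# One shared "brain": a function from a resident's message to a reply.
def agent_brain(message: str) -> str:
    return f"Thanks for sharing. Let's look at options for: {message}"

# Hypothetical channel adapters; each presents the same reply for its medium.
def web_widget(message: str, brain: Callable[[str], str]) -> str:
    return brain(message)  # plain text straight into the chat window

def whatsapp(message: str, brain: Callable[[str], str]) -> str:
    return brain(message)[:1600]  # respect a message-length limit

def teams_draft(message: str, brain: Callable[[str], str]) -> str:
    # Staff-facing: present the reply as a draft for a human to edit.
    return "DRAFT (review before sending):\n" + brain(message)
```

Because the channels share one brain, improving the underlying agent improves every surface at once.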


Safety designed‑in: the safeguarding guardian


One of the most important design decisions has nothing to do with the chat window and everything to do with safety.


We’ve built the service using a dual‑agent pattern. One agent handles social prescribing queries. Alongside it runs a second agent – a safeguarding “guardian” – whose only job is to watch for signs that a conversation may involve risk or harm.


When the guardian spots something of concern, it can immediately step in, stop the automated part of the interaction and direct the person to human help instead. That might mean a phone number, an emergency contact route or another agreed escalation path, depending on local safeguarding policies.


This guardian pattern lets us be ambitious about what the main agent can do, while staying clear about the boundary between what is safe to automate and what must always be human‑led. As channels expand from text into voice and as the agent ingests more content over time, the safeguarding guardian stays in place as a non‑negotiable guardrail.
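A minimal sketch of the dual-agent pattern in Python. The risk signals, escalation wording and agent logic here are placeholder assumptions, not the production service – real escalation routes follow local safeguarding policy:

```python
# Placeholder risk signals; a real guardian uses far richer detection.
RISK_SIGNALS = {"hurt myself", "suicide", "unsafe at home", "violence"}

def guardian_check(message: str) -> bool:
    """Return True if the conversation may involve risk or harm."""
    text = message.lower()
    return any(signal in text for signal in RISK_SIGNALS)

def escalate() -> str:
    # Placeholder escalation path, agreed locally in the real service.
    return ("It sounds like you may need urgent support. "
            "Please call 999 in an emergency, or contact the "
            "safeguarding team using the number on the council website.")

def social_prescribing_agent(message: str) -> str:
    return f"Let's look at local support for: {message}"

def handle_message(message: str) -> str:
    # The guardian sees every message first and can stop the automated flow.
    if guardian_check(message):
        return escalate()
    return social_prescribing_agent(message)
```

The key design point is that the guardian sits outside the main agent: however capable the social prescribing agent becomes, the boundary check runs first on every message.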


Curated knowledge, not just clever tech


A lot of the hard work in this project isn’t glamorous AI; it’s the graft of knowledge management.


In Peterborough, the chatbot’s knowledge base pulls together information from multiple sources: local authority websites, community directories, voluntary‑sector partners and more. That content is then weighted so that the agent understands what is most trusted, most current and most relevant for different kinds of queries.
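As a rough sketch of what “weighting” can mean in practice, here is one way to combine trust, freshness and relevance into a single ranking score. The sources, weights and decay window are illustrative assumptions; real values would be set with local partners:

```python
from datetime import date

# Illustrative trust weights per source type.
SOURCE_TRUST = {"council": 1.0, "vcse_partner": 0.9, "community_directory": 0.8}

def score(entry: dict, query_terms: set, today: date) -> float:
    """Combine trust, freshness and topical relevance into one ranking score."""
    trust = SOURCE_TRUST.get(entry["source"], 0.5)
    age_days = (today - entry["last_verified"]).days
    freshness = max(0.0, 1.0 - age_days / 365)  # decays to zero over a year
    relevance = len(query_terms & set(entry["topics"])) / max(len(query_terms), 1)
    return trust * (0.5 + 0.5 * freshness) * relevance

entries = [
    {"source": "council", "topics": ["lonely", "befriending"],
     "last_verified": date(2025, 3, 1)},
    {"source": "community_directory", "topics": ["lonely", "walking"],
     "last_verified": date(2023, 1, 1)},
]
ranked = sorted(entries, key=lambda e: score(e, {"lonely"}, date(2025, 3, 18)),
                reverse=True)
```

Here both entries match “lonely”, but the recently verified council entry outranks the stale directory listing – which is exactly the behaviour curation is meant to produce.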


On top of that we are experimenting with an “archivist” role: using AI to continuously scan for duplicates and conflicts (for example, the same coffee morning listed with different times in different places) and flag them for human colleagues to tidy up. Over time, this improves the underlying information that every professional and resident relies on.
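The archivist idea can be sketched very simply: group listings that look like the same thing, and flag the groups whose details disagree. The listings and fields below are made up for illustration:

```python
from collections import defaultdict

def normalise(name: str) -> str:
    """Fold case and whitespace so near-identical names group together."""
    return " ".join(name.lower().split())

def find_conflicts(listings: list) -> list:
    """Flag groups of listings with the same name but different times."""
    groups = defaultdict(list)
    for item in listings:
        groups[normalise(item["name"])].append(item)
    conflicts = []
    for name, items in groups.items():
        times = {item["time"] for item in items}
        if len(items) > 1 and len(times) > 1:
            conflicts.append({"name": name, "times": sorted(times)})
    return conflicts

listings = [
    {"name": "St Mary's Coffee Morning", "time": "Tue 10:00"},
    {"name": "St Mary's  coffee morning", "time": "Tue 10:30"},
]
print(find_conflicts(listings))
```

Crucially, the output is a flag for human colleagues, not an automatic correction – the archivist narrows the search, people decide which listing is right.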


The data flowing through the chatbot is also a powerful new feedback loop. By looking at which questions are being asked, from which areas and at what times, we can surface patterns of demand, highlight gaps in information, and spot issues that commissioning and insight teams may want to explore further.
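Even a simple count over an anonymised query log starts to surface those patterns. The topics and area names below are invented for illustration:

```python
from collections import Counter

# Illustrative, anonymised query log: (topic, area) pairs only.
query_log = [
    ("loneliness", "Central"), ("debt", "Central"),
    ("loneliness", "Orton"), ("loneliness", "Central"),
    ("carer support", "Werrington"),
]

def demand_by_area(log: list) -> Counter:
    """Count queries per (topic, area) to show where demand clusters."""
    return Counter(log)

top = demand_by_area(query_log).most_common(1)
print(top)  # the single most common topic/area pairing
```

Scaled up, the same aggregation – sliced by time of day or week – is the feedback loop that commissioning and insight teams can act on.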


A repeatable pattern, beyond social prescribing


Architecturally, what you see in the video is one instance of a simple pattern:

  • People using a chat interface in channels that suit them

  • An AI agent that can hold natural conversations and apply local rules

  • A knowledge base that is curated, weighted and maintained by local partners


Because the pattern is modular, it can be reused with relatively little extra development. The real work shifts from “building the tech” to understanding where your knowledge lives, how you want to prioritise it, and how you’ll iterate safely with residents and staff.


In Peterborough, we’re already extending this approach into children’s services, prevention and early help, housing and regulatory services. Sitting in front of those specialist agents is an “orchestrator” – a concierge that greets people at the digital front door, understands what’s on their mind and quietly routes them to the right place. That reflects the reality that most people’s situations don’t sit neatly in one service box.
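A toy version of that orchestrator shows why it matters that situations cross service boxes: one message can be routed to more than one specialist. The keyword routing here is a placeholder for real intent understanding, and the agents are stubs:

```python
# Hypothetical specialist agents behind the single digital front door.
def social_prescribing(msg): return "social prescribing: " + msg
def housing(msg): return "housing: " + msg
def early_help(msg): return "children's early help: " + msg

# Placeholder keyword routing; a real orchestrator would classify intent.
ROUTES = {
    "lonely": social_prescribing,
    "repair": housing,
    "rent": housing,
    "school": early_help,
}

def orchestrator(message: str) -> list:
    """Greet once, then quietly hand off to every specialist that matches."""
    matched = {fn for word, fn in ROUTES.items() if word in message.lower()}
    if not matched:
        matched = {social_prescribing}  # default to the general front door
    return [fn(message) for fn in matched]

replies = orchestrator("I'm behind on rent and feeling lonely")
```

That last message fans out to both the housing agent and the social prescribing agent – the concierge, not the resident, does the work of knowing which doors to knock on.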


A new colleague, not a replacement


At Datnexa we think of each agent as a new colleague: highly capable in its own way, but effectively “born yesterday”. It needs onboarding, supervision and feedback.

This is not about replacing social prescribers, link workers or community connectors. It’s about giving them and the people they support a shared, current, locally sovereign source of information – and freeing up more human time for the relational, complex and creative work that AI cannot do.


The video you’ve just watched is an early glimpse of that future. The next steps will be shaped, as they should be, by lived experience: what residents find useful, what practitioners trust, and where local leaders want to take the pattern next.


© 2025 by Datnexa Ltd 
