The Road to Perdition: What Copilot Adoption in UK Councils Reveals About Our Readiness for AI
By Cristian Bogdan
Introduction: A Journey That Started with Conversations
Over the past few months, I’ve spoken with dozens of people
across UK local councils: deployment leads, IT managers, frontline staff, and
even a few disillusioned early adopters. What began as a curiosity about how
Microsoft Copilot was being used in the public sector quickly turned into
something else: a deep dive into a deployment landscape that is, in many
places, alarmingly unprepared.
These conversations triggered a research effort that left me
stunned. I reviewed FOI disclosures, international
surveys, and case studies. I compared the glowing headlines with the whispered
frustrations. What emerged was a picture not of technological failure, but of
human, organisational, and strategic breakdown.
The truth is this: Copilot didn’t fail in UK councils. We
did.
Chapter 1: The Promise That Sparked the Rush
Microsoft 365 Copilot, powered by GPT-4, was marketed as a
game-changer. Integrated into familiar tools like Word, Outlook, and Teams, it
promised to automate mundane tasks, summarize meetings, draft emails, and even analyse
data. For councils under pressure to “do more with less,” it sounded like
salvation.
And in some places, it was.
- Aberdeen City Council reported a projected £3 million in annual savings and a 241% ROI after deploying Copilot to over 700 staff.
- Somerset Council saw 300 staff save up to 55 minutes per meeting using AI-generated minutes; 88% of neurodivergent staff reported productivity benefits.
- Barnsley Council achieved 70% regular usage among licensed staff, supported by a "Flight Crew" of 150 champions.
- Buckinghamshire Council used a "Dragons' Den" model to identify internal use cases, reporting 60–90 minutes saved daily per user.
These are not minor wins. They show what’s possible when
Copilot is deployed with care, training, and governance.
But they are the exception.
Chapter 2: The Reality Most Councils Faced
For every success story, there are multiple councils where
Copilot adoption has faltered or, worse, quietly failed.
No Training, No Trust
In over 40% of councils, staff were given access to Copilot
or generative AI tools with no training or guidance. In some cases, the only
support was a link to Microsoft’s generic documentation. Fewer than 15% had a
structured onboarding path or prompt literacy materials.
The result? Confusion, mistrust, and abandonment.
Staff didn’t know what Copilot could do. Many didn’t trust
it. Others didn't see the point. And telemetry data, often used to justify adoption, was misleading. A handful of "power users" skewed the numbers, masking the fact that most staff weren't engaging at all.
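To see how that skew works, consider a minimal sketch in Python. The usage numbers are entirely hypothetical (they come from no real council): a few heavy users pull the average up to a respectable-looking figure, while the median and the count of genuinely active staff tell a very different story.

```python
# Minimal sketch with HYPOTHETICAL data: how a handful of power users can make
# average Copilot usage look healthy while most licensed staff barely engage.
from statistics import mean, median

# Hypothetical weekly prompt counts for 20 licensed staff:
# three "power users" and seventeen near-dormant licences.
weekly_prompts = [120, 95, 80] + [2, 1, 0, 1, 0, 0, 3, 0, 1, 0, 0, 2, 0, 1, 0, 0, 0]

print(f"Mean prompts/user:   {mean(weekly_prompts):.1f}")    # ~15: looks respectable
print(f"Median prompts/user: {median(weekly_prompts):.1f}")  # 0.5: closer to the truth

# Count staff clearing an (arbitrary) threshold of meaningful use.
active = sum(1 for p in weekly_prompts if p >= 5)
print(f"Staff using Copilot meaningfully: {active}/{len(weekly_prompts)}")
```

A dashboard reporting only the mean would call this deployment a success. The median exposes it.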
The Human Factor: Ignored at Our Peril
AI isn’t just a technical shift-it’s a cultural one. And in
many councils, the human side of the equation was ignored.
- Fear of job loss was rampant, especially among administrative and support staff.
- Concerns about surveillance and AI monitoring created anxiety.
- Professional identity was threatened. Social workers, planners, and legal staff worried that Copilot would deskill them or undermine their judgment.
In one council, confidential social care notes were pasted
into Copilot prompts without awareness of the privacy implications. This wasn’t
malice; it was ignorance. And it's a direct result of failing to educate staff
on responsible AI use.
Chapter 3: The Cost of Getting It Wrong
Licensing Copilot at £25 per user per month may seem
manageable, until you scale.
- A 1,000-user deployment costs £300,000 per year.
- One mid-sized council estimated £80,000 per year in unused licenses.
- Only a handful of councils are tracking ROI. Most have no idea whether Copilot is delivering value.
This isn’t just a missed opportunity. It’s a financial
liability.
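The arithmetic is simple enough to run against your own numbers. Here is a minimal sketch: the £25 per-seat price and the £300,000 headline come from the figures above, but the adoption rate is a placeholder you should replace with your council's actual telemetry.

```python
# Back-of-envelope Copilot licence cost, using the article's £25/user/month figure.
PRICE_PER_USER_PER_MONTH = 25  # GBP, as cited above

def annual_licence_cost(users: int) -> int:
    """Annual licence spend in GBP for a given number of seats."""
    return users * PRICE_PER_USER_PER_MONTH * 12

users = 1_000
active_share = 0.60  # PLACEHOLDER: fraction of licensed staff actually using the tool

total = annual_licence_cost(users)          # £300,000/year, matching the figure above
wasted = round(total * (1 - active_share))  # spend on seats that sit idle

print(f"Annual cost for {users:,} users: £{total:,}")
print(f"Estimated spend on unused seats at {active_share:.0%} adoption: £{wasted:,}")
```

Even at a generous 60% adoption, the idle-seat spend runs into six figures for a deployment of this size. Few councils appear to be doing even this level of accounting.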
And the reputational risk is growing. Councils that rushed
into deployment without governance, training, or risk assessments are now
facing scrutiny from unions, from staff, and from the public.
Chapter 4: Why Some Councils Succeeded
The councils that got it right didn’t just “roll out”
Copilot. They built ecosystems around it.
1. Peer-Led Training
Barnsley’s “Flight Crew” and Somerset’s “Pioneer Community”
created safe spaces for learning. Staff could ask questions, share prompts, and
build confidence together.
2. Ethics-by-Design Governance
Successful councils conducted early DPIAs, formed AI
governance boards, and reviewed usage regularly. This built trust and prevented
backlash.
3. Task-Specific Pilots
Rather than launching Copilot across the board, these
councils started with specific use cases: meeting minutes, case notes, email summaries. This delivered quick wins and measurable outcomes.
4. Cross-Council Collaboration
Groups like LOTI and the LGA helped councils share lessons
and avoid repeating mistakes. The councils that listened benefited. The ones
that didn’t are now paying the price.
Chapter 5: The Psychology of Resistance
If you want to understand why Copilot adoption has stalled
in so many councils, you need to look beyond the dashboards and usage metrics.
You need to understand the psychology of the workforce.
Fear of Job Loss
Staff, especially those in administrative, support, and documentation-heavy roles, feared that Copilot was not just a tool but a
replacement. The narrative wasn’t “Copilot will help me,” but “Copilot will
replace me.”
This fear wasn’t unfounded. In some councils, AI was
introduced without any reassurance about job security. There were no clear
statements from leadership about augmentation versus automation. And in the
absence of clarity, fear filled the vacuum.
Distrust in AI Outputs
Copilot, like any generative AI, can hallucinate. It can
produce plausible-sounding but incorrect information. Buckinghamshire Council
experienced this firsthand when Copilot hallucinated details about social care
clients.
Staff quickly learned that Copilot’s outputs needed to be
verified. But without training on how to prompt effectively or review
responsibly, many simply stopped using it. Trust, once lost, is hard to regain.
Professional Identity and Autonomy
For professionals such as social workers, planners, and legal officers, the
issue wasn’t just accuracy. It was identity. They didn’t want to be reduced to
AI editors. They didn’t want their judgment replaced by machine suggestions.
In Ealing Council’s pilot, staff expressed concern about
losing autonomy and professional integrity. They wanted tools that respected
their expertise, not ones that undermined it.
Chapter 6: Training That Works (And What Doesn’t)
Training is the single most important factor in successful
Copilot adoption. And it’s the area where most councils failed.
The Training Gap
- 39% of councils allowed staff to use AI tools with no training or policy.
- Only 3 councils had a dedicated budget for AI training.
- Many relied on passive materials: videos, links, or generic Microsoft documentation.
This approach doesn’t work. Copilot isn’t a static tool.
It’s dynamic, contextual, and requires experimentation. Traditional training
models, built on rote instruction and one-off sessions, fail to equip users with the skills
they need.
What Good Training Looks Like
Barnsley Council’s “Flight Crew” is a masterclass in
peer-led learning. Over 150 champions supported staff, answered questions, and
shared tips. Workshops were interactive, contextual, and ongoing.
Buckinghamshire ran department-specific demos, one-on-one
coaching, and even internal “Dragons’ Den” events to surface use cases.
These councils didn't just teach Copilot; they built
communities around it.
The Prompt Literacy Problem
One of the most overlooked skills in Copilot adoption is
prompt engineering. Staff need to know how to ask the right questions,
structure their inputs, and iterate based on outputs.
Without this skill, Copilot feels random. With it, it
becomes powerful.
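What does prompt literacy look like in practice? A purely illustrative example, not drawn from any council's materials, shows the difference between a vague request and a structured one:

Vague: "Summarise this meeting."

Structured: "Summarise this 45-minute housing team meeting in five bullet points. List each decision, its owner, and its deadline, and flag anything in the transcript you are unsure about."

The second prompt gives Copilot a format, a scope, and an instruction to surface uncertainty: exactly the habits that training should build.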
Yet most councils didn’t teach it. And that’s a critical
failure.
Chapter 7: Governance, Ethics, and Data Risk
AI in the public sector isn't just a productivity issue; it's
a governance challenge. And many councils deployed Copilot without the
necessary safeguards.
Data Privacy and Security
Council staff handle sensitive data: social care records,
financial information, resident details. Feeding this into Copilot without
clear guidance is a recipe for disaster.
One council reported a data leak related to ungoverned AI
use. Others held back usage due to security concerns.
Without robust policies, DPIAs, and ethical frameworks,
Copilot becomes a liability.
Ethical Concerns and Bias
AI can reflect and amplify bias. It can produce
discriminatory outputs. And in public services, that’s not just a technical
issue-it’s a moral one.
Yet many councils lacked ethical oversight. They didn’t have
AI boards, review processes, or escalation paths. They deployed first and
governed later, if at all.
The Governance Leaders
Buckinghamshire and Barnsley stand out. They conducted early
DPIAs, formed governance boards, and built risk registers. They treated AI as a
strategic asset, not a shiny toy.
Their success wasn’t just technical-it was ethical.
Chapter 8: Recommendations for 2025 and Beyond
If councils want to succeed with Copilot, and with AI more broadly, they need to change how they think, plan, and lead. Here's what the
evidence suggests:
- Treat AI as a Change Programme, Not a Software Rollout
- Invest in Human-Centred Training
- Build Trust Through Communication
- Establish Robust Governance
- Start Small, Scale Smart
- Celebrate Wins and Learn from Failures
Chapter 9: Copilot Is Not a Plug-In, It's a Cultural Shift
The story of Copilot in UK councils is not a story about
software. It’s a story about people, leadership, and the readiness of public
institutions to embrace change.
Too many councils treated Copilot like a plug-in: something
you install, announce, and walk away from. But Copilot is not a plug-in. It’s a
paradigm shift. It changes how people write, think, collaborate, and make
decisions. And that kind of change demands more than licenses and logins. It
demands leadership.
The Illusion of Readiness
The most dangerous assumption in the Copilot rollout was
that councils were ready. Ready in terms of infrastructure. Ready in terms of
skills. Ready in terms of mindset.
They weren’t.
- Many lacked even basic digital maturity.
- Staff were unprepared: technically, emotionally, and professionally.
- Leadership often underestimated the cultural impact of AI.
This disconnect created a perfect storm: high expectations,
low support, and widespread disillusionment.
The Cost of Misalignment
When technology outpaces culture, the result is friction.
And friction, at scale, becomes failure.
- Councils spent tens of thousands on licenses that went unused.
- Staff disengaged, resisted, or quietly ignored the tool.
- In some cases, unions intervened, demanding pauses or policy reviews.
The Opportunity Ahead
And yet, despite all this, I remain optimistic.
Because the councils that got it right (Barnsley, Buckinghamshire, Somerset, Aberdeen) show us what's possible. They didn't just
deploy Copilot. They reimagined how work gets done. They invested in people,
not just platforms. They built trust, not just tools.
Their success wasn’t accidental. It was intentional.
And it’s replicable.
A Call to Action
If you’re a council leader, IT director, or transformation
lead, here’s what I urge you to do:
- Pause and reflect.
- Talk to your staff.
- Invest in training.
- Build governance.
- Celebrate progress.
And most importantly: lead.
Because Copilot is not just a tool. It’s a test. A test of
whether we can adapt, evolve, and lead with empathy in the age of AI.
Conclusion: The Road Ahead
The road to Copilot adoption in UK councils has been uneven,
at times inspiring, at times sobering. But it’s not over. In fact, it’s just
beginning.
The councils that learn from early missteps, that center the
human experience, and that treat AI as a journey, not a destination, will thrive.
They will unlock new efficiencies, empower their staff, and deliver better
services to the public.
The others? They risk becoming cautionary tales.
The choice is ours.
Let’s make it wisely.