AI in Local Government: Getting Governance Right Before It Gets Complicated

Artificial intelligence is no longer a future issue for local government. Many councils are already using AI tools to summarise documents, analyse information and support decision-making. The efficiencies are real — but so are the expectations around how these tools are used.

AI can sit comfortably within existing public sector frameworks. The key is ensuring governance, transparency and accountability keep pace with adoption.

AI is already in use — and the rules still apply

AI is increasingly part of day-to-day activity across councils. Used appropriately, it can reduce administrative burden and help officers process complex information more efficiently.

However, local authorities operate within a well-established legal framework. Statutory duties, public law principles and information rights obligations continue to apply regardless of whether work is supported by AI. In practice, this means AI must be treated as part of the council’s governance structure — not as a separate or informal tool.

A practical issue: disclosure risk

One of the most immediate implications of using AI is that inputs and outputs may be disclosable.

Anything entered into, or generated by, an AI system may form part of the information a council holds. This can bring it within scope of Subject Access Requests, the Freedom of Information Act 2000 and the Environmental Information Regulations 2004.

In practical terms, prompts, draft outputs and AI-assisted analysis may all need to be disclosed if they inform a decision or relate to identifiable individuals. The safest approach is therefore to assume that AI-generated and AI-assisted material may be subject to scrutiny from the outset.

Transparency is fast becoming the norm

There is an increasing expectation that councils are open about their use of AI.

Emerging best practice includes clearly explaining where AI is used, identifying AI-assisted outputs and publishing accessible information about controls and oversight. This supports public trust and strengthens defensibility where decisions are challenged.

AI supports decisions — it does not make them

Decision-making must remain lawful, rational and fair. AI can assist, but it cannot replace professional judgement.

Officers remain responsible for validating outputs, identifying potential bias and ensuring alignment with statutory duties and local policy. Clear human oversight must be retained at all times.

An evolving litigation landscape

AI is also changing the environment beyond the council. Litigants in person are increasingly using AI tools to generate legal correspondence and arguments, while some providers rely heavily on AI-generated content.

For councils, this creates a practical challenge: documents may appear credible but contain inaccuracies or unsupported assertions. AI-generated material should therefore be treated like any other unverified source, with appropriate scrutiny applied.

What councils should be doing now

The focus should be on embedding proportionate governance rather than restricting use.

This includes establishing a clear internal policy, training staff, maintaining strong information governance controls, carrying out proportionate risk assessments, ensuring procurement processes address AI risks and being transparent externally about how AI is used.

If you have any queries regarding the use of AI, please do not hesitate to contact our expert local government solicitors.

Written by:

Patricia Grinyer

Patricia heads the Weightmans banking and finance team and advises on all aspects of financial services, specifically public sector finance.
