Interview: NAO Chief Analyst delves into Use of AI in Government Report

Ruth Kelly, Chief Analyst at the National Audit Office (NAO) and Director of its Use of Artificial Intelligence in Government report, speaks to Government Transformation Magazine about the report’s strategic recommendations, the question of matching action with ambition, and integrating a top-down AI strategy with departmental-level needs.


The report’s findings may be a bitter pill to swallow, but they provide constructive guidance based on a rigorous study of 89 government bodies.


Public trust in AI remains critical and needs to be matched with safe, responsible adoption, but the standards and assurance arrangements needed to do that are still under development. Kelly noted: “Some of the bodies we interviewed talked about not knowing exactly the guidance that’s out there and were unsure where to get a definitive view of what they needed to consider.”

One body taking the lead in establishing clear guidance for responsible AI use is the Central Digital and Data Office (CDDO). Its Generative AI Framework aims to improve the assurance of use cases with a high-risk AI component, but Kelly emphasised that “there is still a long way to go”. The governance arrangements for the draft strategy for AI adoption in the public sector are currently separate from DSIT’s:

“We found that the governance arrangements for AI adoption in the public sector are separate from the governance arrangements for DSIT to oversee the wider AI policy delivery. There’s also no representation in the wider public sector, e.g. police and schools. This misses out on some of the benefits of a coordinated approach,” she explained.

A joint review by the Cabinet Office and DSIT of the planned governance arrangements is needed to work out “how they’re functioning, whether changes need to be made and clarity on responsibilities to manage the potential risk of duplication”.

Barriers to AI adoption

The report identified three barriers to AI adoption: a lack of skills, a lack of knowledge sharing, and an abundance of legacy systems.

Skills and understanding are needed at all levels, but recruiting and retaining staff was listed as one of the greatest limitations to adopting AI. Kelly divides the key people into three main categories: “First, there’s people with core AI skills. That’s important to be able to understand opportunities and to design the applications and the systems. Secondly, there’s more traditional digital skills, which are required to then actually embed those into the department systems. And thirdly, it’s really important that there’s capacity within an operational team to be able to integrate those new tools.”

Knowledge sharing is a key barrier, especially for arm’s-length bodies, with ‘74% of government body respondents needing support around sharing knowledge and insight’, according to the report. Drawing on this, Kelly recommends bringing that knowledge together and sharing insights systematically. She notes that whilst there may be stronger informal sharing networks in different areas of government, ‘there aren’t formal structures and different parts of the system are quite disjointed’.

Legacy systems are a huge challenge for the government. Rather than building on top of legacy systems that in some cases date back several decades, modernising and investing in addressing the backlog needs to happen in an ‘architected’ way and in multiple, long-term stages.

Matching Action with Ambition

The government’s draft strategy is ambitious, but the implementation isn’t quite there yet. In Kelly’s words, the Government needs “a clear plan with clear accountabilities and performance metrics”.

The NAO report has highlighted the need to look at AI implementation at an integrated level. To make sure the adoption strategy is feasible, the Government needs to ensure that departmental-level adoption plans add up to the overarching ambition for public sector productivity gains. With the government asking departments to have AI adoption plans in place by the middle of this year, clear performance metrics and monitoring arrangements are crucial to building a foundation of accountability.

Kelly points out that while AI implementation has to be driven both from the bottom up and the top down, ultimately it has to be driven by business needs, such as addressing the most costly processes or those that require the most manual intervention. AI won’t always be the solution, so understanding the business need, not the technology, should be the starting point.

“Departments know what their pain points are and what’s amenable to automation and AI. But you also need the top-down view to be able to share knowledge and insights. It’s also about bringing everything together to clarify responsibilities and making sure all the enablers are in place - and then, critically, monitoring and holding people to account. 

“That needs strong central leadership and a really clear plan of what that looks like, even if the individual elements have been strongly informed by the bottom-up view of what departments think would be helpful for them,” she says.
