From data to decisions: transforming public services with real-time analytics
Timely, data-driven decisions are better decisions - and the public sector is increasingly harnessing real-time analytics to transform service delivery and improve outcomes.
In a recent webinar hosted by David Wilde, General Manager of Government Transformation Magazine, leading experts discussed the power of real-time analytics, how to overcome the hurdles of legacy systems, and how cross-departmental collaboration can unlock the full potential of data.
The panel included Ruth Studley, Director at the Office for National Statistics; Aravindhan Rajasekar, Principal Architect at the Department for Environment, Food & Rural Affairs; Tim Williams, Account Director at Actian; and Steve Parry, Advisor and Author of Crossing the Data Delta.
How real is ‘real-time’?
Real-time data was one of the core concepts discussed, and Ruth Studley highlighted its nuanced nature. "In the context of census data, real-time doesn’t mean instantaneous, but it does mean much more timely than the traditional 10-year snapshot," she explained.
Studley outlined how the ONS is now providing more frequent updates using administrative data, improving responsiveness to demographic and public health changes. This evolution will offer annual outputs rather than a once-in-a-decade snapshot.
Studley drew on the pandemic as a key example of why real-time data is essential. "During COVID-19, the reliance on 2011 census data made it difficult to assess vaccine uptake rates in real time. In some areas, we were vaccinating more people than the census indicated lived there," she noted. This discrepancy underscored the urgent need for more timely data in planning public health responses.
Tim Williams echoed Studley’s point, sharing how Actian worked with the University of Oxford during the pandemic. "They were processing huge volumes of epidemiological data that took days to run. We helped reduce that time to minutes, which accelerated the development of the AstraZeneca COVID-19 vaccine," Williams said.
He also stressed that "speeding up these processes is crucial in both public health and broader public sector services." Aravindhan Rajasekar pointed to environmental monitoring as another area seeing significant benefits. "Real-time data allows us to monitor environmental conditions like air and water quality and make immediate decisions that affect public health and safety," he said.
Rajasekar noted that DEFRA’s work in this area enables local authorities to act swiftly based on real-time data, preventing delays that could exacerbate environmental risks.
Rajasekar also introduced the concept of telemetry as a service, which involves collecting remote data through sensors to help authorities manage public services. "By capturing logs across platforms, infrastructure, security, and business processes, we can make immediate, informed decisions. This not only helps public services run more efficiently but also ensures we meet citizens’ needs in real time," he explained.
Steve Parry built on this, emphasising the importance of "data crumbs" - small pieces of data that, when integrated, can have a significant impact. He cited a case in Brent where the collection of small, seemingly unrelated data points revealed that over 40 people were living in unsafe conditions in a single property. "These ‘data crumbs’ triggered a broader intervention involving multiple agencies, including social care and education," Parry said. He highlighted how real-time insights, even from small data sets, can drive significant action across departments.
Overcoming legacy systems
While real-time analytics holds tremendous promise, the public sector continues to grapple with legacy systems and technical debt. Williams acknowledged this challenge: "Many public sector organisations are working with outdated systems that make it difficult to implement real-time analytics. However, we don’t believe in ‘boiling the ocean.’ Start small and scale from there. You have to tie any technological change to immediate strategic value."
Parry agreed, stressing the importance of data trust and governance: "Even if you get data quickly, if you can’t trust its quality or provenance, it’s not going to help. Building a culture of data trust and governance is essential to overcoming the challenges posed by legacy systems."
Parry also advocated for leveraging existing infrastructure: "You don’t have to replace everything at once. Instead, work with what you have, and layer real-time analytics on top of trusted, existing data sources."
In Rajasekar’s view, addressing technical debt is about striking a balance between short-term and long-term solutions: "You can’t eliminate technical debt, but you can reduce it by focusing on sustainable, scalable solutions," he said.
Rajasekar advocated for cross-departmental collaboration, ensuring that different agencies leverage each other’s innovations rather than duplicating efforts. "We need to stop working in silos and start sharing resources across departments. That’s how we can gradually reduce technical debt," he added.
Governance and transparency
A key theme throughout the webinar was the importance of data governance in building trust - both in the data itself and the systems that manage it.
Studley explained how the ONS’s Code of Practice ensures that all data is rigorously assessed for quality, trust, and value.
"Our official statistics undergo independent review, which helps ensure that the data we provide is not only timely but trusted," she said. This emphasis on transparency is critical when public sector organisations use real-time data to inform key decisions.
Parry echoed Studley’s point, noting that governance isn’t just about technology - it’s about understanding how data flows between systems. "Data needs to be well-governed, and organisations must know the origins, transformations, and governance rules applied to it. Without this, even the fastest data won’t lead to meaningful insights," he explained.
What role for Generative AI?
As the discussion drew to a close, the panel touched on the future role of Generative AI (Gen AI) in real-time analytics. Actian’s Williams was cautious, noting that "Gen AI holds promise, but you need to have strong data foundations in place before you can truly leverage its potential. If your data isn’t clean or reliable, AI will only amplify those problems."
Rajasekar agreed, adding that organisations must also be ready for the cultural shifts that AI will bring. "Before we fully embrace AI, we need to ensure it’s sustainable and doesn’t compromise security. There’s also a major cultural shift that comes with AI - are we ready to adopt it without losing the human element in decision-making?" he asked.
Parry emphasised that while AI will play a significant role in the future, it must be introduced incrementally. "Co-pilots, or AI-driven tools, can help automate data governance, but we need to ensure these tools are used responsibly," he said.
He highlighted the importance of maintaining a human-in-the-loop approach to ensure AI applications remain ethical and effective.
The consensus from the webinar was clear: real-time analytics offers significant opportunities for public sector transformation, but success hinges on trust, governance, and strategic implementation.
As Wilde summarised, "Real-time analytics can make a profound impact, but only if we are clear about what we’re trying to achieve and how we’re using the data. It’s not just about speed - it’s about making sure the data is right and that we’re working together across government."
Interested in accelerating digital transformation in the UK public sector? Read the full Actian report here.