BJSS reviews key themes from Government Data Summit
Last October, BJSS sponsored the Government Data Summit, where we co-hosted a roundtable on predictive analytics with a colleague from the Cabinet Office.
Over the course of the day, which was split between formal presentations, panels and roundtable discussions, we listened to and took part in a range of inspiring conversations.
If we had to summarise the event, two key themes emerged:

- The complicated relationship between ethics, policy and public trust/opinion
- How easy it is to fall into analysis paralysis, especially when it comes to… analytics
Ethics, policy and public opinion
In a world where more data is being collected than ever, where data breaches and abuses of that data have made the front pages of national newspapers worldwide, and where technology evolves so fast that it is hard to predict how data might be used in the near future, the public rightly want to know where their data is held and what purposes it serves.
New policies have been put in place in the past few years (GDPR probably being the most famous of all, given it was fitted with real teeth), yet there is still no blanket policy covering data sharing across departments and public bodies.
A genuine fear of breaching rules without explicit policy coverage, and of ending up on the front page, can certainly stifle innovation and meaningful change in the public interest.
It often seems easier to find reasons not to do something than to do it. Add to that a lack of modern data tools and skills, as well as of available, current, accurate and trusted data, and you have the perfect recipe for a stress-induced, AI-powered migraine.
Ethics in data and ethical AI have also risen in importance, often in the wake of poorly planned uses of data and AI. The question is shifting from “can you do this with data?” to “should you?” And once you’ve ascertained that you should, it becomes about weighing the consequences of building new solutions and generating new insights, and, of course, the biases in your data.
In short, upholding high ethical standards will produce better solutions, help avoid front-page news, build trust with the public and win their support for new policy.
An example we shared at our roundtable was our work with NHS Digital, rolling out Oxford University’s QCOVID algorithm and running it across a large portion of the population of England to identify people who were Clinically Extremely Vulnerable.
We were working with highly sensitive data and needed to stay within the bounds of GDPR, which clearly states that you should only store and use data where you have a legitimate purpose. To respect these regulations, we created a rules-based system to pre-qualify which individuals would be run through the newly built Risk Stratification platform, ensuring that evidently safe people were not needlessly processed (a simplified sketch of this pre-qualification step follows). This resulted in an additional 1.7 million people being identified for the shielding list, providing necessary protection to those most at risk.
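To make the pre-qualification idea concrete, here is a minimal sketch of a rules-based filter that decides which records are worth sending to a risk model at all. The record fields, thresholds and rules below are invented for illustration only; they are not the actual QCOVID criteria or the NHS Digital implementation.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass
class PersonRecord:
    """Hypothetical, minimal record used only for this illustration."""
    person_id: str            # pseudonymised identifier
    age: int
    already_shielding: bool   # already on the shielding list
    has_risk_conditions: bool # any recorded condition relevant to the risk model


def pre_qualify(records: Iterable[PersonRecord]) -> Iterator[PersonRecord]:
    """Yield only the records that need full risk stratification.

    Illustrative rules: people already shielding need no re-assessment, and
    younger people with no recorded risk factors are treated as evidently
    safe, so their data is not processed any further.
    """
    for record in records:
        if record.already_shielding:
            continue  # already protected; no further processing needed
        if not record.has_risk_conditions and record.age < 70:
            continue  # evidently low risk under these example rules
        yield record  # only these records go on to the risk model


# Example: only the qualifying record is passed on to the risk model.
cohort = [
    PersonRecord("A1", 45, already_shielding=False, has_risk_conditions=False),
    PersonRecord("B2", 72, already_shielding=False, has_risk_conditions=True),
]
to_assess = list(pre_qualify(cohort))  # contains only "B2"
```

The design point is simply that filtering with cheap, transparent rules before any heavier processing keeps the amount of personal data handled to the minimum needed for the task.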
Overcoming analytics paralysis
As discussed above, it often seems easier to find reasons not to do something than to do it, and a lack of modern tools, skills and trusted data only compounds the problem.
Many leaders at the summit talked about the importance of a data strategy, but most admitted to being guilty of letting their strategy gather dust on a shelf. Their advice was that those strategies should be at the service of a robust plan of action, enabling their respective departments or agencies to deliver against strategic priorities.
A key phrase repeated more than once was that “perfect is the enemy of good enough”. BJSS has always been a strong proponent of delivering frequent, incremental value, resolving the needs of today while ensuring that solutions are future-proofed.
To break analytics paralysis, our key piece of advice is to just get started. Find a high-value use case, build just enough infrastructure, ingest just enough data, and measure your results. This supports “strategy by doing”: working towards a clear goal with well-defined benefits, and building only what is required, exposes organisational, skills and technology gaps against a real-world example. It also leaves time to plan how to close those gaps while delivering value and building support and enthusiasm.
In short: start small and get value quickly while planning for the future.