The National Audit Office (NAO) recently published a guide for senior government leaders on improving the use of data, which points to endemic difficulties in achieving benefits from data sharing.
Variable data quality is cited in the guide as a hindrance to effective data use: “Data collected by one part of government may not be of sufficient quality to be used by a different part of government for a different purpose. [The] Government’s Data Quality Framework offers a more structured approach to improving the quality of data held by departments.”
The list of barriers to better data use in government cited by the spending watchdog is considerable. Standards are hard to implement because, according to the NAO: “The structure of government is heavily siloed and departments have a high degree of autonomy. Legacy systems make it difficult to introduce standards into this environment and government has struggled to make substantial progress over the past 20 or so years.”
Data analytics is also depicted as inadequate to the scale of the problem: “Data analytics and tools work well with good-quality data, although effort is required to engineer the data when it comes from disparate sources. But there are situations where the accuracy and integrity of the data will make analytics difficult to apply, especially for personal data.”
The creation of cross-governmental datasets for multiple users is almost a non-starter, according to the watchdog: “Merging personal data which does not easily match is difficult. Further questions arise around ownership, maintenance, funding, privacy, and the risks arising from data aggregation.”
The guide cites two categories of organisation that can act as beacons for government leaders. One is the Silicon Valley tech giants; the other is the financial services industry, which was forced on to the path of good data governance after the financial crash of 2008, caused by the systemic bad practices of the sector itself.
It states: “Organisations that understand and have succeeded in overcoming the data challenge fall into one of two broad categories.
“Firstly, there are those which are designed and built for data exploitation from the outset and do not carry the ‘baggage’ of legacy systems and ways of working. Examples include Google, Amazon and Netflix. As a result they are naturally able to exploit their data assets and can readily take advantage of business intelligence, advanced analytics and artificial intelligence.
“Secondly, there are organisations with legacy systems which have been forced to address the data challenge in response to external events. For example, following the financial collapse of 2008, the financial services sector was subject to additional regulatory obligations.”
The report outlines a way forward that consists of four elements: embedding data standards, taking a structured approach, addressing legacy issues and enabling data sharing.
“The Committee of Public Accounts has urged [the] Cabinet Office to identify and prioritise the top 10 data standards of benefit to government,” it notes.
The NAO welcomes the setting up of a CDO Council in 2021, the creation of the Data Standards Authority in 2020, and the creation of a Data Architecture Design Authority, described as “a new body to review, approve and monitor adoption of data architecture principles and frameworks”.
In relation to resolving the legacy issue, the guide backs up the Committee of Public Accounts’ recommendation that the Cabinet Office and the Department for Digital, Culture, Media and Sport should identify the main ageing IT systems that, if fixed, would allow government to use data better; and ensure that whenever departments replace or modify these systems it is done with full consideration of how the systems will support better use of data in government.
The guide’s recommendation on data sharing leans on the Open Data Institute’s Assessing risks when sharing data: a guide. It draws attention to the NAO’s own 2018 report on the Windrush scandal, “where the department concerned [the Home Office] shared data without fully assessing its quality with the potential for citizens being wrongly detained, removed or denied access to public services”, as an example of how damage can be caused by the imprudent sharing of government data.
The report concludes by reiterating that government data is a leading cause of inefficiencies, that underlying data issues need to be fixed, that “focused effort, funding and prioritisation” is essential for data management in government, and that there is a perennial danger of initiatives petering out in the face of adversity.
These recommendations seem broadly in line with those made by Michael Gove, until recently secretary of state at the Department for Levelling Up, Housing and Communities.
The Scot’s enthusiasm for data is well known, and featured in his notable Ditchley Park speech, given in July 2020, in which he advocated harnessing data analytics as part of an agenda for modernising the state.
In it, Gove said: “Government needs to evaluate data more rigorously, and that means opening up data so others can judge the effectiveness of programmes as well. We need proper challenge from qualified outsiders. If government ensures its departments and agencies share and publish data far more, then data analytics specialists can help us more rigorously evaluate policy successes and delivery failures.”
The department he most recently led was behind the Levelling Up and Regeneration Bill, announced in the Queen’s Speech in May, which includes proposals for digital planning powers to be given to local authorities in England and Wales, based on open data.
Boris Johnson remains prime minister, despite having resigned as leader of the ruling Conservative Party on 7 July, one day after dismissing data evangelist Gove.
That is the ineluctable political context of the NAO’s Improving government data: a guide for senior leaders.