What can we learn from the UK Government's digital agenda and earlier computerisation efforts, stretching back to the post-war years? Dr Antonio Weiss, a new Digital State-affiliated researcher at the Bennett Institute, argues for a more nuanced appreciation of both the successes and failures of Britain's historic approach to digitalisation and information technology.
This summer the UK House of Commons Science and Technology Committee lamented ‘the slowing of the Government’s digital momentum, as evidenced by other countries overtaking the UK in international rankings’. The Committee’s ‘Digital Government’ report made for wistful reading, longing for the days of ‘political leadership in digitisation’ and decrying ‘the departure of senior Civil Service figures in the Government Digital Service [GDS].’ But the report took a largely short-termist historical lens. On reading it, one could easily conclude that ‘Digital Government’ was born only with the advent of GDS in 2011 and that all that preceded it was of little import. This would be wrong. The British state, particularly in the early postwar years, was an international leader in mechanisation, computerisation and much of what is now deemed the ‘digital agenda’. Yet it lost its way. The lessons of this history must be remembered if its mistakes are not to be repeated.
The British state was at the vanguard of computerisation in the 1960s and 1970s. Political support was clear: the Labour Prime Minister Harold Wilson placed high expectations on science and technology to deliver his vision for a modern Britain. His Chancellor, James Callaghan, wrote a memo in 1965 on the need for a ‘quick review of the favoured automatic data processing programs for government departments…[with a view to] accelerate and expand existing programmes.’ As the historian of science Jon Agar has written, in this period ‘official visitors from overseas and from industry, commerce and local government came to learn’ from the British state.
Digitisation in this era was multidimensional. Programmes such as the mechanisation and automation of the ‘Pay As You Earn’ (PAYE) income tax service at the Inland Revenue were bold and ambitious. World-leading mainframe computers, such as the IBM System/360 at the Ministry of Defence, were procured. Capabilities were developed in-house: in 1972 there were 4,640 trained information technology staff in central government, a sizeable number relative to industry. Advanced methodologies were developed: PRINCE2, a globally adopted approach to project management, emerged from the Central Computer and Telecommunications Agency (one of the many central government digital units that foreshadowed GDS). And significant investments were made: in 1976 the National Enterprise Board – a state investment body – bought a 43 per cent stake in Sinclair Radionics, a hitherto leading electronics company.
But subsequent decades saw Britain retreat from its leading role. A pernicious myth of ‘decline’ took hold. Seeking to explain why the rest of the world’s economies appeared to be growing faster than Britain’s, a potent cocktail of journalists, politicians and companies seeking to profit from the public sector perpetuated the view that there was something ‘wrong’ with Britain. Even historians joined in, with many blaming a supposedly amateurish, generalist Civil Service for Britain’s demise. A new ideology of state governance swept in from America in the 1990s: ‘New Public Management’. In this view, the state needed to become more like the private sector to succeed.
As critiques of Britain’s approach to information technology mounted – and, to be sure, in the 1990s in particular many large-scale IT programmes failed to deliver on their initial lofty ambitions – successes were ignored. The much-maligned Operational Strategy – a programme to automate and digitise social security payments, delivered jointly by civil servants and management consultants – was probably the biggest computerisation programme in Europe in the 1980s and 1990s. A 1995 Social Security Committee report concluded that it had delivered over £3bn in efficiency savings, while waiting times for benefits payments decreased and customer satisfaction increased. Similarly, though the National Health Service’s technology infrastructure is rarely commended, England is highly unusual in that, since 2004, all electronic health records (EHRs) in the GP sector have been digitised. One could argue this took a long time – the initial scheme launched in 1989. But the rightly heralded success of Britain’s GOV.UK site has its origins in the vision of a single portal for state services, conceptualised as far back as the 1996 ‘Government Direct’ report. Such transformations take time.
As Britain seeks to regain its role as a world leader in the digital agenda, a more nuanced reading of its historic successes and failures is required. Without this, simplistic, knee-jerk reactions are likely to result. The answer is never a straightforward choice between public and private, centralised and decentralised, agile and waterfall, good and bad leadership. For Britain to lose its way once was unfortunate. To do so again would be careless.
The Digital State project, led by Dr Tanya Filer, sets out both to lead policy research and to provide a forum for broad-ranging discussion with academics and policymakers on the opportunities and challenges that digital technologies pose to policymaking, governance and democracy.
The views and opinions expressed in this post are those of the author(s) and not necessarily those of the Bennett Institute for Public Policy.