The Centre for Data Ethics and Innovation (“CDEI”) has published its report on the UK’s implementation of data-driven technology and Artificial Intelligence (“AI”) to enable responses to the coronavirus (COVID-19) pandemic (the “Report”). The use of applications incorporating AI is growing, particularly in the medical and public health sectors. However, AI did not play as significant a role in relief efforts beyond vaccine discovery as many technology and science commentators expected. This may in part be explained by the results of the CDEI’s longitudinal survey cited in the Report (discussed below), which concludes that public support for increased use of digital technology depends on the public’s trust that sufficient governance is in place for the use and operation of data-driven technologies.
WHAT IS THE CDEI?
The CDEI is an independent committee of cross-industry experts, set up in 2018 by the Government to advise on how the UK can benefit from the use of data-driven technology and to recommend how the Government can create the regulatory, technological and ethical environment best suited to its application. The CDEI is also part of a working group of 32 regulators and organisations set up to consider AI-related issues (we discuss this in further detail in our article: Planning for the AI landscape of the future: the latest from the UK Government).
From March 2020, the CDEI began to keep a record of examples where data-driven technologies were being used to either suppress the virus or deal with its impact. These uses were documented within the CDEI’s database for novel use-cases of data and AI (118 in total): the ‘COVID-19 Repository’ (the “Repository”). Each entry into the Repository was assigned a primary purpose: (1) managing the immediate public health crisis; (2) supporting the public health response and mitigating the effects of lockdown; and (3) building future resilience and aiding the recovery. The CDEI examined these use-cases and their findings revealed some notable trends and patterns which we discuss below.
Often the key issue when discussing the prevalence, effectiveness and future outlook of data-driven technologies is the question of public trust. The CDEI sought to gather information on public attitudes to the adoption of such technology and its rapid deployment during the pandemic by polling more than 12,000 individuals from June to December 2020.
A key finding from the survey is that, “to realise a sustainable increase in the use of new technologies, the analysis suggests the critical importance of building and maintaining trustworthy governance.” The data showed this was the most significant predictor of public support for the adoption of technologies. Whilst 43 per cent of the individuals surveyed trusted that the right rules are already in place, 24 per cent disagreed. This general result appears to be largely consistent across age, region and gender. It is worth comparing the attitudes revealed in the Report, which are mostly specific to the use of digital technology in dealing with the pandemic, with data gathered by the Department for Business, Energy and Industrial Strategy (“BEIS”) in 2020. In the BEIS survey (which we discuss in the article: “AI in the UK: ‘No room for complacency’ and no room for a separate AI regulation”), 44 per cent of people said that they were neither positive nor negative about AI, with a further 8 per cent saying they did not know. Only 28 per cent of people said they were positive about AI, while 20 per cent felt negative about it.
The Report discusses ten key themes drawn from the CDEI’s analysis of the Repository:
- Conventional data analysis has been at the heart of the COVID-19 response, not AI.
- Existing datasets provided the basis for much of the pandemic response.
- New methods of data storage were implemented to enable data sharing.
- In the face of a public health crisis, community data sharing increased.
- Local governments increasingly realised the importance and value of data.
- Many existing tools have been repurposed to solve COVID-19 related problems.
- Where AI is prevalent, it is often being used in a healthcare setting.
- Data-driven tools are also being used to measure and understand the effects of new rules.
- The focus is beginning to shift towards building future resilience.
- Data sharing across borders facilitated the discovery of new vaccines and treatments.
In this article we summarise five of the themes.
“Theme 1” – Data has been at the heart of the response
Data collection and analysis have been key in enabling the Government to tackle the pandemic by tracing infection and death rates. Data has also been used to gauge where equipment is needed across hospitals and to analyse population movement. The ONS and its Data Science Campus looked at the impact of pandemic restrictions and were able to provide the Government with timely indicators on, “the impact of social distancing, the number of people in self-isolation, changes to trade in goods and the effect on businesses”. To assist national organisations in coordinating the response, the Government commissioned the NHS to develop a data platform to allow secure, timely and reliable data to be shared.
By contrast, the CDEI found that the use of AI/ML was not widespread. It attributes this partly to the lack of training data for algorithms, given that COVID-19 was still such a new phenomenon.
“Theme 3” – New initiatives for data storage and analysis
Challenges to data collection and storage were brought to the fore by the sudden onset of the pandemic. Organisations and public bodies urgently needed access to data to help keep the public safe and services running, and so new data sharing efforts were required. Many datasets were opened up to the private sector for the first time, such as supermarkets being given information on vulnerable patients most in need of shopping assistance. Large datasets were pooled to enable more sophisticated analysis and to identify signs of system stress in public services. However, these initiatives created administrative burdens, including developing legal agreements, oversight measures and data storage tools. One initiative was the National COVID-19 Chest Imaging Database (NCCID), a centralised database pooling X-ray, CT and MRI images from across the country to better understand the virus.
“Theme 6” – AI put to use in healthcare settings
As mentioned above, data collection and analysis throughout the pandemic have been extensive. However, AI/ML also played a role in the healthcare setting, such as via chatbots providing policy and guidance updates to hospital staff. Additionally, an AI-driven testing tool was developed by Oxford University Hospitals to screen patients for COVID-19; following trials and study, it was approved for clinical pilot. It allows patients to be tested within the first hour of their arrival at hospital and is significantly quicker than the common swab test. The AI model used within the tool (CURIAL) was trained using blood samples and observations, and it operates by attempting to find a biological and physiological signature of the virus in potential COVID-19 patients.
“Theme 7” – Tools repurposed for COVID-19
The CDEI points to data-based innovation as a factor in the country’s response to the crisis which it believes would otherwise have been much slower. Existing AI applications were, “pivoted”, and this was seen across sectors as, “different disciplines were able to support each other with novel applications of their existing tools in a new field.”
In response to concerns around the volume of non-factual virus-related content, Facebook increased use of its automated content moderation systems to direct users to factual information. Also, University Hospitals of Morecambe Bay NHS Foundation Trust trialled a mixed reality headset, the HoloLens 2 designed by Microsoft, to minimise face-to-face contact between patients and hospital staff.
“Theme 8” – Measuring the effects of pandemic rules
Understanding the effectiveness of social distancing and other lockdown measures has been made easier by AI and data-driven technology. Examples of uses of such technology have included providing the public with data to help plan their journeys and avoid busy city centres at times when social distancing is not as easy. According to the Report, researchers were also able to use this technology to monitor how social distancing measures were changing human behaviour, drawing on a combination of traffic flow, air quality and energy consumption data. A number of companies have introduced wearables (such as Bump) for automating social distance control at work.
Whilst the Report could be seen as a disappointing outcome for the use of AI, the CDEI nevertheless considers that, as the focus begins to shift towards building future resilience, AI will play a greater role in public health.
Although the Report highlights the range of ways in which AI and data technologies were deployed to manage the public health crisis, monitor the effects of lockdown and help keep public services running, it may be some time before we can fully evaluate the impact of the initiatives and the consequences of such large-scale and rapid pooling of public data.
What has become apparent is that UK policymakers intend to take an active role in building up trust and effective governance via a new AI strategy. On 12 March, the Digital Secretary Oliver Dowden announced a new strategy to “unleash the transformational power of artificial intelligence”. The new AI strategy will focus on: (i) growth of the economy through widespread use of AI technologies; (ii) ethical, safe and trustworthy development of responsible AI; and (iii) resilience in the face of change through an emphasis on skills, talent and R&D. The Government is set to build on the proposals made by the AI Council, which we have discussed in the article: “AI Roadmap”: UK AI Council calls for a national strategy. However, it is not yet clear what specific role the CDEI will play in this strategy process. The Government has previously noted in its response to the report by the House of Lords (discussed here) that it is considering what the CDEI’s future functions should be.
Finally, the CDEI has developed a Trust Matrix to help organisations build their own trustworthy governance system for how they use data and AI technologies. For more information about this and the CDEI’s work, you can visit their blog available here.
The authors would like to thank Aadam Sattar, trainee solicitor, for his assistance in writing this article.