In a recent study posted to the medRxiv* server, researchers in the Netherlands and the United Kingdom formulated recommendations for future pandemics using lessons learned from the testing programs implemented in the United Kingdom (UK) during the ongoing coronavirus disease 2019 (COVID-19) pandemic.
Study: A multistage mixed-methods evaluation of the UKHSA testing response during the COVID-19 pandemic in England.
*Important notice: medRxiv publishes preliminary scientific reports that are not peer-reviewed and, therefore, should not be regarded as conclusive, used to guide clinical practice/health-related behavior, or treated as established information.
Background
In 2020, the UK Health Security Agency (UKHSA) established a large-scale testing program via the National Health Service (NHS) in the UK to rapidly identify individuals with COVID-19 and ease the burden on the already inundated UK health system. However, given the complexity and magnitude of the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing response in England, research on the success of such programs remains scarce. The UKHSA therefore commissioned an independent evaluation to assess the success of this approach.
The delivery of COVID-19 testing to each of the nine target populations was multi-modal, spanning in-person testing (e.g., mobile testing units), pharmacies, and self-test kits. UKHSA stakeholders continually reviewed the success of these approaches amid changing epidemiological prevalence, emerging scientific evidence, and the vaccination rollout.
About the study
In the current study, researchers adopted a stepwise mixed-methods approach to address key research questions relating to the rollout of testing to nine target populations via the NHS COVID-19 testing services in England between October 2020 and March 2022.
One question addressed the factors that affected the delivery and uptake of each service. Another examined the barriers and facilitators to accessing, using, and delivering each service. In addition, the team evaluated the cost to the government and the cost-effectiveness of each service.
The researchers developed a "Theory of Change" (ToC) approach to map the causal pathways for each of the nine service settings. Importantly, they regularly consulted UKHSA stakeholders to discuss the causal assumptions and hypotheses evolving throughout the study. The team used narrative reviews to synthesize quantitative and qualitative data, which they fed into the ToC. They combined these data with publicly available data to cover all people working in healthcare settings, universities, and schools, and, where feasible, merged them with publicly accessible datasets stratified by age, income estimates, urban or rural setting, and ethnicity.
Furthermore, the research teams worked concurrently across the different service settings and shared their findings at weekly meetings, where emerging findings could be discussed and explored further when necessary. The aim was to synthesize findings across all the testing services to inform program-level insights and the overarching ToC. Finally, the researchers sought to develop an easily accessible, ready-to-use, modifiable dashboard with pandemic preparedness testing packages structured for low, medium, and high levels of preparedness.
While the current study was primarily a retrospective evaluation, it also had prospective components, delivered through participatory simulation modeling and policy analysis, to inform testing strategy in preparation for future pandemics.
Discussion and conclusion
The study datasets covered SARS-CoV-2 seroprevalence surveys; COVID-19 testing and vaccination data; and COVID-19 cases, hospitalizations, and deaths. Data analysis provided a detailed overview of the implementation outcome indicators identified in the ToC for each service. It also helped the researchers understand the reach of each testing service and estimate its impact, which in turn fed into the cost-effectiveness evaluation.
The cost-effectiveness evaluation of testing strategies covered test unit costs and deployment costs; it also quantified the economic productivity gained by shortening quarantine periods and the savings to the taxpayer. Notably, perceptions of disease risk and socioeconomic factors drove adherence to testing policy.
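To make the kind of arithmetic behind such an evaluation concrete, the following is a minimal, hypothetical sketch of a cost-per-case-detected comparison between two testing modes; the function, cost figures, test volumes, and positivity rates are all assumed for illustration and are not taken from the study.

```python
# Hypothetical sketch of a cost-per-case-detected comparison between two
# testing approaches. All costs, volumes, and positivity rates are
# illustrative placeholders, not figures from the UKHSA evaluation.

def cost_per_case_detected(unit_cost, deployment_cost, tests_performed, positivity_rate):
    """Total program cost divided by the number of cases detected."""
    total_cost = unit_cost * tests_performed + deployment_cost
    cases_detected = tests_performed * positivity_rate
    return total_cost / cases_detected

# name: (unit cost per test in GBP, fixed deployment cost in GBP,
#        tests performed, positivity rate among those tested)
strategies = {
    "self-test kits": (5.0, 50_000, 200_000, 0.02),
    "mobile testing units": (20.0, 250_000, 50_000, 0.05),
}

for name, params in strategies.items():
    print(f"{name}: £{cost_per_case_detected(*params):,.0f} per case detected")
```

A simple per-case metric like this does not capture the productivity gains from shorter quarantine periods or the broader savings to the taxpayer that the study's economic analysis also considered.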
Future testing strategies should account for disease epidemiology, health system capacity, and public engagement, including the reasons for reduced testing uptake. They also require a situational assessment of the intensity of disease transmission and a cost-benefit analysis of each strategy. In short, recommendations for future pandemic preparedness should consider the interplay among the various strands of the current study's evaluation.
The current study is a provisional draft protocol representing research in progress. Nevertheless, it is the first pan-UK evaluation of the COVID-19 testing response, and it offers recommendations to help ensure that the research, development, and regulatory pipeline remains fit for anticipating the diagnostic requirements of diseases with epidemic potential. To conclude, the approach proposed in the current study could also aid the evaluation of responses to other pandemics and to other types of interventions.
Journal reference:
- Preliminary scientific report.
A multistage mixed-methods evaluation of the UKHSA testing response during the COVID-19 pandemic in England, Reshania Naidoo, Billie Andersen-Waine, Prabin Dahal, Sophie Dickinson, Ben Lambert, Melinda Mills, Catherine Molyneux, Emily Rowe, Sarah Pinto-Duschinsky, Kasia Stepniewska, Rima Shretta, Merryn Voysey, Marta Wanat, Gulsen Yenidogan, Lisa J White, EY-Oxford Health Analytics Consortium, medRxiv preprint 2022, DOI: https://doi.org/10.1101/2022.10.27.22281604, https://www.medrxiv.org/content/10.1101/2022.10.27.22281604v1