Dr Zisis Kozlakidis directs our thoughts towards infection prevention and control, including technology, in this, the first of a special two-part series
Infection prevention and control remains a major challenge for healthcare systems around the world, as it is fundamental to both patient safety and health system performance.
The ongoing COVID-19 pandemic is evidence of the potentially disruptive nature of microbial pathogens infecting populations at scale. From a population perspective, the successful transmission of infection requires the confluence of three elements: a microbial pathogen, a susceptible host and an environment that facilitates such transmission.
For example, in the case of COVID-19, large crowds and gatherings were often associated with the risk of potential super-spreader transmission events, and public health interventions sought to control the environmental conditions of such potential transmission (e.g., by imposing lockdowns, promoting hand washing, etc.).
Infection prevention and control requires a global approach
Another critical aspect of infection prevention and control is the need for an almost concurrent understanding of both the local and the national/international impacts of the spread of infections. In particular, for a number of viruses (Human Immunodeficiency Virus [HIV], Hepatitis C Virus [HCV], influenza viruses, Middle East Respiratory Syndrome [MERS] and many others) and bacterial pathogens (tuberculosis [TB], Campylobacter and others), transmission takes place at an international scale. As such, local surveillance and treatment activities need to be integrated at a regional/international level so that better-informed infection prevention and control programs can materialize. There are successful examples where such an understanding has been achieved.
Specifically, in 2011, the WHO reached an international agreement on a framework for pandemic influenza preparedness that would facilitate the sharing of influenza virus samples and data, allow vaccine access, and address aspects relevant to low-income and middle-income countries (LMICs). (1) Similarly, in 2015, during the Ebola virus outbreak in West Africa, the public availability of data during the public health emergency was emphasized. The same approach was replicated by leading scientific organizations and health agencies in 2015–16 during the Zika virus outbreak in the Americas, as well as during the current COVID-19 pandemic, encouraging widespread and rapid data availability.
However, it should be noted that while data and biological samples were shared during public health emergencies, such crisis experiences provide a strong argument that data sharing should not simply be limited to emergencies or a few high-priority threats, but instead should be an integral part of the global infection prevention and control response. (2)
Technological advances transforming clinical microbiology
At the same time as global efforts to control infectious diseases are gathering pace, clinical microbiology is witnessing a transformational change in the detection of pathogens at scale, driven by advances in the deployment of molecular, genome sequencing-based, and mass spectrometry-driven detection, identification, and characterization assays.
These technological advances mean that the detection and genomic analyses of infectious disease pathogens are more accurate and detailed than ever before, as well as taking place in a high-throughput manner. However, the current gaps in our knowledge relating to detection, characterization, effective treatment, and follow-up constrain national governments and international organizations in their efforts to detect evolving trends and emerging threats.
Therefore, good-quality clinical microbiology, implemented at a population level through laboratory networks, is necessary for effective antimicrobial resistance surveillance and control, for example. That said, low-resource settings still face severe infrastructural, technical, and human resource challenges in achieving such an implementation.
The major future challenge: anti-microbial resistance (AMR)
One of the main reasons why infection prevention and control is quickly becoming a requirement for healthcare is the emergence of antimicrobial resistance (AMR). Bacterial AMR – which occurs when changes in bacteria cause the drugs used to treat infections to become less effective – is one of the leading public health threats of the 21st century. (3)
One of the forecasts by the Review on Antimicrobial Resistance, commissioned by the UK Government, argued that AMR could kill up to 10 million people per year by 2050; though this estimate has been contested, it provides a clear indication of the magnitude of the existing problem. Yet information about the current magnitude of the burden of bacterial AMR, trends in different parts of the world, and the leading pathogen–drug combinations contributing to that burden remains largely localized. In other words, there exist very good studies that add significantly to the body of work on AMR, yet these still remain insufficient to understand the overall global burden of AMR and to identify and target the highest-priority pathogens in different locations (especially in resource-restricted settings).
Therefore, a significant amount of work remains to be done, both in terms of understanding the pathogens and their transmission and in terms of introducing effective control measures. Perhaps the COVID-19 pandemic has provided a silver lining in the latter aspect, as a number of innovative approaches have been introduced in an effort to aid infection prevention and control. One such example is the disinfection of clinical wards and surfaces by robots carrying UV lamps; another is the creation of even more accurate mathematical models that can predict the scale and speed of outbreaks.
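As a minimal sketch of the kind of outbreak model mentioned above, the classic SIR (Susceptible–Infectious–Recovered) compartmental model captures the scale and speed of an epidemic from just two parameters. The population size and parameter values below are hypothetical, chosen only for illustration; real models used in public health are considerably more elaborate.

```python
# Illustrative SIR compartmental model, integrated with simple Euler steps.
# All parameter values below are hypothetical, for demonstration only.

def simulate_sir(population, initial_infected, beta, gamma, days, dt=0.1):
    """Return (susceptible, infectious, recovered, peak_infectious) after `days`.

    beta  = transmission rate per day
    gamma = recovery rate per day (1/gamma = mean infectious period)
    """
    s = population - initial_infected  # susceptible
    i = float(initial_infected)        # infectious
    r = 0.0                            # recovered
    peak_infected = i
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / population * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak_infected = max(peak_infected, i)
    return s, i, r, peak_infected

if __name__ == "__main__":
    # Hypothetical outbreak: 100,000 people, R0 = beta/gamma = 2.5
    s, i, r, peak = simulate_sir(
        population=100_000, initial_infected=10,
        beta=0.5, gamma=0.2, days=180,
    )
    print(f"Final recovered: {r:,.0f}; peak infectious: {peak:,.0f}")
```

Even this toy version shows why early environmental interventions matter: lowering `beta` (e.g., through lockdowns or hand hygiene) directly reduces the basic reproduction number `R0 = beta/gamma`, and with it both the peak and the final size of the outbreak.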
“The ongoing COVID-19 pandemic is evidence of the potentially disruptive nature of microbial pathogens infecting populations at scale.”
Novel technologies can help advance clinical microbiology
In conclusion, infection prevention and control emerges as both a local and a global challenge, and is now prioritized as such by national healthcare authorities and the WHO. Novel technologies can aid the advancement of clinical microbiology in support of infection prevention and control; however, the implementation of such technologies varies across the world.
Importantly, clinical microbiology in resource-restricted settings should not be viewed as an “entry-level version” of its counterpart in high-resource settings. Instead, it should be given opportunities to develop context-dependent solutions that will contribute to the sharing of biological data within an established global framework.
The actions outlined above are part of the global discussion aimed at delivering infection prevention and control capabilities, such that cost-effective laboratories will continue to operate at the frontline of antimicrobial resistance containment and of microbial pathogen detection, identification and treatment.
References
1. World Health Organization (WHO). Sixty-fourth World Health Assembly. WHA64.5. Pandemic influenza preparedness: sharing of influenza viruses and access to vaccines and other benefits. Geneva, 2011.
2. Kozlakidis, Z, et al. “Global health and data-driven policies for emergency responses to infectious disease outbreaks.” The Lancet Global Health 8.11 (2020): e1361-e1363.
3. Murray, CJL, et al. “Global burden of bacterial antimicrobial resistance in 2019: a systematic analysis.” The Lancet 399.10325 (2022): 629-655.
Disclaimer
Where authors are identified as personnel of the International Agency for Research on Cancer/WHO, the authors alone are responsible for the views expressed in this article and they do not necessarily represent the decisions, policy or views of the International Agency for Research on Cancer/WHO.