In a recent study published in the journal Proceedings of the National Academy of Sciences, researchers assessed the intensity and frequency of extreme novel epidemics.
To test theories and models and to inform public health risk assessment, the team used epidemic intensity, defined in terms of the number of deaths relative to the global population and the duration of the epidemic, together with the rate of emergence of infectious disease outbreaks. These values were used to estimate the likelihood of extreme pandemics such as coronavirus disease 2019 (COVID-19). Although important for such estimates, a thorough worldwide historical record spanning a variety of diseases had not previously been compiled and examined.
About the study
In the present study, researchers calculated the yearly probability of experiencing extreme epidemics by assembling and reviewing a global dataset of historical epidemics between 1600 and the present.
The number of fatalities observed per unit of time is a key characteristic of an epidemic: it determines how well health care systems can respond and the severity of the socioeconomic adversity an epidemic causes. The team therefore defined and studied epidemic intensity, measured in deaths per thousand people per year, which accounts for the total number of epidemic-related deaths (s), the size of the global population at the start of the epidemic (S0(t)), and the epidemic's duration (d). Reconstructions of global population history were employed to calculate S0(t) and, from it, epidemic intensity.
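The definition above can be sketched in a few lines of Python. The functional form i = s / (S0(t) · d), the per-thousand scaling, and all names and numbers here are illustrative assumptions, not the paper's code:

```python
def epidemic_intensity(deaths, population_at_start, duration_years):
    """Epidemic intensity i = s / (S0 * d), expressed here in deaths
    per thousand people per year (assumed form, for illustration)."""
    return deaths / (population_at_start * duration_years) * 1000.0

# Hypothetical example: 1 million deaths over 2 years,
# with a starting population of 500 million.
i = epidemic_intensity(1_000_000, 500_000_000, 2.0)
print(i)  # 1.0 (deaths per thousand per year)
```

Normalizing by the population at the epidemic's start is what makes intensities comparable across four centuries of very different world populations.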
In total, 395 epidemic episodes observed between 1600 and 1945 were the subject of the present study. For 182 of these outbreaks, data on both the duration and the number of fatalities were available. The team used these 182 records to characterize the probability distribution of epidemic intensity, conditional on an epidemic having occurred. Additionally, the probability distribution of the number of epidemics occurring in a given year was considered. Because changing interactions between humans and their environment strongly affect the rate at which new epidemics originate, the study anticipated that this distribution would change over time.
All 395 known outbreaks between 1600 and 1945 were used to assess the fluctuating number of epidemics occurring each year. The team also considered the probability distribution of the maximum intensity among the epidemics occurring within a fixed time interval (w).
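One way to picture the quantity described above — the largest epidemic intensity within a window of w years — is a small simulation. The Poisson yearly counts, the GPD-distributed intensities, and every parameter value below are assumptions of this sketch, not the study's procedure:

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's method for drawing a Poisson-distributed yearly count."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def sample_gpd(shape, scale, rng):
    """Inverse-CDF draw from a generalized Pareto distribution (shape != 0)."""
    u = rng.random()
    return scale / shape * ((1.0 - u) ** (-shape) - 1.0)

def max_intensity_in_window(w, annual_rate, shape, scale, seed=0):
    """Largest simulated epidemic intensity over a window of w years;
    returns 0.0 if no epidemic occurs in the window."""
    rng = random.Random(seed)
    best = 0.0
    for _ in range(w):
        for _ in range(sample_poisson(annual_rate, rng)):
            best = max(best, sample_gpd(shape, scale, rng))
    return best

# Hypothetical parameters: ~3 epidemics per year, heavy-tailed intensities.
print(max_intensity_in_window(w=50, annual_rate=3.0, shape=0.5, scale=1.0))
```

Allowing the yearly rate to vary while keeping the intensity distribution fixed, as the study does, only changes how many draws the inner loop makes each year.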
Results
A generalized Pareto distribution (GPD) provided an efficient description of the empirical exceedance-frequency distribution of epidemic intensity over nearly four orders of magnitude of the independent variable. The GPD displayed a power-law tail, which indicates the absence of a typical epidemic intensity and a slowly declining probability of very intense epidemics. The collapse of the assessed epidemic intensities onto a single distribution, across a variety of diseases and such a long observation period, supported the general validity of the GPD over time. The probability distribution of epidemic intensity was therefore assumed to be time-independent, while the rate at which epidemics emerged was allowed to vary.
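A minimal sketch of the GPD exceedance (survival) function and its power-law tail; the shape and scale values below are hypothetical, not the fitted parameters from the paper:

```python
def gpd_exceedance(i, shape, scale, loc=0.0):
    """P(I > i) for a generalized Pareto distribution with shape > 0:
    (1 + shape*(i - loc)/scale) ** (-1/shape).
    For large i this decays like i**(-1/shape): a power-law tail."""
    z = (i - loc) / scale
    return (1.0 + shape * z) ** (-1.0 / shape)

# With a positive (heavy-tailed) shape parameter, the exceedance
# probability declines slowly across several orders of magnitude of i.
for i in (0.01, 0.1, 1.0, 10.0):
    print(i, gpd_exceedance(i, shape=0.5, scale=1.0))
```

A power-law tail is precisely why "no typical epidemic intensity" exists: doubling the threshold intensity reduces the exceedance probability by a constant factor rather than cutting it off.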
The traditional theory of extremes assumes that the process of event occurrence is stationary; under this interpretation, the recurrence of epidemic events would be controlled by a fixed rate. The time series, however, showed clear temporal patterns. The team noted that 12 was the highest number of events occurring in a single year, and that the number of events per year varied nine-fold across the 345 years assessed in the study.
To demonstrate the use of the average recurrence interval T(i), the team considered a pandemic of severity equal to or greater than the Spanish flu, which claimed between 20 and 100 million lives, corresponding to an intensity i of 5.7‰ per year. At the time that pandemic occurred, its estimated mean recurrence time was 91 years. Based on the number of outbreaks observed in the most recent 20-year period of the dataset (2000–2019), the average recurrence time of an event of similar severity is 400 years. A simplistic calculation assuming a stationary generalized extreme value (GEV) distribution yielded a lower, constant T of 235 years.
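The recurrence-interval logic can be sketched as follows, assuming epidemics arrive as a Poisson process with a yearly rate; the numbers in the example are hypothetical and are not taken from the paper:

```python
def mean_recurrence_time(annual_epidemic_rate, exceedance_prob):
    """T(i) = 1 / (rate * P(intensity >= i)): the average waiting time,
    in years, for an epidemic at least as intense as threshold i
    (Poisson-arrival assumption of this sketch)."""
    return 1.0 / (annual_epidemic_rate * exceedance_prob)

# Hypothetical: 2 new epidemics per year, each with a 1% chance
# of exceeding the intensity threshold.
print(mean_recurrence_time(2.0, 0.01))  # 50.0 years
```

Because the emergence rate sits in the denominator, a time-varying rate directly yields a time-varying recurrence interval — which is why rates taken from different historical periods give different values of T for the same intensity threshold.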
Overall, the present study estimated the frequency of future COVID-19-like events. According to the study findings, COVID-19 has caused deaths at a rate of approximately 2.5 million per year, which, when normalized by the global population, corresponds to an epidemic intensity of 0.33‰ per year.
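The normalization behind this figure can be checked with quick arithmetic; the world-population value of roughly 7.6 billion for the COVID-19 period is an assumption of this sketch:

```python
deaths_per_year = 2.5e6   # approximate COVID-19 death rate cited in the study
world_population = 7.6e9  # assumed global population for the period

# Intensity in deaths per thousand people per year (per mille per year).
intensity = deaths_per_year / world_population * 1000
print(round(intensity, 2))  # 0.33
```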