Image: Fallout/Bethesda Game Studios

Oxford’s Global Priorities Project has compiled a list of catastrophes—both natural and self-inflicted—that could kill off 10 percent or more of the human population. It’s a real buzzkill of a report, and it warns that any of these catastrophes could happen within the next five years.

Titled “Global Catastrophic Risks 2016,” the report ranks the most dangerous threats facing our civilization according to probability. The likeliest risks include nuclear war and pandemics (both natural and deliberately engineered), followed by disasters stemming from runaway climate change, geoengineering run amok, and disruptions posed by artificial intelligence. The report also includes low-probability but high-impact events, like asteroid impacts and supervolcanic eruptions.

The Global Priorities Project defines a global catastrophic risk as “events or processes that would lead to the deaths of approximately a tenth of the world’s population, or have a comparable impact.” These hazards are qualitatively distinct from existential risks, which are severe enough to wipe out all humans. Though not as apocalyptic, catastrophic risks are still grave events with serious global consequences. In the new report, researcher Sebastian Farquhar and colleagues warn that some of these perils are more likely than we realize, and that governments aren’t doing what’s necessary to mitigate the risks or plan for their outcomes.

Global catastrophic risks are exceptionally rare, but they do happen. For example, the Justinian Plague of 541-542 AD killed as many as 17 percent of the world’s population. More recently, the Spanish Flu of 1918-19 killed an estimated 10 percent of the world’s population, which exceeded the entire death toll of the Great War that preceded it. These events are unlikely in any given decade, which probably explains why they tend to receive limited attention.

Historic plagues and pandemics. (Image: Global Priorities Project).

“But even when the probability is low, the sheer magnitude of an adverse outcome warrants taking these risks seriously,” write the authors. “A global catastrophic risk not only threatens everyone alive today, but also future generations. Reducing these risks is therefore both a global and an intergenerational public good.”

Image: Global Priorities Project.

Asteroid impacts and supervolcanic eruptions made the list, but were ranked low in terms of likelihood. Other low probability events included catastrophic climate change, catastrophic disruption from artificial intelligence, and a geoengineering disaster. Pressing threats included a natural pandemic, nuclear war, and a deliberately engineered pandemic.

The researchers got these rankings (mostly) right, though the odds of an asteroid impact or supervolcanic eruption are so exceedingly small as to be almost not worth considering: the odds of an asteroid striking the Earth are about 1 in 1,250 per century, while supervolcanoes erupt only about once every 30,000 to 50,000 years.
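To put those long-run rates in the report’s five-year frame, here is a quick back-of-the-envelope sketch (my own illustration, not from the report), assuming events arrive independently at a constant rate (a Poisson approximation):

```python
import math

def five_year_probability(events_per_year: float) -> float:
    """Probability of at least one event within five years,
    assuming a constant, independent arrival rate."""
    return 1 - math.exp(-events_per_year * 5)

asteroid_rate = (1 / 1250) / 100  # ~1-in-1,250 per century, per the figures above
supervolcano_rate = 1 / 40000     # midpoint of the 30,000-50,000 year interval

print(f"Asteroid impact within 5 years:       {five_year_probability(asteroid_rate):.6f}")
print(f"Supervolcano eruption within 5 years: {five_year_probability(supervolcano_rate):.6f}")
```

Both work out to odds on the order of one in ten thousand or smaller over five years, which is why these threats rank at the bottom of the list.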

The risks posed by AI are probably overstated within the five-year time frame, but it’s something we should most certainly be concerned about. Likewise, a climate change catastrophe also seems unlikely within the next five years.

On the other hand, natural and human-made pandemics are definitely high-risk, high-probability events; our globalized civilization is spreading diseases faster than ever before, and our biotechnologies are making it possible to weaponize pathogens (e.g. mutating the H5N1 influenza virus to be human-transmissible). Technologies like CRISPR—a powerful, cheap, and easy-to-use gene-editing tool—greatly heighten the chance that a nefarious group, like Islamic State, could attempt such a horrible thing.

Cost per genome. (Image: Global Priorities Project).

To deal with these looming problems, the authors propose a number of solutions, including nuclear non-proliferation treaties, increased planning for serious pandemics, efforts to reduce carbon emissions, and investigations into risks posed by AI and biotechnologies.

[Global Priorities Project (pdf)]