Everyone loves predicting the end of the world. Apocalyptic fiction has grown increasingly popular in recent years, with alien invasions and zombie uprisings among the most common ways these tales depict humanity being wiped out. While those two scenarios probably aren’t happening anytime soon - I mean come on, those zombies would decompose and fall apart after a few months - there are a number of serious and totally real risks facing humanity’s future.
To prepare for such events, an interdisciplinary team of philosophers and scientists at Oxford’s Future of Humanity Institute (FHI) recently published a report warning of humankind’s most pressing dangers. The report, titled Existential Risk: Diplomacy and Governance, seeks to outline what it calls existential risks, which it defines as events “with the potential to permanently curtail humanity’s opportunity to flourish.” While many different apocalyptic scenarios are presented throughout the report, the FHI team members highlight three as the most likely: nuclear war, a deadly pandemic, and irreversible climate change.
The report also touches on a few other apocalypse scenarios, including various “natural processes” such as asteroid strikes, gamma-ray bursts, and supervolcano eruptions. The researchers also include a section for “unknown unknowns”:
It therefore seems likely that some future existential risks, driven by the same mechanisms, are currently unknown. For example, there may be an as yet undeveloped technology which will have huge destructive power, or some way of interacting with the environment which will threaten complete ecosystem collapse.
Undeveloped technology with huge destructive power, you say? Some way of causing complete ecosystem collapse? Sound familiar? After all the doom and gloom of going over apocalypse scenario after apocalypse scenario, the researchers claim that humanity might still have a chance of preventing any of these scenarios - but only if the impossible happens and we finally learn to get along:
For each of these opportunities, humanity will require increasing levels of trust and international collaboration in order to face the challenges that threaten us all. Moreover, these risks are constantly evolving, and understanding them will need deep and sustained engagement with the global research community.
Yeah, like that’s bound to happen given the current anti-research climate in many countries. Speaking of which, a similar body of experts, made up of some of the same Oxford researchers, published a 2016 study claiming the apocalypse was likely coming within just five years - make that four now. Is this a case of some seriously pessimistic academics caught in a doom loop, or could they be onto something?