9 existential risks that have the potential to bring an end to humanity

Human extinction is something that most people probably try not to dwell too much on, but a team at Cambridge University are devoting their lives to studying the matter.

Academics, lawyers, scholars, and philosophers at the university’s Centre for the Study of Existential Risk (CSER) are specifically focusing on how best to mitigate risks that could lead to human extinction or civilisational collapse.

Haydn Belfield, an academic project manager at CSER, told Business Insider that he and his peers are looking at risks associated with artificial intelligence, biotechnology, the environment, and to a lesser degree, natural risks.

“These are ‘Global Catastrophic Risks’ — defined as those that could kill at least 10% of the global population, or cause equivalent damage,” he told Business Insider.

Here’s a roundup of the areas they’re focusing on:

An asteroid hitting Earth

The Earth is constantly being hit by asteroids, but most of them are harmless: they're too small to survive the trip through our atmosphere and break up into dust or tiny grains of sand. Every now and again, however, a larger asteroid will collide with our planet.

Asteroids more than 100m in diameter strike Earth roughly every thousand years, while asteroids more than 10km across hit the Earth only about every 100 million years, according to Wired. Smaller asteroids, less than 1m in diameter, hit the Earth every year.

Type of risk: Natural (CSER is less concerned about these)

A supervolcano

Quietly lurking beneath Yellowstone National Park in the US is a 'supervolcano' that has the potential to wipe out humans if it were ever to erupt.

There are several other 'supervolcanoes' (volcanoes capable of an eruption that produces more than 240 cubic miles of magma) around the world that scientists and volcanologists are constantly watching, including the one at Lake Toba in Indonesia.

There have been three major volcanic eruptions at Yellowstone in the last 2 million years, according to the National Park Service (NPS), and two of them can be considered supervolcano events. The NPS writes on its website that another supervolcano eruption is possible but it is very unlikely in the next thousand or even 10,000 years.

The vast quantities of lava and ash that are spewed out of the Earth during supervolcano events have the potential to cause long-lasting climate change that could trigger a life-threatening ice age or global warming.

Type of risk: Natural (CSER is less concerned about these)

Solar flares

Explosions on the surface of the Sun produce solar flares that can affect Earth. Every now and again, these outbursts are large enough to trigger geomagnetic storms that disable satellites and knock out terrestrial power grids.

There is a one in eight chance of a catastrophic solar flare striking Earth by the year 2020, according to Wired.

Type of risk: Natural (CSER is less concerned about these)

Catastrophic climate change

Rising global temperatures could have potentially deadly knock-on effects for humans.

Nonlinear climate change happens when complex climate systems rapidly flip into a different 'mode', at a speed and scale that humanity and other natural systems cannot adapt to.

Type of risk: Environmental

Nuclear winter

A nuclear winter could occur if the world were ever to end up in an all-out nuclear war. Detonating dozens of nuclear warheads would produce enough soot to entomb the planet in billowing, black clouds of ash, plunging it into darkness and causing global temperatures to plummet. Those who weren't killed by the nuclear weapons themselves could end up dying during the nuclear winter that followed.

Type of risk: Environmental

Genetically engineered crop or livestock disease

Man-made diseases with the potential to wipe out humanity's key food sources (crops and livestock) could also pose a threat to human populations. With millions of people around the world already on the brink of starvation as a result of famine, a genetically engineered disease that wipes out swathes of the food supply is the last thing humanity needs.

Type of risk: Biotech

Non-beneficial 'narrow' AI

Narrow AI, also known as weak AI, refers to software designed to learn and focus on one particular task. One example is the AlphaGo algorithm, developed by researchers at Google-owned DeepMind to play and master the ancient Chinese board game Go. Another example is Apple's personal assistant Siri.

"Weak, or 'narrow', AI is a present-day reality," wrote Ryan Calo, of the Center for Internet and Society at Stanford Law School, in 2011. "Software controls many facets of daily life and, in some cases, this control presents real issues. One example is the May 2010 'flash crash' that caused a temporary but enormous dip in the market."

Type of risk: Artificial intelligence

Non-beneficial superintelligence

Highly advanced thinking machines of the kind described in terrifying detail in Nick Bostrom's book 'Superintelligence' are another cause for concern. If these superintelligent computers were to outsmart humans and decide they no longer needed us, we could be in trouble.

"Most researchers agree that a superintelligent AI is unlikely to exhibit human emotions like love or hate, and that there is no reason to expect AI to become intentionally benevolent or malevolent," according to The Future of Life Institute.

If poorly programmed superintelligent machines wanted to, they could quickly develop sophisticated weapons capable of spelling the end of humanity.

Type of risk: Artificial intelligence

Unknowns

There are also, of course, the unknown risks that we haven't even thought of yet. Who knows what they might be.

Type of risk: Everything
