Six ways we all die

Sumir Karayi
8 min read · Oct 19, 2020

Thinking about human existential risks

I just had an enjoyable morning. I randomly picked a direction for a walk and found two lovely parks fairly close to my home. I also started listening to one of my favorite podcasts, BBC Analysis. Today's episodes were about existential risks, starting with Planning for the Worst and continuing the theme with Will Humans Survive the Century? This topic is just so interesting right now. I also looked up some websites[1] and thought I'd share my thoughts.

Coronavirus

Since the pandemic began, we have collectively become aware of a virus's potential to kill many humans, and we have certainly experienced significant disruption. What is striking is that the pandemic caught every single government by surprise, even though it is each government's job to think about our collective survival and plan long term. If none of our governments was prepared for a virus, what other risks are we not prepared for?

Treat this as a thought experiment, a bit of fun. Let's consider a few risks that are more than likely: biological, extraterrestrial, geological, nuclear war, artificial intelligence, and climate and overpopulation.

Biological

In 1918 the Spanish flu infected 500 million people and killed 50 million[2]. In 1957 the Asian flu killed 1.1 million people. In 1968 another flu killed a million. A quick search shows the UK government published several papers on pandemic preparation over the last twenty years. So why was no government prepared for COVID? After the recent SARS and MERS epidemics, it did not take a genius to guess that we were likely to face another novel virus within our lifetimes.

Consider: if COVID spread slightly more easily, lay dormant longer, and killed a somewhat larger fraction of the people it infected, the outcome could be horrifically worse. Epidemiologists can easily model scenarios in which much of the world's population does not survive. This is not science fiction, and while getting through this crisis is paramount, we should not lose sight of the fact that we are coping badly with COVID. How will we cope if SARS-CoV-3 (the next one) is much worse?
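
To get a feel for how those numbers compound, here is a minimal SIR-style sketch in Python. Every parameter (the reproduction number, infection fatality rate, infectious period, population) is an illustrative guess of mine, not a fitted model, but it shows how a modestly nastier virus multiplies the death toll.

```python
# Toy SIR epidemic model with a daily time step. All parameters are
# illustrative assumptions, not fitted to real COVID data.

def sir_deaths(r0, ifr, days=730, population=7.8e9, infectious_days=10):
    """Run a basic SIR model and return total deaths
    (infection fatality rate applied to everyone infected)."""
    beta = r0 / infectious_days   # transmissions per infected person per day
    gamma = 1 / infectious_days   # recovery rate per day
    s, i, r = population - 1, 1.0, 0.0
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return ifr * r

baseline = sir_deaths(r0=2.5, ifr=0.005)  # roughly COVID-like guesses
worse = sir_deaths(r0=4.0, ifr=0.02)      # a modestly nastier virus
print(f"baseline: {baseline / 1e6:.0f}M deaths, worse: {worse / 1e6:.0f}M deaths")
```

Under these assumptions, a virus that is somewhat more transmissible and four times as deadly per infection kills well over a hundred million more people, because a larger share of the population is infected and more of the infected die.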

Research into pandemic pathogens generally doesn't pay as well as, say, cancer treatment, so there has been a chronic lack of commercially funded research. I suspect governments will now want to ensure adequate funding levels, but we will all need to make sure they don't forget.

Extraterrestrial

The Sun gives us life but can just as easily take it away, or at the very least completely disrupt our technology-based existence. Solar flares and coronal mass ejections have repeatedly disrupted power, communications, and pretty much anything dependent on technology. What we tend to forget is that the Sun is an omnipresent fusion bomb big enough to hold a million Earths. Sometimes its explosions are violent enough to hurl huge quantities of material toward us as a flare or a mass ejection[3]. In 1859 a solar flare[4] created a geomagnetic storm that produced auroras so bright that people could read at night without additional light, and it caused havoc with the nascent telegraph system in the eastern US. Today such a storm would disrupt communications and power systems, potentially for days, weeks, or longer, because the accompanying magnetic fields induce huge currents in power and telecommunications lines and destroy the equipment at either end. You may think this unlikely, but a similarly powerful storm narrowly missed the Earth in 2012, and a less powerful one cut power to six million people in Canada in 1989.

This kind of event is likely enough that we need to be prepared. We would get our first warning about eight minutes after the eruption (the time light takes to reach Earth), and then a day or two before the ejected material arrives and everything shuts down. Are we prepared? Imagine what people would do if nothing worked: no power, phone, or radio for days or weeks. How does the government communicate with everyone? What are people meant to do if no shops are open? How do we look after the old and infirm?
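
The warning window is simple arithmetic. Light from the flare covers the Sun–Earth distance in about eight minutes, while the ejected plasma, which does the real damage, travels at a few hundred to a few thousand kilometers per second. The speeds below are assumed, typical values:

```python
# Back-of-the-envelope warning times for a solar storm.
# CME speeds are assumed typical/fast values, not measurements.

SUN_EARTH_KM = 149.6e6      # mean Sun-Earth distance
LIGHT_SPEED_KM_S = 299_792  # speed of light

flare_warning_min = SUN_EARTH_KM / LIGHT_SPEED_KM_S / 60
print(f"Light (first warning): {flare_warning_min:.1f} minutes")

for label, speed_km_s in [("typical CME", 500), ("fast CME", 2000)]:
    hours = SUN_EARTH_KM / speed_km_s / 3600
    print(f"{label} at {speed_km_s} km/s arrives in {hours / 24:.1f} days")
```

A typical ejection takes around three and a half days to reach us; the fastest ones arrive in under a day, which is where the "day or two to prepare" comes from.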

An asteroid or another object large enough to annihilate everything on Earth is unlikely. The objects in the solar system capable of causing serious damage are tracked, and thankfully none is heading for us. I am not sure how well we can track objects arriving from outside the solar system, or whether any travel fast enough to be a big surprise, so let's assume this one is too unlikely to worry about. Finally, if we do have a collision like the one 65 million years ago[5], which gouged out the massive Yucatán crater and wiped out most of the dinosaurs, we are looking at a mass extinction, and there is little point in worrying about it. At least until we can settle on other planets.

Geological

Global geological risks come mainly from earthquakes and volcanoes. Earthquakes and their widespread impact became obvious to most of us on 26 Dec 2004 with the megaquake in the Indian Ocean[6]. The tsunami killed a quarter of a million people; many times that number suffered, and some economies are still recovering. The long-term impact on ecology was just as significant. Diving around the Andaman Islands a few years ago was upsetting: the coral had not recovered, and one of the world's best reefs is ruined, maybe forever.

Note: The Indian Ocean earthquake had a magnitude of 9.1–9.3. The largest recorded earthquake, in Chile in 1960, measured 9.4–9.6. The numbers look almost the same, but magnitude scales are logarithmic: the Chilean quake had more than twice the ground amplitude of the Indian Ocean one. On the Richter and the more contemporary moment magnitude scales, an increase of 0.3 doubles the amplitude, and an increase of just 0.2 roughly doubles the energy released. I feel log scales are easily misunderstood and can make us less worried than we should be about massive increases in quantity or risk, as with COVID transmission.
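
For the curious, the arithmetic behind that comparison is a one-liner. On the moment magnitude scale, ground amplitude grows as 10^ΔM and radiated energy as 10^(1.5·ΔM), the standard Gutenberg–Richter energy relation:

```python
# Why a 0.4 magnitude difference matters: amplitude scales as 10**dM
# and radiated energy as 10**(1.5 * dM).

def quake_ratios(m_small, m_big):
    d = m_big - m_small
    return 10 ** d, 10 ** (1.5 * d)  # (amplitude ratio, energy ratio)

amp, energy = quake_ratios(9.1, 9.5)  # Indian Ocean 2004 vs Chile 1960
print(f"amplitude: {amp:.1f}x, energy: {energy:.1f}x")
# -> amplitude: 2.5x, energy: 4.0x
```

So a difference that looks like a rounding error on paper is roughly four times the released energy.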

It is highly likely there will be more earthquakes that disrupt large parts of the world, such as the Really Big One[7] expected in the US Pacific Northwest, but they are unlikely to be globally threatening events.

Volcanoes can cause disruption, as we all found out in 2010 with the unpronounceable Icelandic volcano Eyjafjallajökull. What we need to remember is that it wasn't a big one. A supervolcano like the one under Yellowstone could deposit anything from centimeters to meters of ash across the US, reduce global temperatures by 5–15 °C, and destroy crops across the world for a year. That would mean catastrophic global famine, among other serious problems. It is highly unlikely that Yellowstone will erupt anytime soon, but volcanoes are erupting all the time. If one goes off in Siberia, which is more likely, thawing permafrost would additionally release huge amounts of greenhouse gases. The risk that such an event will affect the world in the foreseeable future seems fairly high. While many volcanoes have a localized effect, some will have a global impact, so we need to plan for this contingency.

Nuclear war

Many of us were not around in the worst days of the Cold War, when global nuclear conflict was a clear and present danger. Thankfully we are no longer in that precarious position. But consider this scenario: the US and Norway launch a research rocket on a trajectory resembling an intercontinental ballistic missile, and the Russians have only minutes to decide whether to respond. What if they retaliate on the assumption that the US has launched a premeditated strike? This scenario isn't fiction; it happened in 1995, and thankfully Yeltsin decided the launch wasn't a threat to Russia.

I believe our checks and balances work well on human reaction timeframes. But what happens when AI is used so that the military can react faster and more efficiently? Humans are simply too slow to oversee AI decision-makers, and accidents become much more likely. Maybe this scenario can be avoided, but the appeal of faster and more effective AI-based weapons is unlikely to be lost on the military. I suspect the best we can do is demand transparency in the use of AI-based decision making in weapon systems.

Artificial Intelligence

Right now we use AI as a set of highly specialized tools, and there is no artificial general intelligence. It is highly likely, though, that as specialized intelligences become better connected to each other and research into general intelligence continues at a frenetic pace, we will have a singularity event. In AI terms this means genuine general intelligence, potential self-awareness, and a set of operating rules that humans may not control. This is possible, and likely, within our lifetimes.

I suspect there is a high risk that something will go wrong in the near future, though it is unlikely to be a globally catastrophic event. Things will go wrong because specialized intelligences are designed by humans with benevolent or malevolent intentions. Consider, for example, someone designing an intelligence to fight a war as efficiently as possible. As that intelligence becomes more sophisticated, it is likely to treat humans as expendable objects. What if many such AIs are convinced of a false premise and start a war? A truly catastrophic event would require general artificial intelligence with wide-ranging control, and we are not there yet.

Another near-term problem: if we continue to adopt specialized AIs, we will need ever more sophisticated AIs to supervise them, because humans will not be nearly fast enough to control them. Otherwise costly mistakes will become more likely.

Climate change and overpopulation

If humans want to live the way we are choosing to, then there are simply too many people in the world. Rich countries point the blame at poor countries with larger populations; poor countries counter, validly, that rich countries consume a lot more resources, period. The deeper problem is that we as a species have boundless ambition to consume and keep increasing our appetite, while resources are not infinite. The impact of our ever-increasing consumption is that we are polluting the planet irreparably, making it uninhabitable for the majority of other lifeforms, and we will eventually make it impossible for many humans to share this planet.

This existential risk is the most pernicious because it is gradual, it is caused by us living normal lives, and there is no direct consequence to individual action. I suspect very few humans are consciously trying to destroy the environment and ecosystems, raise the temperature of the planet, or remove the habitats of other lifeforms. People are mostly just trying to earn enough to live, support their families, and have some fun.

If we accept that human beings are unlikely to change anytime soon, then the only answer is a significant reduction in our numbers. This could happen relatively fast, but it is unlikely, because the only practical route is a global plan. We would need a living wage for everyone, universal healthcare, and a sensible distribution of resources, which would require a rethink of capitalist enterprise and of national boundaries. You would be right in thinking this is impossible, but how else do we deal with the most obvious and likely risk to humanity? We need an ordered decrease from 8 billion to maybe a few million people. Then we could share the planet with each other and with other life, perhaps even thrive over the next century, and achieve the potential of becoming a species that can populate other planets and truly become invulnerable.

[1] University of Cambridge, Centre for the Study of Existential Risk
https://www.cser.ac.uk/

University of Oxford, Future of Humanity Institute
https://www.fhi.ox.ac.uk/

This Wikipedia article on global catastrophic risk has interesting information but seems a little biased: https://en.wikipedia.org/wiki/Global_catastrophic_risk

[2] https://en.wikipedia.org/wiki/Spanish_flu

[3] https://en.wikipedia.org/wiki/Geomagnetic_storm

[4] https://en.wikipedia.org/wiki/Carrington_Event

[5] https://en.wikipedia.org/wiki/Chicxulub_crater

[6] https://en.wikipedia.org/wiki/2004_Indian_Ocean_earthquake_and_tsunami

[7] https://www.newyorker.com/magazine/2015/07/20/the-really-big-one
