University of Cambridge: Cambridge Centre for the Study of Existential Risk

Some of Britain’s finest minds are drawing up a “doomsday list” of catastrophic events that could devastate the world, pose a threat to civilization and might even lead to the extinction of the human species.

Members include Stephen Hawking, the world’s most famous living scientist; Martin Rees, emeritus professor of cosmology and astrophysics at Cambridge; Huw Price, professor of philosophy at Cambridge; Jaan Tallinn, co-founder of Skype; and Robert May, past president of the Royal Society.

Some of their concerns are:

  • Human technology may pose new, extinction-level risks to our species
  • Events that might cause worldwide disruption could arise as unexpectedly as the 2008 financial crisis
  • Our increasing reliance on technology and the formation of complex interconnected networks are making society more vulnerable, because a failure in one system can affect all the others (e.g. power, food supplies, the financial system)
  • We import most of our fossil fuels from abroad, so a future conflict over resources is possible
  • In a modern, efficient world, we no longer stockpile food. If the supply were disrupted for any reason, it would take about 48 hours before food ran out and riots began
  • Although some of these events have a low probability, the consequences if one occurred would be catastrophic. Yet politicians focus only on short-term problems, and the public is in denial about what we are doing to the planet and the consequences for their grandchildren (e.g. climate change and the nine planetary boundaries we must not cross), as well as the vulnerability of an interconnected world to terrorism by a small group or a single individual.

Cyber attacks: One of the biggest threats is some kind of attack on the computers controlling the electricity grids around the world. Loss of electrical power would have immediate and possibly severe consequences if it could not be restored quickly.

Systemic risk: Complex interactions arise between a rising global population, greater pressure on natural resources, more complex supply chains, and an increasing reliance on both interconnected technologies and interconnected markets. Our interconnected world depends on elaborate networks: electric power grids, air traffic control, international finance, and just-in-time delivery, to name just a few. Unless these are highly resilient, their manifest benefits could be outweighed by catastrophic (albeit rare) breakdowns cascading through the system.
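The cascading-breakdown dynamic described above can be illustrated with a toy load-redistribution model: each node in a random network carries some load, a single node fails, and its load is shifted onto its neighbours, which may then fail in turn. This is a minimal sketch, not a model of any real grid; the function name, parameters, and network construction are all invented for illustration.

```python
import random

def simulate_cascade(n=50, degree=4, tolerance=0.2, seed=1):
    """Toy load-redistribution model of cascading failure.

    Each node carries 1 unit of load and can absorb a fraction
    `tolerance` of extra load. When a node fails, its load is split
    among its surviving neighbours; any neighbour pushed past its
    capacity fails in turn. Returns the total number of failed nodes.
    All parameters are illustrative, not calibrated to any real system.
    """
    rng = random.Random(seed)
    # Build a random graph: link each node to roughly `degree` others.
    neighbours = {i: set() for i in range(n)}
    for i in range(n):
        while len(neighbours[i]) < degree:
            j = rng.randrange(n)
            if j != i:
                neighbours[i].add(j)
                neighbours[j].add(i)

    load = {i: 1.0 for i in range(n)}
    capacity = {i: 1.0 + tolerance for i in range(n)}

    failed = set()
    frontier = [0]  # a single initial failure
    while frontier:
        next_round = []
        for f in frontier:
            if f in failed:
                continue
            failed.add(f)
            alive = [j for j in neighbours[f] if j not in failed]
            if not alive:
                continue
            share = load[f] / len(alive)  # redistribute the lost load
            for j in alive:
                load[j] += share
                if load[j] > capacity[j]:
                    next_round.append(j)
        frontier = next_round
    return len(failed)
```

The point of the sketch is the threshold behaviour: with generous spare capacity (high `tolerance`) a single failure is absorbed and the cascade stops at one node, while with little or no spare capacity the same single failure can propagate through much of the network — which is the sense in which efficiency (running systems near capacity) trades off against resilience.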

Resource depletion or ecological destruction: The natural resources needed to sustain a high-tech civilization are being used up. If some other cataclysm destroys the technology we have, it may not be possible to climb back up to present levels if natural conditions are less favorable than they were for our ancestors, for example if the most easily exploitable coal, oil, and mineral resources have been depleted.

Bioterrorism:  Large infrastructure is required to build and deliver nuclear weapons, but genetically engineered harmful microbes or viruses could be developed in a relatively simple laboratory.

Food shortages: The modern food industry is based on “just in time” delivery with little or no stockpiling. Failure of the information networks controlling this could quickly lead to shortages and food riots.

Nuclear holocaust: Even if some humans survive the short-term effects of a nuclear war, it could lead to the collapse of civilization.

Genetically engineered biological agents: As genetic technology advances, it may become possible for a tyrant, terrorist, or lunatic to create a doomsday virus: an organism that combines long latency with high virulence and mortality.

Pandemics: Increasing mobility makes it more likely that a new infection could spread around the world via air travel before a vaccine is developed to combat it.

Agriculture: There has been a trend toward more widespread use of fewer genetic varieties of crops, potentially increasing the vulnerability of global food supplies to emerging pathogens.

Asteroid or comet strikes: Unlikely but possible; a large impact happens every half million years or so. Since Bostrom’s paper was published, it appears that many of the past large extinctions were caused by global warming rather than by comets or asteroids.

Not likely (don’t worry): solar flares, supernovae, black hole explosions or mergers, gamma-ray bursts, galactic center outbursts, supervolcanoes, loss of biodiversity, buildup of air pollution, gradual loss of human fertility.

My opinion: peak oil, coal, and natural gas lessen the odds of a runaway greenhouse effect.

Runaway climate catastrophe:  Climatologists fear that, as the climate is polluted with increasing quantities of carbon dioxide, it may pass a tipping point after which feedback effects cause it to get warmer and warmer.

My opinion: computer chips will be among the first technologies to fail, so this is a silly worry.

Malign computers: Some experts fear that increasingly intelligent computers may one day turn “hostile” and not perform as they were designed.

The four categories of risk (Bostrom 2002) are:

Bangs – Earth-originating intelligent life goes extinct in relatively sudden disaster resulting from either an accident or a deliberate act of destruction.

Crunches – The potential of humankind to develop into posthumanity is permanently thwarted, although human life continues in some form.

Shrieks – Some form of posthumanity is attained but it is an extremely narrow band of what is possible and desirable.

Whimpers – A posthuman civilization arises but evolves in a direction that leads gradually but irrevocably to either the complete disappearance of the things we value or to a state where those things are realized to only a minuscule degree of what could have been achieved.

I was so annoyed by the idea that malign computers could be a problem that I wrote the following letter (and sent it to the Global Catastrophic Risk Institute as well):

I think the malign-computer scenario you worry about is highly unlikely.

Microchips are the pinnacle of civilization, the most complex product, and therefore the most vulnerable to supply chain failure, cascading failure, single-source failures, energy supply shocks, financial collapse, and all the other bangs, crunches, shrieks, and whimpers.

The Fragility of Microchips

Microchips and Fab Plants: a Detailed description

Motherboards in Computers – too complex to make in the future

High-Tech can’t last: Limited minerals & metals essential for wind, solar, microchips, cars, & other high-tech gadgets

The real threat to civilization is the exponential decline of all fossil fuels and other natural resources (topsoil, aquifers, fisheries, forests, etc).

The importance of fossil fuels to human civilization

The world depends on oil for transportation – agriculture and trucks can’t be electrified, and the only energy resource that could fuel the existing 1 billion combustion engines is biofuels. But that won’t happen, for many reasons: see Peak Soil: Why Biofuels are Not Sustainable and a Threat to America’s National Security.

In fact, there are no alternative energy resources which can replace fossil fuels:

No single or combination of alternative energy resources can replace fossil fuels

Hoffert, Martin, et al. 2002. Advanced Technology Paths to Global Climate Stability: Energy for a Greenhouse Planet. Science, Vol. 298.

David Fridley, LBNL scientist, on why alternative energy won’t save us

Tilting at Windmills, Spain’s disastrous attempt to replace fossil fuels with Solar Photovoltaics

This is too large a topic to cover in an email; for more information, see my Alternative Energy Reading List, my big-picture book list, and the energy section of my website, www.energyskeptic.com.

The good news is that we may not go extinct – the carrying capacity for Homo sapiens without fossil fuels is probably 1 billion or less. All of the harm and risk of crossing the nine planetary boundaries comes from fossil-fuel energy.

Alice Friedemann   www.energyskeptic.com

I have been studying systemic risks, cascading failures, and related topics for over 10 years. My career was in information technology, first as an assembler programmer and eventually as a systems engineer and architect. Now I am a science writer specializing in energy and natural resources, and I try to use only peer-reviewed science from the best scientific journals.

References

Bostrom, Nick. 2002. Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards. Journal of Evolution and Technology, Vol. 9, No. 1.
