Week 5: Existential Risk

“So if we drop the baton, succumbing to an existential catastrophe, we would fail our ancestors in a multitude of ways. We would fail to achieve the dreams they hoped for; we would betray the trust they placed in us, their heirs; and we would fail in any duty we had to pay forward the work they did for us. To neglect existential risk might thus be to wrong not only the people of the future, but the people of the past.”

– Toby Ord


This week we’ll cover the definition of an existential risk, examine why existential risks might be a moral priority, and explore why they are so neglected by society.

 

Organisation Spotlight

Future of Humanity Institute 

The Future of Humanity Institute (FHI) is a multidisciplinary research institute working on big-picture questions for human civilisation and exploring what can be done now to ensure a flourishing long-term future.


Currently, their four main research areas are:

  • Macrostrategy – investigating which crucial considerations are shaping what is at stake for the future of humanity

  • Governance of AI – understanding how geopolitics, governance structures, and strategic trends will affect the development of advanced artificial intelligence

  • AI Safety – researching computer science techniques for building safer artificially intelligent systems

  • Biosecurity – working with institutions around the world to reduce risks from especially dangerous pathogens

 

Organisation Spotlight

Nuclear Threat Initiative

The Nuclear Threat Initiative (NTI) works to prevent catastrophic nuclear, biological, radiological, chemical, and cyber attacks. Alongside other projects, they work with heads of state, scientists, and educators to develop policies that reduce reliance on nuclear weapons, prevent their use, and end them as a threat.

Required Materials

Recommended reading 

More to explore

Global governance and international peace

Climate change

Nuclear security

Week 6: Emerging Technologies

One way to look for opportunities to accomplish as much good as possible is to ask: “Which developments might have an extremely large or irreversible impact on human civilisation?” This week we’ll explore a few technological trends that might be relevant to existential risk. One week understandably can’t cover all the major considerations for what the future will be like, so we focus on two key emerging technologies that might be less well known – transformative artificial intelligence and advances in biotechnology.

 

Organisation Spotlight 

Center for Security and Emerging Technology

The Center for Security and Emerging Technology (CSET) is a policy research organisation that produces data-driven research at the intersection of security and technology, providing nonpartisan analysis to the US policy community.


They are currently focusing on the effects of progress in artificial intelligence, advanced computing and biotechnology. 


CSET is aiming to prepare the next generation of decision-makers to address the challenges and opportunities of emerging technologies. Their staff include renowned experts with experience directing intelligence and research operations at the National Security Council, the intelligence community and the Departments of Homeland Security, Defense and State.

Required Materials

Recommended reading 

More to explore

Global historical trends

Biosecurity

Shaping the development of artificial intelligence

Other

Exercise (30 mins.)

Every day each of us makes judgments about the future in the face of uncertainty. Some of these judgments can have a huge impact on our lives, so it’s really important that we make them as accurately as possible. But what can you do if you have limited information about the future? This week we’ll practice making predictions, with the goal of honing your ability to make accurate judgments in uncertain situations.

 

The aim of the exercise is to help you become “well-calibrated.” This means that when you say you’re 50% confident, you’re right about 50% of the time, not more, not less; when you say you’re 90% confident, you’re right about 90% of the time; and so on. The app you’ll use contains thousands of questions – enough for many hours of calibration training – that will measure how accurate your predictions are and chart your improvement over time. Nobody is perfectly calibrated; in fact, most of us are overconfident. But various studies show that this kind of training can quickly improve the accuracy of your predictions.
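To make the idea concrete, here is a minimal sketch of how calibration can be scored: group your predictions by stated confidence and check how often each group actually came true. This is purely illustrative – the data format and function name are assumptions for the example, not anything taken from the Calibrate Your Judgement app.

```python
from collections import defaultdict

def calibration_report(predictions):
    """predictions is a list of (confidence, came_true) pairs,
    e.g. (0.9, True) means 'I was 90% sure, and I was right'."""
    buckets = defaultdict(list)
    for confidence, came_true in predictions:
        # Round to the nearest 10% so similar confidence levels share a bucket.
        buckets[round(confidence, 1)].append(came_true)

    for bucket in sorted(buckets):
        outcomes = buckets[bucket]
        hit_rate = sum(outcomes) / len(outcomes)  # fraction that came true
        print(f"Stated confidence {bucket:.0%}: correct {hit_rate:.0%} "
              f"of the time ({len(outcomes)} predictions)")

# Example: a somewhat overconfident forecaster.
calibration_report([
    (0.9, True), (0.9, True), (0.9, False), (0.9, False),
    (0.6, True), (0.6, False), (0.6, True), (0.6, False),
])
```

A well-calibrated forecaster’s hit rate in each bucket roughly matches the bucket’s stated confidence; in the example above, both buckets come true less often than stated, which is what overconfidence looks like.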

 

Of course, most of the time we can’t check the answers to the questions life presents us with, and the predictions we make in real life concern complex events. The Calibrate Your Judgement tool helps you practice on simpler questions where the answer is already known, providing immediate feedback to help you improve.

 

Exercise – Calibrate Your Judgement