Global Catastrophic Risk Research Page

Global catastrophic risks (GCRs) are risks of events that could significantly harm or even destroy human civilization on a global scale. GCR is related to the concept of existential risk, the risk of events that would cause humanity to no longer exist. (Note that Nick Bostrom, who coined the term existential risk, defines it in a slightly different way.) Prominent GCRs include climate change, nuclear warfare, pandemics, and artificial general intelligence. Because of the breadth of the GCRs themselves and of the issues they raise, the study of GCR is highly interdisciplinary.

According to a range of ethical views, including my own, reducing GCR should be our top priority as individuals and as a society. In short, if a global catastrophe occurs, not much else matters, since so much of what we might care about (such as human wellbeing, the wellbeing of non-human animals, or the flourishing of ecosystems) would be largely or entirely wiped out by the catastrophe. The details of prioritizing GCR are more complicated than this (and are part of ongoing research), but GCR nonetheless remains a top priority, or the top priority, from a range of views.

GCR is my primary research focus and the focus of my primary affiliation, the Global Catastrophic Risk Institute (GCRI). Much of my involvement in GCR research is oriented toward mapping out the topic and building the research community, as documented on GCRI's GCR Community Project page.

Most of my publications have a GCR theme; see the GCRI publications page.

Created 20 Jul 2010 * Updated 20 Jun 2016