Managing Extreme Technological Risk

  • TWCF Number: 0128

  • Project Duration: October 1, 2015 - September 30, 2018

  • Core Funding Area: Other Charitable Purposes

  • Region: Europe

  • Amount Awarded: $2,744,668

Director: Professor Huw Price

Institution: University of Cambridge

We know of natural risks to our species, such as asteroid impacts. We also know that we threaten our own existence, for example, by nuclear war. These home-grown extreme risks have been with us for decades, but many distinguished scientists are now concerned that developments in technology (e.g., artificial intelligence or biotechnology) may soon pose more direct catastrophic threats.

The new capabilities of such technologies might place the power to do catastrophic harm in dangerously few human hands. Or, in the case of AI, they might take it out of our hands altogether. The nature and level of such extreme technological risks (ETRs) are, at present, difficult to assess because they have received little serious scientific attention. The need to consider them is new, but it is one we seem likely to face for the foreseeable future as technology grows ever more powerful. Hence, it is crucial to investigate these risks seriously and to learn how they can best be managed in the long term.

The purpose of the new Cambridge Centre for the Study of Existential Risk (CSER), in which this project is based, is to lead the international scientific community in conducting research on these issues. This project begins with a preliminary model of the components needed for the study and management of ETR. We will refine the model by implementing parts of it and by learning from our successes and failures. In this way, we will clarify both the present ETRs themselves and how risks of this class should be managed in the future. Our outputs will be both theoretical (e.g., scientific papers) and practical (e.g., establishing best practices for ETR mitigation). In the process, we will greatly expand the community of scholars, technologists, and policymakers who are aware of the task of managing ETRs, as well as deepen their understanding of the nature of this task.

We hope these efforts will substantially reduce the long-term risk of our species being wiped out by its own technological success.
