29 min

Talking Politics Guide to ... Existential Risk


David talks to Martin Rees about how we should evaluate the greatest threats facing the human species in the twenty-first century. Does the biggest danger come from bio-terror or bio-error, climate change, nuclear war or AI? And what prospects does space travel provide for a post-human future?
Talking Points:
An existential risk is one that cascades globally and inflicts a severe setback on civilization. We are now so interconnected, and so empowered as a species, that humans could be responsible for this kind of destruction.
There are natural existential risks too, such as asteroids. But what is concerning about the present moment is that humans now have the ability to affect the entire biosphere. This is a story about technology, but it is also about global population growth and the depletion of resources.
There are four categories of existential risk: climate change, bio-terror/bio-error, nuclear weapons, and AI/new technology.
Climate change has a long tail, meaning that the risk of total catastrophe is non-negligible.
Bio-terror/bio-error is becoming more of a risk as technology advances. It is hard to predict the consequences of the misuse of biotech, and our social order is more vulnerable than it used to be: overwhelmed hospitals could lead to societal breakdown.
Machine learning has not yet reached the level of existential risk. Real stupidity, not artificial intelligence, will remain our chief concern in the coming decades. Still, AI could make certain kinds of cyber-attacks much worse.
The nuclear risk has changed since the Cold War. Today there is a greater risk that some nukes go off in a particular region, although global catastrophe is less likely.
These threats are human-made. Solving them is also our responsibility.
We can’t all move to Mars; Earth’s problems have to be dealt with here. There are downsides to technology, but we will also need it. Martin describes himself as a technical optimist but a political pessimist.
Mentioned in this episode:
Martin Weitzman on long tail risks and climate change
The Stern Review on climate change, 10 years on
A review of Jared Diamond’s Collapse
Further Learning:
Martin’s new book, On the Future: Prospects for Humanity
The Centre for the Study of Existential Risk at Cambridge
The Talking Politics Guide to Nuclear Weapons
Who wants to colonize Mars?
And as ever, recommended reading curated by our friends at the LRB can be found here: lrb.co.uk/talking

