12 episodes

We humans could have a bright future ahead of us that lasts billions of years. But we have to survive the next 200 years first. Join Josh Clark of Stuff You Should Know for a 10-episode deep dive that explores the future of humanity and finds dangers we have never encountered before lurking just ahead. And if we humans are alone in the universe, then if we don't survive, intelligent life dies out with us.

The End Of The World with Josh Clark (iHeartRadio)

    • Science
    • 4.9, 686 Ratings

    Simulation Argument (Epilogue)

    There’s one last thing. Maybe the reason why we don’t see other intelligent life, maybe the reason we are in the astoundingly unique position of having to save the future of the human race, is because we are simulated human beings. It would explain a lot. (Original score by Point Lobo.)
    Interviewees: Nick Bostrom, Oxford University philosopher and founder of the Future of Humanity Institute; Anders Sandberg, Oxford University philosopher; Seth Shostak, director of SETI
    Learn more about your ad-choices at https://news.iheart.com/podcast-advertisers

    • 53 min
    End

    Josh explains that to survive the next century or two – to navigate our existential threats – all of us will have to become informed and involved. It will take a movement that gets behind science done right to make it through the Great Filter. (Original score by Point Lobo.) 
    Interviewees: Toby Ord, Oxford University philosopher; Sebastian Farquhar, Oxford University philosopher

    • 47 min
    Embracing Catastrophe

    We humans are our own worst enemies when it comes to what it will take to deal with existential risks. We are loaded with cognitive biases, can’t coordinate on a global scale, and see future generations as freeloaders. Seriously, are we going to survive? (Original score by Point Lobo.)
    Interviewees: Nick Bostrom, Oxford University philosopher and founder of the Future of Humanity Institute; Toby Ord, Oxford University philosopher; Anders Sandberg, Oxford University philosopher; Sebastian Farquhar, Oxford University philosopher; Eric Johnson, University of Oklahoma professor of law

    • 51 min
    Physics Experiments

    Surprisingly, the field of particle physics poses a handful of existential threats, not just for us humans, but for everything alive on Earth – and in some cases, the entire universe. Poking around on the frontier of scientific understanding has its risks. (Original score by Point Lobo.)
    Interviewees: Don Lincoln, Fermi National Accelerator Laboratory senior experimental particle physicist; Ben Shlaer, University of Auckland cosmologist; Daniel Whiteson, University of California, Irvine astrophysicist; Eric Johnson, University of Oklahoma professor of law

    • 1 hr 19 min
    Biotechnology

    Natural viruses and bacteria can be deadly enough; the 1918 Spanish Flu killed 50 million people in four months. But risky new research, carried out in an unknown number of labs around the world, is creating even more dangerous human-made pathogens. (Original score by Point Lobo.)
    Interviewees: Beth Willis, former chair, Containment Laboratory Community Advisory Committee; Dr. Lynn Klotz, senior fellow at the Center for Arms Control and Non-Proliferation

    • 1 hr 2 min
    Artificial Intelligence

    An artificial intelligence capable of improving itself runs the risk of growing intelligent beyond any human capacity and outside of our control. Josh explains why a superintelligent AI that we haven’t planned for would be extremely bad for humankind. (Original score by Point Lobo.)
    Interviewees: Nick Bostrom, Oxford University philosopher and founder of the Future of Humanity Institute; David Pearce, philosopher and co-founder of the World Transhumanist Association (Humanity+); Sebastian Farquhar, Oxford University philosopher

    • 47 min

Customer Reviews

4.9 out of 5
686 Ratings

Hi there 2256,

Biochemistry in 2020

A little too on the nose listening to this podcast rn

Ronel Dsouza,

Thanks Josh !

This is information we should all be aware of. Josh is doing important work spreading it, and doing it in a way that is easily consumable and entertaining – no easy task considering the weight of the subject. I only wish there was more. I’ve listened to the series countless times and will probably continue to do so for a long time. I think I could listen to Josh describe the intricacies of the universe all day.

Noctum6,

Truly a pleasure to listen to

I was hooked. This was one of the greatest things I’ve ever listened to. It was well structured and paced. The content was truly interesting, and the tone that it was delivered in, resonated with me very well. I wish that it did not have to end. Well at least not until the world ends. 😀
