12 episodes

We humans could have a bright future ahead of us that lasts billions of years. But we have to survive the next 200 years first. Join Josh Clark of Stuff You Should Know for a 10-episode deep dive that explores the future of humanity and finds dangers we have never encountered before lurking just ahead. And if we humans are alone in the universe and we don't survive, intelligent life dies out with us too.

The End Of The World with Josh Clark
iHeartPodcasts

    • Science

    Fermi Paradox

    Ever wondered where all the aliens are? It’s actually very weird that, as big and old as the universe is, we seem to be the only intelligent life. In this episode, Josh examines the Fermi paradox, and what it says about humanity’s place in the universe.

    • 36 min
    Great Filter

    The Great Filter hypothesis says we’re alone in the universe because the process of evolution contains some filter that prevents life from spreading into the universe. Have we passed it or is it in our future? Humanity’s survival may depend on the answer.

    • 43 min
    X Risks

    Humanity could have a future billions of years long – or we might not make it past the next century. If we have a trip through the Great Filter ahead of us, then we appear to be entering it now. It looks like existential risks will be our filter.

    • 39 min
    Natural Risks

Humans have faced existential risks since our species was born. Because we are Earthbound, what happens to Earth happens to us. Josh points out that there's a lot that can happen to Earth, like gamma-ray bursts, supernovae, and a runaway greenhouse effect.

    • 37 min
    Artificial Intelligence

    An artificial intelligence capable of improving itself runs the risk of growing intelligent beyond any human capacity and outside of our control. Josh explains why a superintelligent AI that we haven’t planned for would be extremely bad for humankind.

    • 41 min
    Biotechnology

Natural viruses and bacteria can be deadly enough; the 1918 Spanish Flu killed 50 million people in four months. But risky new research, carried out in an unknown number of labs around the world, is creating even more dangerous human-made pathogens.

    • 57 min

Top Podcasts In Science

Making Sense with Sam Harris
Sam Harris
Radiolab
WNYC Studios
Freakonomics, M.D.
Freakonomics Radio + Stitcher
The Skeptics' Guide to the Universe
Dr. Steven Novella
Why This Universe?
Dan Hooper, Shalma Wegsman
Speaking of Psychology
American Psychological Association

You Might Also Like

Stuff To Blow Your Mind
iHeartPodcasts
Stuff You Should Know
iHeartPodcasts
Stuff They Don't Want You To Know
iHeartPodcasts
Ridiculous History
iHeartPodcasts
Daniel and Jorge Explain the Universe
iHeartPodcasts
BrainStuff
iHeartPodcasts

More by iHeartRadio

Stuff You Missed in History Class
iHeartPodcasts
Stuff You Should Know
iHeartPodcasts
Astray
iHeartPodcasts
Therapy for Black Girls
iHeartPodcasts and Joy Harden Bradford, Ph.D.
Drink Champs
Interval Presents
Dead Ass with Khadeen and Devale Ellis
iHeartPodcasts