For Humanity: An AI Safety Podcast
For Humanity, An AI Safety Podcast is the AI Safety Podcast for regular people. Peabody, duPont-Columbia and multi-Emmy Award-winning former journalist John Sherman explores the shocking worst-case scenario of artificial intelligence: human extinction. The makers of AI openly admit that their work could kill all humans, in as soon as 2-10 years. This podcast is solely about the threat of human extinction from AGI. We'll name and meet the heroes and villains, explore the issues and ideas, and discuss what you can do to help save humanity.
Trailer
John Sherman is Ideal For This Topic
26/04/2024
Wow! Already a few episodes in, I am quite impressed and looking forward to listening to more of this podcast. I'm fully aware that AI is fiercely important for obvious reasons, some intriguing, most horrifying; yet I am woefully uninformed on what is AI's true technological value and what is its hype. Additionally, I am familiar with John Sherman and have followed his illustrious, interesting and award-winning career. The intriguing concept of Artificial Intelligence, a fascinating panel of guests, and John's far-reaching political, social and cultural awareness, his many journalistic achievements, his integrity, his probing intellect, his natural flair for the spoken word and his creative expression all percolate into a perfectly balanced alchemy, resulting in an important podcast that delivers informed, diverse, and thought-provoking conversations about the fascinating and world-altering implications of AI.
Everyone needs to listen to this podcast
22/02/2024
I have been following the AI conversation for about one year. I am not a tech expert, so it's been difficult to understand everything I hear. This podcast does an incredibly effective job of taking a complex and divisive issue and making it understandable. John is a great host, combining wit and intellect to bring this issue to light. Thanks for making this happen!
Provocative
07/11/2023
A solid start to a provocative series. Almost like a meta-analysis of the best current thinking on this topic, and very accessible to lay people like me. Definitely has my concern up. I am looking forward to seeing where John takes this in future episodes…
Inappropriate Language
04/01/2024
Couldn’t even get past one of the trailers.
About
Information
- Created by: John Sherman
- Years active: 2023 - 2025
- Episodes: 108
- Rating: Explicit content
- Copyright: © John Sherman
- Show website