Episode #47 Trailer: “Can AI Be Controlled?” For Humanity: An AI Risk Podcast

For Humanity: An AI Safety Podcast

In the Episode #47 trailer, host John Sherman talks with Buck Shlegeris, CEO of Redwood Research, a non-profit working on technical AI risk challenges. The discussion includes Buck’s thoughts on the new OpenAI o1-preview model, but centers on two questions: is there a way to control AI models before alignment is achieved, if it can be achieved at all? And how would the system that’s supposed to save the world actually work if an AI lab caught a model scheming? Check out the links to Buck’s writing on these topics below:

https://redwoodresearch.substack.com/p/the-case-for-ensuring-that-powerful

https://redwoodresearch.substack.com/p/would-catching-your-ais-trying-to

Senate Hearing:

https://www.judiciary.senate.gov/committee-activity/hearings/oversight-of-ai-insiders-perspectives

Harry Mack’s YouTube Channel

https://www.youtube.com/channel/UC59ZRYCHev_IqjUhremZ8Tg

LEARN HOW TO HELP RAISE AI RISK AWARENESS IN YOUR COMMUNITY HERE

https://pauseai.info/local-organizing

Please Donate Here To Help Promote For Humanity

https://www.paypal.com/paypalme/forhumanitypodcast

EMAIL JOHN: forhumanitypodcast@gmail.com

This podcast is not journalism. But it’s not opinion either. This is a long-form public service announcement. This show simply strings together the existing facts and underscores the unthinkable yet probable outcome: the end of all life on earth.

For Humanity: An AI Safety Podcast is the accessible AI safety podcast for all humans, no tech background required. Our show focuses solely on the threat of human extinction from AI.

Peabody Award-winning former journalist John Sherman explores the shocking worst-case scenario of artificial intelligence: human extinction. The makers of AI openly admit their work could kill all humans, possibly in as little as two years. This podcast is solely about the threat of human extinction from AGI. We’ll meet the heroes and villains, explore the issues and ideas, and learn what you can do to help save humanity.

RESOURCES:

JOIN THE FIGHT, help Pause AI!!!!

Pause AI

SUBSCRIBE TO LIRON SHAPIRA’S DOOM DEBATES ON YOUTUBE!!

https://www.youtube.com/@DoomDebates

Join the Pause AI Weekly Discord Thursdays at 2pm EST

https://discord.com/invite/pVMWjddaW7

Max Winga’s “A Stark Warning About Extinction”

https://youtu.be/kDcPW5WtD58?si=i6IRy82xZ2PUOp22

For Humanity Theme Music by Josef Ebner

Youtube: https://www.youtube.com/channel/UCveruX8E-Il5A9VMC-N4vlg

Website: https://josef.pictures

BUY STEPHEN HANSON’S BEAUTIFUL AI RISK BOOK!!!

https://stephenhansonart.bigcartel.com/product/the-entity-i-couldn-t-fathom

22 Word Statement from Center for AI Safety

Statement on AI Risk | CAIS

https://www.safe.ai/work/statement-on-ai-risk

Best Account on Twitter: AI Notkilleveryoneism Memes 

https://twitter.com/AISafetyMemes