5 episodes

What should be the trajectory of intelligence beyond humanity? The Trajectory covers realpolitik on artificial general intelligence and the posthuman transition - by asking tech, policy, and AI research leaders the hard questions about what's after man, and how we should define and create a worthy successor (danfaggella.com/worthy). Hosted by Daniel Faggella.

The Trajectory Daniel Faggella

    • Technology

    Dileep George - Keep Strong AI as a Tool, Not a Successor (AGI Destinations Series, Episode 4)

    This is an interview with Dileep George, AI Researcher at Google DeepMind, previously CTO and Co-founder of Vicarious AI.

    This is the fourth episode in a 5-part series about "AGI Destinations" - where we unpack the preferable and non-preferable futures humanity might strive towards in the years ahead.

    Watch Dileep's episode on The Trajectory YouTube channel: https://youtu.be/nmsuHz43X24 

    See more of Dileep's ideas - and his humorous AGI comics - at: https://dileeplearning.github.io/

    Some of the resources referenced in this episode:

    -- The Intelligence Trajectory Political Matrix: http://www.danfaggella.com/itpm

    ...

There are three main questions we'll be covering on The Trajectory:

    1. Who are the power players in AGI and what are their incentives?

    2. What kind of posthuman future are we moving towards, or should we be moving towards?

    3. What should we do about it?

    If this sounds like it's up your alley, I'm glad to have you here.

    Connect:
    -- Web -- danfaggella.com/trajectory
    -- Twitter -- twitter.com/danfaggella
    -- LinkedIn -- linkedin.com/in/danfaggella
    -- Newsletter -- bit.ly/TrajectoryTw
    -- YouTube -- https://youtube.com/@trajectoryai

    • 1 hr 11 min
    Ben Goertzel - Regulating AGI May Do More Harm Than Good (AGI Destinations Series, Episode 3)

This is an interview with Ben Goertzel, CEO of SingularityNET and an AGI researcher of many decades.

    This is the third episode in a 5-part series about "AGI Destinations" - where we unpack the preferable and non-preferable futures humanity might strive towards in the years ahead.

    Watch Ben's episode on The Trajectory YouTube channel: https://youtu.be/faU0EdQHDpY

    Read more from Ben on X: https://twitter.com/bengoertzel

    I often recommend Ben's "Cosmist Manifesto" as a relatively frank and honest take on posthuman / AGI futures: https://www.amazon.com/Cosmist-Manifesto-Practical-Philosophy-Posthuman/dp/0984609709

    Some of the resources referenced in this episode:

    -- The Intelligence Trajectory Political Matrix: http://www.danfaggella.com/itpm
    -- The SDGs of Strong AGI: https://emerj.com/ai-power/sdgs-of-ai/

    ...

There are three main questions we'll be covering on The Trajectory:

    1. Who are the power players in AGI and what are their incentives?

    2. What kind of posthuman future are we moving towards, or should we be moving towards?

    3. What should we do about it?

    If this sounds like it's up your alley, I'm glad to have you here.

    Connect:
    danfaggella.com/trajectory
    twitter.com/danfaggella
    linkedin.com/in/danfaggella

    Newsletter:
    bit.ly/TrajectoryTw

    • 1 hr 5 min
    Jaan Tallinn - The Case for a Pause Before We Birth AGI (AGI Destinations Series, Episode 2)

This is an interview with Jaan Tallinn, billionaire tech mogul (Kazaa, Skype), brilliant thinker, and long-time funder of AI safety causes.

    This is the second episode in a 5-part series about "AGI Destinations" - where we unpack the preferable and non-preferable futures humanity might strive towards in the years ahead.

    Watch Jaan's full episode on YouTube: https://www.youtube.com/watch?v=gnIKpsqrtsE

    This episode referred to the following other essays and resources:

    -- The Intelligence Trajectory Political Matrix: danfaggella.com/itpm
    -- The SDGs of Strong AGI: https://emerj.com/ai-power/sdgs-of-ai/

    ...

There are three main questions we'll be covering on The Trajectory:

    1. Who are the power players in AGI and what are their incentives?

    2. What kind of posthuman future are we moving towards, or should we be moving towards?

    3. What should we do about it?

    If this sounds like it's up your alley, I'm glad to have you here.

    Connect:
    danfaggella.com/trajectory
    twitter.com/danfaggella
    linkedin.com/in/danfaggella
    youtube.com/@trajectoryai

    Newsletter:
    bit.ly/TrajectoryTw

    • 20 min
    Yoshua Bengio - Why We Shouldn't Blast Off to AGI Just Yet (AGI Destinations Series, Episode 1)

    This is an interview with Dr. Yoshua Bengio, Turing Award Winner and Scientific Director of the Montreal Institute for Learning Algorithms.

    This is the first episode in a 5-part series about "AGI Destinations" - where we unpack the preferable and non-preferable futures humanity might strive towards in the years ahead.

    Watch Yoshua's full episode on The Trajectory YouTube channel: https://www.youtube.com/watch?v=P6Z5lgtH7_I

    Read more of Yoshua's work at: https://yoshuabengio.org/

    Some of the resources referenced in this episode:

    -- The Intelligence Trajectory Political Matrix: danfaggella.com/itpm
    -- The SDGs of Strong AGI: https://emerj.com/ai-power/sdgs-of-ai/

    ...

There are three main questions we'll be covering on The Trajectory:

    1. Who are the power players in AGI and what are their incentives?

    2. What kind of posthuman future are we moving towards, or should we be moving towards?

    3. What should we do about it?

    If this sounds like it's up your alley, I'm glad to have you here.

    Connect on social:
    danfaggella.com/trajectory
    twitter.com/danfaggella
    linkedin.com/in/danfaggella
    youtube.com/@trajectoryai

    Newsletter:
    bit.ly/TrajectoryTw

    • 1 hr 3 min
    Welcome to The Trajectory - Realpolitik on AGI and the Posthuman Future

There are three main questions we'll be covering on The Trajectory:

    1. Who are the power players in AGI and what are their incentives? 

    2. What kind of posthuman future are we moving towards, or should we be moving towards? 

    3. What should we do about it? 

    If this sounds like it's up your alley, I'm glad to have you here. 

    Connect: 
    danfaggella.com/trajectory
    twitter.com/danfaggella
    linkedin.com/in/danfaggella

    • 5 min
