475 episodes

Babbage is our weekly podcast on science and technology, named after Charles Babbage—a 19th-century polymath and grandfather of computing. Host Alok Jha talks to our correspondents about the innovations, discoveries and gadgetry shaping the world. Published every Wednesday.
If you’re already a subscriber to The Economist, you’ll have full access to all our shows as part of your subscription.
For more information about Economist Podcasts+, including how to get access, please visit our FAQs page at https://myaccount.economist.com/s/article/What-is-Economist-Podcasts

Hosted on Acast. See acast.com/privacy for more information.

Babbage from The Economist
The Economist

    • Tech News

    Babbage picks: SpaceX’s Starship reaches orbit

    An article from The Economist read aloud. Our science and technology section reports on the recent test flight of Elon Musk’s Starship. While the rocket failed to return to Earth, it’s a step nearer to the stars.


    For more on Starship, check out our Babbage podcast from 2022 at economist.com/starship-pod.

    Babbage: The science that built the AI revolution—part four

    What made AI models generative? In 2022, it seemed as though the much-anticipated AI revolution had finally arrived. Large language models swept the globe, and deepfakes were becoming ever more pervasive. Underneath it all were old algorithms that had been taught some new tricks. Suddenly, artificial intelligence seemed to have the skill of creativity. Generative AI had arrived and promised to transform…everything.

    This is the final episode in a four-part series on the evolution of modern generative AI. What were the scientific and technological developments that took the very first, clunky artificial neurons and ended up with the astonishingly powerful large language models that power apps such as ChatGPT?

    Host: Alok Jha, The Economist’s science and technology editor. Contributors: Lindsay Bartholomew of the MIT Museum; Yoshua Bengio of the University of Montréal; Fei-Fei Li of Stanford University; Robert Ajemian and Greta Tuckute of MIT; Kyle Mahowald of the University of Texas at Austin; Daniel Glaser of London’s Institute of Philosophy; Abby Bertics, The Economist’s science correspondent.

    On Thursday April 4th, we’re hosting a live event where we’ll answer as many of your questions on AI as possible, following this Babbage series. If you’re a subscriber, you can submit your question and find out more at economist.com/aievent.

    Listen to what matters most, from global politics and business to science and technology—subscribe to Economist Podcasts+

    For more information about how to access Economist Podcasts+, please visit our FAQs page or watch our video explaining how to link your account.

    Babbage: The science that built the AI revolution—part three

    What made AI take off? A decade ago many computer scientists were focused on building algorithms that would allow machines to see and recognise objects. In doing so they hit upon two innovations—big datasets and specialised computer chips—that quickly transformed the potential of artificial intelligence. How did the growth of the world wide web and the design of 3D arcade games create a turning point for AI?

    This is the third episode in a four-part series on the evolution of modern generative AI. What were the scientific and technological developments that took the very first, clunky artificial neurons and ended up with the astonishingly powerful large language models that power apps such as ChatGPT?

    Host: Alok Jha, The Economist’s science and technology editor. Contributors: Fei-Fei Li of Stanford University; Robert Ajemian and Karthik Srinivasan of MIT; Kelly Clancy, author of “Playing with Reality”; Pietro Perona of the California Institute of Technology; Tom Standage, The Economist’s deputy editor.

    On Thursday April 4th, we’re hosting a live event where we’ll answer as many of your questions on AI as possible, following this Babbage series. If you’re a subscriber, you can submit your question and find out more at economist.com/aievent.

    Listen to what matters most, from global politics and business to science and technology—subscribe to Economist Podcasts+

    For more information about how to access Economist Podcasts+, please visit our FAQs page or watch our video explaining how to link your account.

    Babbage picks: How smart are “smart-drugs”?

    An article from The Economist read aloud. Our business section reports that brain-boosting substances are all the rage but their utility is debatable.

    Babbage: The science that built the AI revolution—part two

    How do machines learn? Learning is fundamental to artificial intelligence. It’s how computers can recognise speech or identify objects in images. But how can networks of artificial neurons be deployed to find patterns in data, and what is the mathematics that makes it all possible?

    This is the second episode in a four-part series on the evolution of modern generative AI. What were the scientific and technological developments that took the very first, clunky artificial neurons and ended up with the astonishingly powerful large language models that power apps such as ChatGPT?

    Host: Alok Jha, The Economist’s science and technology editor. Contributors: Pulkit Agrawal and Gabe Margolis of MIT; Daniel Glaser, a neuroscientist at London’s Institute of Philosophy; Melanie Mitchell of the Santa Fe Institute; Anil Ananthaswamy, author of “Why Machines Learn”.

    On Thursday April 4th, we’re hosting a live event where we’ll answer as many of your questions on AI as possible, following this Babbage series. If you’re a subscriber, you can submit your question and find out more at economist.com/aievent.

    Get a world of insights for 50% off—subscribe to Economist Podcasts+

    If you’re already a subscriber to The Economist, you’ll have full access to all our shows as part of your subscription. For more information about how to access Economist Podcasts+, please visit our FAQs page or watch our video explaining how to link your account.

    Babbage: The science that built the AI revolution—part one

    What is intelligence? In the middle of the 20th century, the inner workings of the human brain inspired computer scientists to build the first “thinking machines”. But how does human intelligence actually relate to the artificial kind?

    This is the first episode in a four-part series on the evolution of modern generative AI. What were the scientific and technological developments that took the very first, clunky artificial neurons and ended up with the astonishingly powerful large language models that power apps such as ChatGPT?

    Host: Alok Jha, The Economist’s science and technology editor. Contributors: Ainslie Johnstone, The Economist’s data journalist and science correspondent; Dawood Dassu and Steve Garratt of UK Biobank; Daniel Glaser, a neuroscientist at London’s Institute of Philosophy; Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory; Yoshua Bengio of the University of Montréal, who is known as one of the “godfathers” of modern AI.

    On Thursday April 4th, we’re hosting a live event where we’ll answer as many of your questions on AI as possible, following this Babbage series. If you’re a subscriber, you can submit your question and find out more at economist.com/aievent.

    Get a world of insights for 50% off—subscribe to Economist Podcasts+

    If you’re already a subscriber to The Economist, you’ll have full access to all our shows as part of your subscription. For more information about how to access Economist Podcasts+, please visit our FAQs page or watch our video explaining how to link your account.

    • 42 min
