The New Stack Podcast

The New Stack

The New Stack Podcast is all about the developers, software engineers and operations people who build at-scale architectures that change the way we develop and deploy software. For more content from The New Stack, subscribe on YouTube at: https://www.youtube.com/c/TheNewStack

  1. 8 hours ago

    Jupyter Deploy: the New Middle Ground between Laptops and Enterprise

    At JupyterCon 2025, Jupyter Deploy was introduced as an open source command-line tool designed to make cloud-based Jupyter deployments quick and accessible for small teams, educators, and researchers who lack cloud engineering expertise. As described by AWS engineer Jonathan Guinegagne, these users often struggle in an “in-between” space—needing more computing power and collaboration features than a laptop offers, but without the resources for complex cloud setups. Jupyter Deploy simplifies this by orchestrating an entire encrypted stack—using Docker, Terraform, OAuth2, and Let’s Encrypt—with minimal setup, removing the need to manually manage 15–20 cloud components. While it offers an easy on-ramp, Guinegagne notes that long-term use still requires some cloud understanding. Built by AWS’s AI Open Source team but deliberately vendor-neutral, it uses a template-based approach, enabling community-contributed deployment recipes for any cloud. Led by Brian Granger, the project aims to join the official Jupyter ecosystem, with future plans including Kubernetes integration for enterprise scalability. Learn more from The New Stack about the latest in Jupyter AI development: “Introduction to Jupyter Notebooks for Developers” and “Display AI-Generated Images in a Jupyter Notebook.” Join our community of newsletter subscribers to stay on top of the news and at the top of your game. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

    22 min
  2. 1 day ago

    From Physics to the Future: Brian Granger on Project Jupyter in the Age of AI

    In an interview at JupyterCon, Brian Granger — co-creator of Project Jupyter and senior principal technologist at AWS — reflected on Jupyter’s evolution and how AI is redefining open source sustainability. Originally inspired by physics’ modular principles, Granger and co-founder Fernando Pérez designed Jupyter with flexible, extensible components like the notebook format and kernel message protocol. This architecture has endured as the ecosystem expanded from data science into AI and machine learning. Now, AI is accelerating development itself: Granger described rewriting Jupyter Server in Go, complete with tests, in just 30 minutes using an AI coding agent — a task once considered impossible. This shift challenges traditional notions of technical debt and could reshape how large open source projects evolve. Jupyter’s 2017 ACM Software System Award placed it among computing’s greats, but also underscored its global responsibility. Granger emphasized that sustaining Jupyter’s mission — empowering human reasoning, collaboration, and innovation — remains the team’s top priority in the AI era. Learn more from The New Stack about the latest in Jupyter AI development: “Introduction to Jupyter Notebooks for Developers” and “Display AI-Generated Images in a Jupyter Notebook.” Join our community of newsletter subscribers to stay on top of the news and at the top of your game. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

    23 min
  3. 2 days ago

    Jupyter AI v3: Could It Generate an ‘Ecosystem of AI Personas’?

    Jupyter AI v3 marks a major step forward in integrating intelligent coding assistance directly into JupyterLab. Discussed by AWS engineers David Qiu and Piyush Jain at JupyterCon, the new release introduces AI personas — customizable, specialized assistants that users can configure to perform tasks such as coding help, debugging, or analysis. Unlike other AI tools, Jupyter AI allows multiple named agents, such as “Claude Code” or “OpenAI Codex,” to coexist in one chat. Developers can even build and share their own personas as local or pip-installable packages. This flexibility was enabled by splitting Jupyter AI’s previously large, complex codebase into smaller, modular packages, allowing users to install or replace components as needed. Looking ahead, Qiu envisions Jupyter AI as an “ecosystem of AI personas,” enabling multi-agent collaboration where different personas handle roles like data science, engineering, and testing. With contributors from AWS, Apple, Quansight, and others, the project is poised to expand into a diverse, community-driven AI ecosystem. Learn more from The New Stack about the latest in Jupyter AI development: “Introduction to Jupyter Notebooks for Developers” and “Display AI-Generated Images in a Jupyter Notebook.” Join our community of newsletter subscribers to stay on top of the news and at the top of your game. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
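
    The episode stays at the conceptual level, but a minimal Python sketch can make the persona idea concrete. Note that the class name BasePersona, the process_message hook, and the overall shape of the interface below are illustrative assumptions, not the actual Jupyter AI v3 API; a real persona package would follow whatever registration mechanism Jupyter AI defines.

```python
# Illustrative sketch of a specialized "persona" -- the names and hooks here
# are hypothetical and do NOT reflect the real Jupyter AI v3 interfaces.

class BasePersona:
    """Stand-in for whatever base class Jupyter AI actually provides."""
    name: str = "Generic Persona"

    def process_message(self, message: str) -> str:
        raise NotImplementedError


class DebuggingPersona(BasePersona):
    """A persona configured for debugging help, shareable as a pip package."""
    name = "Debugger"

    def process_message(self, message: str) -> str:
        # A real persona would forward the message to an LLM with a
        # debugging-oriented system prompt; this sketch only echoes it.
        return f"[{self.name}] Investigating: {message}"


if __name__ == "__main__":
    persona = DebuggingPersona()
    print(persona.process_message("Why does my notebook kernel keep dying?"))
```

    Packaging a class like this as a small pip-installable module is what would let teams share specialized personas, which is the distribution model the episode describes.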

    23 min
  4. Oct. 31

    Stop Writing Code, Start Writing Docs

    In this episode of The New Stack Podcast, hosts Alex Williams and Frederic Lardinois spoke with Keith Ballinger, Vice President and General Manager of Google Cloud Platform (GCP) Developer Experience, about the evolution of agentic coding tools and the future of programming. Ballinger, a hands-on executive who still codes, discussed Gemini CLI, Google’s response to tools like Claude Code, and his broader philosophy on how developers should work with AI. He emphasized that these tools are in their “first inning” and that developers must “slow down to speed up” by writing clear guides, focusing on architecture, and documenting intent — treating AI as a collaborative coworker rather than a one-shot solution. Ballinger reflected on his early AI experiences, from Copilot at GitHub to modern agentic systems that automate tool use. He also explored the resurgence of the command line as an AI interface and predicted that programming will increasingly shift from writing code to expressing intent. Ultimately, he envisions a future where great programmers are great writers, focusing on clarity, problem decomposition, and design rather than syntax. Learn more from The New Stack about the latest in Google AI development: “Why PyTorch Gets All the Love,” “Lightning AI Brings a PyTorch Copilot to Its Development Environment,” and “Ray Comes to the PyTorch Foundation.” Join our community of newsletter subscribers to stay on top of the news and at the top of your game. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

    1 h 3 min
  5. Oct. 10

    Harness CEO Jyoti Bansal on Why AI Coding Doesn't Help You Ship Faster

    Harness co-founder Jyoti Bansal highlights a growing issue in software development: while AI tools help generate more code, they often create bottlenecks further along the pipeline, especially in testing, deployment, and compliance. Since its 2017 launch, Harness has aimed to streamline these stages using AI and machine learning. With the rise of large language models (LLMs), the company shifted toward agentic AI, introducing a library of specialized agents — like DevOps, SRE, AppSec, and FinOps agents — that operate behind a unified interface called Harness AI. These agents assist in building production pipelines rather than deploying code directly, so human oversight stays in place for compliance and security. Bansal emphasizes that AI in development isn’t replacing people but accelerating workflows to meet tighter timelines. He also notes strong enterprise adoption, with even large, traditionally slower-moving organizations embracing AI integration. On the topic of an AI bubble, Bansal sees it as a natural part of innovation, akin to the dot-com era, where market excitement can still lead to meaningful long-term transformation despite short-term volatility. Learn more from The New Stack about the latest in Harness’s AI approach to software development: “Harness AI Tackles Software Development’s Real Bottleneck” and “Harnessing AI To Elevate Automated Software Testing.” Join our community of newsletter subscribers to stay on top of the news and at the top of your game. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

    39 min
  6. Oct. 3

    How Agentgateway Solves Agentic AI’s Connectivity Challenges

    The agentic AI space faces challenges around secure, governed connectivity between agents, tools, large language models, and microservices. To address this, Solo.io developed two open source projects: Kagent and Agentgateway. While Kagent, donated to the Cloud Native Computing Foundation, helps scale AI agents, it lacks a secure way to mediate communication between agents and tools. Enter Agentgateway, donated to the Linux Foundation, which provides governance, observability, and security for agent-to-agent and agent-to-tool traffic. Written in Rust, it supports protocols like MCP and A2A and integrates with the Kubernetes Gateway API and inference gateways. Lin Sun, Solo.io’s head of open source, explained that Agentgateway allows developers to control which tools agents can access, offering the flexibility to expose only tested or approved tools. This enables fine-grained policy enforcement and resilience in agent communication, similar to how service meshes manage microservice traffic, and supports scalable, secure agent ecosystems through selective tool exposure. Major players like AWS and Microsoft are also engaging in its development. Learn more from The New Stack about the latest in open source projects like Agentgateway: “Why Tech Giants Are Backing the New Agentgateway Project,” “AI Agents Are Creating a New Security Nightmare for Enterprises and Startups,” and “Five Steps to Build AI Agents that Actually Deliver Business Results.” Join our community of newsletter subscribers to stay on top of the news and at the top of your game. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
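
    The discussion stays at the architecture level, so the following Python sketch is only a conceptual illustration of gateway-style tool allow-listing. It is not Agentgateway’s actual API or configuration format (Agentgateway itself is written in Rust and configured through its own resources); the ToolGateway class and its methods are assumptions made purely for illustration.

```python
# Conceptual illustration of gateway-style tool allow-listing.
# This is NOT the Agentgateway API or configuration format; it only
# sketches the idea of exposing a curated subset of tools to an agent.

from typing import Callable, Dict, Set


class ToolGateway:
    """Mediates every tool call and enforces an allow-list policy."""

    def __init__(self, allowed: Set[str]) -> None:
        self.allowed = allowed                      # names of approved, tested tools
        self.tools: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, fn: Callable[[str], str]) -> None:
        # Registration makes a tool known to the gateway, but it stays
        # unreachable unless it is also on the allow-list.
        self.tools[name] = fn

    def call(self, name: str, arg: str) -> str:
        # Single policy-enforcement point for all agent-to-tool traffic.
        if name not in self.allowed:
            raise PermissionError(f"tool '{name}' is not exposed to this agent")
        return self.tools[name](arg)


if __name__ == "__main__":
    gateway = ToolGateway(allowed={"search"})
    gateway.register("search", lambda q: f"results for {q!r}")
    gateway.register("drop_table", lambda q: "table dropped")  # registered, never exposed

    print(gateway.call("search", "agent connectivity"))        # allowed
    # gateway.call("drop_table", "users")                       # would raise PermissionError
```

    The point of the sketch is the single enforcement choke point: agents can only reach tools an operator has explicitly approved, which mirrors the fine-grained exposure model the episode describes.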

    21 min
  7. Sept. 26

    Sentry Founder: AI Patch Generation Is 'Awful' Right Now

    David Cramer, founder and chief product officer of Sentry, remains skeptical about generative AI’s current ability to replace human engineers, particularly in software production. While he acknowledges AI tools aren’t yet reliable enough for full autonomy — especially in tasks like patch generation — he sees value in using large language models (LLMs) to enhance productivity. Sentry’s AI-powered tool, Seer, uses GenAI to help developers debug more efficiently by identifying root causes and summarizing complex system data, mimicking some functions of senior engineers. However, Cramer emphasizes that human oversight remains essential, describing the current stage as “human in the loop” AI, useful for speeding up code reviews and catching overlooked bugs. Cramer also addressed Sentry’s shift from open source to fair source licensing, driven by frustration over third parties commercializing its software without contributing back. Sentry now uses the Functional Source License, which converts to Apache 2.0 after two years. This move aims to strike a balance between openness and preventing exploitation, while maintaining accessibility for users and avoiding fragmented product versions. Learn more from The New Stack about the latest from Sentry and David Cramer’s thoughts on AI development: “Install Sentry to Monitor Live Applications” and “Frontend Development Challenges for 2021.” Join our community of newsletter subscribers to stay on top of the news and at the top of your game. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

    45 min