The New Stack Podcast

The New Stack

The New Stack Podcast is all about the developers, software engineers and operations people who build at-scale architectures that change the way we develop and deploy software. For more content from The New Stack, subscribe on YouTube at: https://www.youtube.com/c/TheNewStack

  1. 12H AGO

    Microsoft wants to make service mesh invisible

    At KubeCon EU 2026, Mitch Connors of Microsoft outlined a vision to make service meshes effectively invisible to users. Now working on Azure Kubernetes Application Network, a fully managed service built on Istio’s ambient mode, Connors aims to deliver core capabilities like mTLS without requiring users to engage with the complexity traditionally associated with service meshes. Ambient mode eliminates sidecar upgrade challenges by shifting functionality to node-level and waypoint proxies, though adoption still faces hurdles, including lagging CVE patching. Connors emphasized that AI workloads are reshaping network demands, as request variability in large language models requires smarter routing and resource management. Istio is addressing this through a two-speed model: stable APIs for reliability and experimental integrations like Agent Gateway for emerging AI protocols. Features such as inference-aware routing and policy enforcement for approved LLM endpoints highlight the mesh’s growing role in AI governance. With multi-cluster support and GPU scarcity driving workload mobility, Microsoft’s approach bets that simplifying and abstracting the mesh will broaden adoption while meeting the evolving needs of AI-driven systems.

    Learn more from The New Stack about service meshes:

      - The Hidden Costs of Service Meshes
      - All the Things a Service Mesh Can Do

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.
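    The ambient model Connors describes can already be seen in upstream Istio. As a rough sketch (assuming a cluster with an ambient-profile Istio install; the fully managed Azure service would presumably hide even these steps), enrolling a namespace in the mesh is a label rather than a sidecar injection:

    ```shell
    # Enroll every pod in the namespace into the ambient mesh; traffic is
    # redirected through the ztunnel node-level proxy, which provides mTLS
    # with no sidecars and no pod restarts.
    kubectl label namespace default istio.io/dataplane-mode=ambient

    # Layer-7 features (routing, policy) are opted into separately by
    # deploying a waypoint proxy for the namespace.
    istioctl waypoint apply -n default
    ```

    Because upgrades then touch only the node proxies and waypoints rather than every application pod, this is what removes the sidecar upgrade problem the episode discusses.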

    21 min
  2. 1D AGO

    Amazon EKS Auto Mode wants to end Kubernetes toil — one node at a time

    At KubeCon + CloudNativeCon Europe 2026 in Amsterdam, Alex Kestner, principal product manager for Amazon Elastic Kubernetes Service (EKS), discussed how Amazon EKS Auto Mode aims to reduce the operational burden of running Kubernetes at scale. While Kubernetes delivers significant power, it also introduces complexity—particularly through repetitive, day-to-day tasks like managing node lifecycles, ensuring security updates, and selecting optimal infrastructure. Kestner emphasized that much of this “undifferentiated heavy lifting” distracts platform teams from delivering business value. Amazon EKS Auto Mode addresses this by automating infrastructure operations across the full node lifecycle, shifting responsibility for key operational components outside the cluster and into AWS-managed services. Built in collaboration with the EC2 team and leveraging technologies like Karpenter, Auto Mode dynamically provisions right-sized compute resources based on workload requirements. While it doesn’t eliminate all challenges—such as unpredictable workloads or diverse deployment needs—it provides a more application-focused approach to scaling and cost optimization. Ultimately, Auto Mode represents a meaningful step toward simplifying Kubernetes operations in increasingly complex cloud-native environments.

    Learn more from The New Stack about the latest developments around Amazon Elastic Kubernetes Service (EKS):

      - 2026 Will Be the Year of Agentic Workloads in Production on Amazon EKS
      - How Amazon EKS Auto Mode Simplifies Kubernetes Cluster Management (Part 1)
      - A Deep Dive Into Amazon EKS Auto (Part 2)

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.

    23 min
  3. APR 1

    Edge-forward: Akamai eyes sweet spot between centralized & decentralized AI inference

    At KubeCon + CloudNativeCon Europe 2026, Lena Hall and Thorsten Hans of Akamai outlined how the company is evolving from a CDN provider into a developer-focused cloud platform for AI. Akamai’s strategy centers on low-latency, distributed computing, combining managed Kubernetes, serverless functions, and a distributed AI inference platform to support modern workloads. With a global footprint of core and “distributed reach” data centers, Akamai aims to bring compute closer to users while still leveraging centralized infrastructure for heavier processing. This hybrid model enables faster feedback loops critical for applications like fraud detection, robotics, and conversational AI. To address concerns about complexity, Akamai emphasizes managed infrastructure and self-service tools that abstract away integration challenges. Its platform supports open source through managed Kubernetes and pre-packaged tools, simplifying deployment. Akamai also invests in serverless technologies like WebAssembly-based functions, enabling developers to build and deploy globally distributed applications quickly. Overall, the company prioritizes developer experience, allowing teams to focus on application logic rather than infrastructure management.

    Learn more from The New Stack about how Akamai is transforming into a developer-focused cloud platform for AI:

      - Akamai Picks Up Hosting for Kernel.org
      - Should You Care About Fermyon Wasm Functions on Akamai?

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.

    22 min
  4. MAR 24

    Kubernetes co-founder Brendan Burns: AI-generated code will become as invisible as assembly

    In this episode of The New Stack Makers, Microsoft Corporate Vice President and Technical Fellow Brendan Burns discusses how AI is reshaping Kubernetes and modern infrastructure. Originally designed for stateless applications, Kubernetes is evolving to support AI workloads that require complex GPU scheduling, co-location, and failure sensitivity. Features like Dynamic Resource Allocation and projects such as KAITO introduce AI-specific capabilities, while maintaining Kubernetes’ core strength: vendor-neutral extensibility. Burns highlights that AI also changes how systems are monitored. Success is no longer binary; it depends on answer quality, user feedback, and large-scale testing using thousands of prompts and even AI evaluators. On software development, Burns argues that the industry’s focus on reviewing AI-generated code is temporary. Just as developers stopped inspecting compiler output, AI-generated code will become a disposable artifact validated by tests and specifications. This shift will redefine engineering roles and may lead to programming languages designed for machines rather than humans, signaling a fundamental transformation in how software is built and maintained.

    Learn more from The New Stack about how AI is reshaping Kubernetes and modern infrastructure:

      - How To Use AI To Design Intelligent, Adaptable Infrastructure
      - The AI Infrastructure crisis: When ambition meets ancient systems

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.

    44 min
  5. MAR 20

    AI can write your infrastructure code. There's a reason most teams won't let it.

    In this episode of The New Stack Agents, Marcin Wyszynski, co-founder of Spacelift and OpenTofu, explains how AI is transforming infrastructure as code (IaC). Originally built for individual operators, tools like Terraform struggled to scale across teams, prompting Wyszynski to help launch OpenTofu after HashiCorp’s 2023 license change. Now, the bigger shift is AI: engineers no longer write configuration languages like HCL manually, as AI tools generate it, dramatically lowering the barrier to entry. However, this creates a dangerous gap between generating infrastructure and truly understanding it—like using a phrasebook to ask questions in a foreign language but not understanding the response. In infrastructure, that lack of comprehension can lead to serious risks. To address this, Spacelift introduced Intent, which allows AI to directly interact with cloud systems in real time while enforcing deterministic guardrails through policy controls. The broader challenge remains balancing speed with control—enabling faster experimentation without sacrificing safety. Wyszynski argues that, like humans, AI can be trusted when constrained by strong guardrails.

    Learn more from The New Stack about how AI is transforming infrastructure as code (IaC):

      - The Maturing State of Infrastructure as Code in 2025
      - Generative AI Tools for Infrastructure as Code

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.

    29 min
  6. MAR 6

    OutSystems CEO on how enterprises can successfully adopt vibe coding

    Woodson Martin, CEO of OutSystems, argues that successful enterprise AI deployments rarely rely on standalone agents. Instead, production systems combine AI agents with data, workflows, APIs, applications, and human oversight. While claims that “95% of agent pilots fail” are common, Martin suggests many of those pilots were simply low-commitment experiments made possible by the low cost of testing AI. Enterprises that succeed typically keep humans in the loop, at least initially, to review recommendations and maintain control over decisions. Current enterprise use cases for agents include document processing, decision support, and personalized outputs. When integrated into broader systems, these applications can deliver measurable productivity gains. For example, Travel Essence built an agentic system that reduced a two-hour customer planning process to three minutes, allowing staff to focus more on sales and helping drive 20% top-line growth. Martin also believes AI will pressure traditional SaaS seat-based pricing and accelerate custom software development. In this environment, governed platforms like OutSystems can help enterprises adopt “vibe coding” while maintaining compliance, security, and lifecycle management.

    Learn more from The New Stack about enterprise adoption of vibe coding:

      - How To Use Vibe Coding Safely in the Enterprise
      - 5 Challenges With Vibe Coding for Enterprises
      - Vibe Coding: The Shadow IT Problem No One Saw Coming

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.

    44 min
  7. FEB 20

    NanoClaw's answer to OpenClaw is minimal code, maximum isolation

    On The New Stack Agents, Gavriel Cohen discusses why he built NanoClaw, a minimalist alternative to OpenClaw, after discovering security and architectural flaws in the rapidly growing agentic framework. Cohen, co-founder of AI marketing agency Qwibit, had been running agents across operations, sales, and research using Claude Code. When Clawdbot (later OpenClaw) launched, it initially seemed ideal. But Cohen grew concerned after noticing questionable dependencies—including his own outdated GitHub package—excessive WhatsApp data storage, a massive AI-generated codebase nearing 400,000 lines, and a lack of OS-level isolation between agents. In response, he created NanoClaw with radical minimalism: only a few hundred core lines, minimal dependencies, and containerized agents. Built around Claude Code “skills,” NanoClaw enables modular, build-time integrations while keeping the runtime small enough to audit easily. Cohen argues AI changes coding norms—favoring duplication over DRY, relaxing strict file limits, and treating code as disposable. His goal is simple, secure infrastructure that enterprises can fully understand and trust.

    Learn more from The New Stack about the latest around personal AI agents:

      - Anthropic: You can still use your Claude accounts to run OpenClaw, NanoClaw and Co.
      - It took a researcher fewer than 2 hours to hijack OpenClaw
      - OpenClaw is being called a security “Dumpster fire,” but there is a way to stay safe

    Join our community of newsletter subscribers to stay on top of the news and at the top of your game.

    52 min
4.3 out of 5 (31 Ratings)

