Cloud Native Testing Podcast

Testkube

The Cloud Native Testing Podcast, sponsored by Testkube, brings you insights from engineers navigating testing in cloud-native environments. Hosted by Ole Lensmar, it explores test automation, CI/CD, Kubernetes, shifting left, scaling right, and reliability at scale through conversations with testing and cloud native experts. Learn more about Testkube at http://testkube.io

  1. Bridging the Gap: Production Readiness and AI with Dev Plaza’s Sahana Nagabhushan

    MAR 20

    In this episode of the Cloud Native Testing Podcast, Ole Lensmar sits down with Sahana Nagabhushan to tackle the complexities of software lifecycle inefficiencies and developer productivity. Drawing from her extensive product management experience at Fidelity Investments and Workday, Sahana breaks down why companies burn millions of dollars and years of engineering time building custom Internal Developer Portals (IDPs) like Backstage. Instead of relying solely on generic frameworks, she advocates for implementing a "production readiness layer"—a critical bridge between writing code and shipping it confidently to live environments. Ole and Sahana also dive deep into the realities of AI in modern software development. They discuss why AI-generated code still desperately requires human oversight, the current limitations of AI in generating deterministic tests, and the strategic value of integrating existing top-tier tools (like Testkube) to maintain high-quality standards without reinventing the wheel.

    Key Takeaways:
    - The Cost of DIY Platforms: Why building internal developer portals from scratch can cost companies up to $16 million with delayed ROI.
    - Defining Production Readiness: How companies can create a "trust layer" to ensure code is fully tested, governed, and truly ready for deployment.
    - Backstage vs. Purpose-Built Tools: A candid discussion on why generic plugin frameworks might not solve core quality and testing issues.
    - AI's Role in Testing: Why human context, sentiment, and oversight remain crucial when navigating AI-generated code and test creation.

    23 min
  2. API Mocking, Contract Testing, and the AI Shift with Yacine from Microcks

    FEB 6

    Welcome to the first edition of the Cloud Native Testing Podcast for 2026! In this episode, host Ole Lensmar is joined by Yacine Kheddache to dive deep into Microcks, a CNCF Sandbox project dedicated to API mocking and simulation. As cloud-native architectures grow more complex, the need to decouple services during development is critical. Yacine explains how Microcks serves as a "Swiss Army Knife" for developers, offering a single solution to mock and test REST, gRPC, GraphQL, and Event-Driven protocols (like Kafka and NATS). They discuss the tool's evolution from a centralized Kubernetes operator to a developer-friendly utility that runs natively in IDEs and pipelines, enabling true "shift left" testing. Later in the conversation, they explore the intersection of API testing and Artificial Intelligence. Yacine details how Microcks is embracing the AI era by using Copilots to generate mock data and leveraging the Model Context Protocol (MCP) to make existing APIs accessible to LLMs.

    Key Topics Discussed:
    - The CNCF Journey: Microcks' status as a community-driven Sandbox project.
    - Polyglot Support: Mocking REST, GraphQL, gRPC, and AsyncAPI with one tool.
    - The Testing Lifecycle: How to reuse mock data artifacts for automated contract and conformance testing in CI/CD.
    - Shift Left: Moving testing from QA environments to local developer laptops and IDEs.
    - AI & MCP: Generating datasets with AI and exposing APIs as tools for AI Agents using the Model Context Protocol.

    26 min
  3. Scaling Testing for the Age of AI-Generated Code

    AUG 25

    In this episode, host Ole Lensmar talks with guest Atul, joined by special guest Dmitry, the CEO of Testkube. Together, they dive deep into the evolving landscape of software testing, framed by Atul's fascinating journey from manual QA to cloud native developer advocate. The conversation kicks off with Atul sharing his unique career path, highlighting how starting in manual testing gave him a foundational understanding of system failure points—an invaluable edge in the world of development and cloud native technologies.

    The core of the discussion explores the profound impact of AI on testing and development workflows. Atul points out a critical challenge: while AI can generate code at 10x the previous velocity, our CI/CD pipelines often become a bottleneck, as they weren't designed to handle such a high volume of pull requests. This leads to a crucial conversation about the need to not only scale testing efforts to ensure proper coverage for AI-generated code but also to scale the underlying CI/CD infrastructure itself. They also touch upon the significant software supply chain security concerns that arise when integrating AI agents into the development lifecycle.

    Shifting focus to the broader ecosystem, the episode examines why testing often feels like an afterthought in the cloud native community, particularly at events like KubeCon. Atul argues that teams are often so focused on application functionality that they're "blinded" to the complexities of testing the entire infrastructure stack—from service meshes to container orchestration—until something breaks. The conversation also covers the nuanced practice of testing in production, balancing its benefits with the inherent risks.

    24 min

Ratings & Reviews

5 out of 5 (2 Ratings)
