The Infra Pod

The Infra Pod brings you insightful and thought-provoking discussions on the world of infrastructure software. Started by two engineers, Ian Livingstone (tech advisor for Snyk) and Tim Chen (General Partner at Essence VC), the hosts team up with a rotating cast of guests to dive deep into the latest trends and hot topics in the software infrastructure space.

  1. APR 28

    Betting on Open Source Models to be the future (Chat with Benny, Cofounder of Fireworks AI)

    In this episode of The Infra Pod, hosts Tim Chen (Essence VC) and Ian Livingstone (Keycard) sit down with Benny Chen, co-founder of Fireworks AI, to explore the evolving world of AI inference infrastructure. Benny shares his journey from Meta — where capacity planning meetings made it clear GPUs were heading "up and to the right" — to co-founding Fireworks AI before ChatGPT even launched. The conversation dives deep into why the team bet early on inference over training, how they approached model optimization from horizontal compiler techniques to per-model kernel tuning, and why model customization is the key to unlocking better-than-frontier performance for vertical use cases. Benny discusses the reality of open source vs. closed models, the rise of agentic workloads, and why the real question isn't which model to use — it's which tasks have already been saturated. This episode is packed with technical insights on inference infrastructure, reinforcement learning for model customization, and what it means to truly adopt an AI-native engineering culture.
    0:24 Benny's journey and founding Fireworks AI
    3:23 Early conviction: betting on inference before ChatGPT
    8:29 Pivoting from PyTorch training to text inference
    15:42 Horizontal vs. per-model optimization strategies
    11:14 Open source vs. frontier models: the real gap
    32:35 How customers engage: PLG to hands-on customization
    17:37 When to move off frontier models
    33:42 The future of agentic memory and data sovereignty
    32:35 Fireworks' differentiation in a crowded market
    33:53 Spicy Future: AI doomers, bot management, and going fully out of loop

    41 min
  2. MAR 9

    Building a successful infra product between all the AI apps and model providers (chat with Louis from OpenRouter)

    Tim (Essence VC) and Ian (Keycard) interviewed Louis Vichy, co-founder of OpenRouter, about why he built OpenRouter to de-risk AI app development (end-user pays LLM costs), how it scaled to processing ~5–6T tokens/week, and what OpenRouter is today: a reliable inference routing/control layer across ~60 providers with consolidated billing and reduced vendor lock-in. Louis explains why teams adopt OpenRouter (constant new model integrations, pricing/billing, differing API shapes), how routing focuses on practical heuristics (fallbacks, cost, throughput, latency), and how reliability is achieved via provider failover (e.g., alternate endpoints like Vertex/Bedrock). They discuss agent trends (longer-running agents, small models for routing/classification with specialized downstream models), possible memory support, developer conveniences (e.g., PDF parsing), and enterprise features (security/compliance guardrails, presets). The episode ends with links to OpenRouter chat/rankings pages and hiring for high-agency TypeScript-focused engineers.
    00:00 Welcome & Meet Louis (OpenRouter Co‑Founder)
    00:27 Origin Story: De‑Risking AI App Costs (Hackathon Lessons)
    01:35 First Big Feature: End‑User Pays for Tokens (Sign in with OpenRouter)
    02:34 From Routing to Rankings: Scaling to Trillions of Tokens
    03:42 What OpenRouter Is Today: Reliable Inference Across 60+ Providers
    05:55 Why Teams Adopt It: Avoiding Model API Churn, Billing, and Vendor Lock‑In
    08:37 Winning Strategy: Don’t Build a “Magic Router”—Optimize Cost/Latency/Throughput
    18:58 From Chat to RAG + Memory: Building Persistent Agent Context
    20:37 Developer Bells & Whistles: Auto PDF Parsing and More
    21:11 Enterprise Readiness: Compliance, Security Guardrails & Model Presets
    22:22 Customer Growth at Warp Speed in the AI Era
    23:03 Spicy Future!

    34 min
  3. JAN 26

    Let's chat about vibe coding & Ralph! (Chat with Dexter at Humanlayer)

    In this episode of The Infra Pod, hosts Tim and Ian sit down with Dexter Horthy, CEO of HumanLayer, to explore the evolution of AI coding agents and the future of software development. Dexter shares his journey from building data tools to discovering the real problem: making AI coding agents actually productive for senior engineers, not just juniors. The conversation dives deep into the research-plan-implement workflow that enables engineers to ship 99% of their code with AI assistance, the challenges of getting staff engineers to adopt AI tools, and why most AI coding ecosystems don't actually help you sell to enterprises. Dexter also shares his spicy take on how Ralph-style agents can be even further enhanced. Whether you're a skeptical senior engineer or an AI-curious developer, this episode offers practical insights into what actually works in production AI coding today.
    [0:00] Introduction & Dexter's Journey: Why Dexter finally started a company, the failed data catalog pivot, and building an AI janitor for data warehouses
    [8:00] The Hard Lessons of AI Ecosystem Hype: Why there's no "SAML for AI agents" and what enterprises actually need versus what the hype machine promises
    [13:00] The Research-Plan-Implement Breakthrough: How to make senior engineers productive with AI, staying objective during research, and making decisions at the top of the context window
    [26:00] The Vibe Shift & Where We Are Today: When respected engineers started believing, the role of Ralph and spec-driven development, and what's working in production
    [37:00] Spicy Take: Ralph Goes to the Supreme

    43 min

Ratings & Reviews

5 out of 5 (2 Ratings)
