The Pragmatic Engineer

Gergely Orosz

Software engineering at Big Tech and startups, from the inside. Deepdives with experienced engineers and tech professionals who share their hard-earned lessons, interesting stories, and advice on building software. Especially relevant for software engineers and engineering leaders, and useful for anyone working in tech. newsletter.pragmaticengineer.com

  1. 6 HR AGO

    Being a founding engineer at an AI startup

    Brought to You By:
    • Statsig — The unified platform for flags, analytics, experiments, and more.
    • Linear — The system for modern product development.

    Michelle Lim joined Warp as engineer number one and is now building her own startup, Flint. She brings a strong product-first mindset shaped by her time at Facebook, Slack, Robinhood, and Warp. Michelle shares why she chose Warp over safer offers, how she evaluates early-stage opportunities, and what she believes distinguishes great founding engineers. Together, we cover how product-first engineers create value, why negotiating equity at early-stage startups requires a different approach, and why asking founders for references is a smart move. Michelle also shares lessons from building consumer and infrastructure products, how she thinks about tech stack choices, and how engineers can increase their impact by taking on work outside their job descriptions. If you want to understand what founders look for in early engineers or how to grow into a founding-engineer role, this episode is full of practical advice backed by real examples.

    Timestamps:
    (00:00) Intro
    (01:32) How Michelle got into software engineering
    (03:30) Michelle’s internships
    (06:19) Learnings from Slack
    (08:48) Product learnings at Robinhood
    (12:47) Joining Warp as engineer #1
    (22:01) Negotiating equity
    (26:04) Asking founders for references
    (27:36) The top reference questions to ask
    (32:53) The evolution of Warp’s tech stack
    (35:38) Product-first engineering vs. code-first
    (38:27) Hiring product-first engineers
    (41:49) Different types of founding engineers
    (44:42) How Flint uses AI tools
    (45:31) Avoiding getting burned in founder exits
    (49:26) Hiring top talent
    (50:15) An overview of Flint
    (56:08) Advice for aspiring founding engineers
    (1:01:05) Rapid fire round

    The Pragmatic Engineer deepdives relevant for this episode:
    • Thriving as a founding engineer: lessons from the trenches
    • From software engineer to AI engineer
    • AI Engineering in the real world
    • The AI Engineering stack

    Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email podcast@pragmaticengineer.com. Get full access to The Pragmatic Engineer at newsletter.pragmaticengineer.com/subscribe

    1h 4m
  2. 26 NOV

    Code security for software engineers

    Brought to You By:
    • Statsig — The unified platform for flags, analytics, experiments, and more. Statsig is helping make the first-ever Pragmatic Summit a reality. Join me and 400 other top engineers and leaders on 11 February in San Francisco for a special one-day event. Reserve your spot here.
    • Linear — The system for modern product development. Engineering teams today move much faster, thanks to AI. Because of this, coordination increasingly becomes a problem. This is where Linear helps fast-moving teams stay focused. Check out Linear.

    As software engineers, what should we know about writing secure code? Johannes Dahse is the VP of Code Security at Sonar and a security expert with 20 years of industry experience. In today’s episode of The Pragmatic Engineer, he joins me to talk about what security teams actually do, what developers should own, and where real-world risk enters modern codebases. We cover dependency risk, software composition analysis, CVEs, dynamic testing, and how everyday development practices affect security outcomes. Johannes also explains where AI meaningfully helps, where it introduces new failure modes, and why understanding the code you write and ship remains the most reliable defense. If you build and ship software, this episode is a practical guide to thinking about code security under real-world engineering constraints.

    Timestamps:
    (00:00) Intro
    (02:31) What is penetration testing?
    (06:23) Who owns code security: devs or security teams?
    (14:42) What is code security?
    (17:10) Code security basics for devs
    (21:35) Advanced security challenges
    (24:36) SCA testing
    (25:26) The CVE Program
    (29:39) The State of Code Security report
    (32:02) Code quality vs security
    (35:20) Dev machines as a security vulnerability
    (37:29) Common security tools
    (42:50) Dynamic security tools
    (45:01) AI security reviews: what are the limits?
    (47:51) AI-generated code risks
    (49:21) More code: more vulnerabilities
    (51:44) AI’s impact on code security
    (58:32) Common misconceptions of the security industry
    (1:03:05) When is security “good enough”?
    (1:05:40) Johannes’s favorite programming language

    The Pragmatic Engineer deepdives relevant for this episode:
    • What is Security Engineering?
    • Mishandled security vulnerability in Next.js
    • Okta Schooled on Its Security Practices

    Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email podcast@pragmaticengineer.com. Get full access to The Pragmatic Engineer at newsletter.pragmaticengineer.com/subscribe

    1h 8m
  3. 19 NOV

    How AI will change software engineering – with Martin Fowler

    Brought to You By:
    • Statsig — The unified platform for flags, analytics, experiments, and more. AI-accelerated development isn’t just about shipping faster: it’s about measuring whether what you ship actually delivers value. This is where modern experimentation with Statsig comes in. Check it out.
    • Linear — The system for modern product development. I had a jaw-dropping experience when I dropped in for the weekly “Quality Wednesdays” meeting at Linear. Every week, every dev fixes at least one quality issue, large or small – even if it’s a one-pixel misalignment, like this one. I’ve yet to see a team obsess this much about quality. Read more about how Linear does Quality Wednesdays – it’s fascinating!

    Martin Fowler is one of the most influential people in software architecture and the broader tech industry. He is the Chief Scientist at Thoughtworks and the author of Refactoring, Patterns of Enterprise Application Architecture, and several other books. He has spent decades shaping how engineers think about design, architecture, and process, and regularly publishes on his blog, MartinFowler.com. In this episode, we discuss how AI is changing software development: the shift from deterministic to non-deterministic coding; where generative models help with legacy code; and the narrow but useful cases for vibe coding. Martin explains why LLM output must be tested rigorously, why refactoring is more important than ever, and how combining AI tools with deterministic techniques may be what engineering teams need. We also revisit the origins of the Agile Manifesto and talk about why, despite rapid changes in tooling and workflows, the skills that make a great engineer remain largely unchanged.

    Timestamps:
    (00:00) Intro
    (01:50) How Martin got into software engineering
    (07:48) Joining Thoughtworks
    (10:07) The Thoughtworks Technology Radar
    (16:45) From Assembly to high-level languages
    (25:08) Non-determinism
    (33:38) Vibe coding
    (39:22) StackOverflow vs. coding with AI
    (43:25) Importance of testing with LLMs
    (50:45) LLMs for enterprise software
    (56:38) Why Martin wrote Refactoring
    (1:02:15) Why refactoring is so relevant today
    (1:06:10) Using LLMs with deterministic tools
    (1:07:36) Patterns of Enterprise Application Architecture
    (1:18:26) The Agile Manifesto
    (1:28:35) How Martin learns about AI
    (1:34:58) Advice for junior engineers
    (1:37:44) The state of the tech industry today
    (1:42:40) Rapid fire round

    The Pragmatic Engineer deepdives relevant for this episode:
    • Vibe coding as a software engineer
    • The AI Engineering stack
    • AI Engineering in the real world
    • What changed in 50 years of computing

    Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email podcast@pragmaticengineer.com. Get full access to The Pragmatic Engineer at newsletter.pragmaticengineer.com/subscribe

    1h 49m
  4. 12 NOV

    Netflix’s Engineering Culture

    Brought to You By:
    • Statsig — The unified platform for flags, analytics, experiments, and more. Statsig enables two cultures at once: continuous shipping and experimentation. Companies like Notion went from single-digit experiments per quarter to over 300 experiments with Statsig. Start using Statsig with a generous free tier and a $50K startup program.
    • Linear — The system for modern product development. When most companies hit real scale, they start to slow down and are faced with “process debt.” This often hits software engineers the most. Companies switch to Linear to hit a hard reset on this process debt – ones like Scale cut their bug resolution time in half after the switch. Check out Linear’s migration guide for details.

    What’s it like to work as a software engineer inside one of the world’s biggest streaming companies? In this special episode recorded at Netflix’s headquarters in Los Gatos, I sit down with Elizabeth Stone, Netflix’s Chief Technology Officer. Before becoming CTO, Elizabeth led data and insights at Netflix and was VP of Science at Lyft. She brings a rare mix of technical depth, product thinking, and people leadership. We discuss what it means to be “unusually responsible” at Netflix, how engineers make decisions without layers of approval, and how the company balances autonomy with guardrails for high-stakes projects like Netflix Live. Elizabeth shares how teams self-reflect and learn from outages and failures, why Netflix doesn’t do formal performance reviews, and what new grads bring to a company known for hiring experienced engineers. This episode offers a rare inside look at how Netflix engineers build, learn, and lead at a global scale.

    Timestamps:
    (00:00) Intro
    (01:44) The scale of Netflix
    (03:31) Production software stack
    (05:20) Engineering challenges in production
    (06:38) How the Open Connect delivery network works
    (08:30) From pitch to play
    (11:31) How Netflix enables engineers to make decisions
    (13:26) Building Netflix Live for global sports
    (16:25) Learnings from Paul vs. Tyson for NFL Live
    (17:47) Inside the control room
    (20:35) What being unusually responsible looks like
    (24:15) Balancing team autonomy with guardrails for Live
    (30:55) The high talent bar and introduction of levels at Netflix
    (36:01) The Keeper Test
    (41:27) Why engineers leave or stay
    (44:27) How AI tools are used at Netflix
    (47:54) AI’s highest-impact use cases
    (50:20) What new grads add and why senior talent still matters
    (53:25) Open source at Netflix
    (57:07) Elizabeth’s parting advice for new engineers to succeed at Netflix

    The Pragmatic Engineer deepdives relevant for this episode:
    • The end of the senior-only level at Netflix
    • Netflix revamps its compensation philosophy
    • Live streaming at world-record scale with Ashutosh Agrawal
    • Shipping to production
    • What is good software architecture?

    Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email podcast@pragmaticengineer.com. Get full access to The Pragmatic Engineer at newsletter.pragmaticengineer.com/subscribe

    1 hr
  5. 5 NOV

    From Swift to Mojo and high-performance AI Engineering with Chris Lattner

    Brought to You By:
    • Statsig — The unified platform for flags, analytics, experiments, and more. Companies like Graphite, Notion, and Brex rely on Statsig to measure the impact of the pace they ship. Get a 30-day enterprise trial here.
    • Linear – The system for modern product development. Linear is a heavy user of Swift: they just redesigned their native iOS app using their own take on Apple’s Liquid Glass design language. The new app is about speed and performance – just like Linear is. Check it out.

    Chris Lattner is one of the most influential engineers of the past two decades. He created the LLVM compiler infrastructure and the Swift programming language – and Swift opened iOS development to a broader group of engineers. With Mojo, he’s now aiming to do the same for AI by lowering the barrier to programming AI applications. I sat down with Chris in San Francisco to talk language design, lessons from designing Swift and Mojo, and – of course! – compilers. It’s hard to find someone who is as enthusiastic and knowledgeable about compilers as Chris is! We also discussed why experts often resist change even when current tools slow them down, what he learned about AI and hardware from his time across both large and small engineering teams, and why compiler engineering remains one of the best ways to understand how software really works.

    Timestamps:
    (00:00) Intro
    (02:35) Compilers in the early 2000s
    (04:48) Why Chris built LLVM
    (08:24) GCC vs. LLVM
    (09:47) LLVM at Apple
    (19:25) How Chris got support to go open source at Apple
    (20:28) The story of Swift
    (24:32) The process for designing a language
    (31:00) Learnings from launching Swift
    (35:48) Swift Playgrounds: making coding accessible
    (40:23) What Swift solved and the technical debt it created
    (47:28) AI learnings from Google and Tesla
    (51:23) SiFive: learning about hardware engineering
    (52:24) Mojo’s origin story
    (57:15) Modular’s bet on a two-level stack
    (1:01:49) Compiler shortcomings
    (1:09:11) Getting started with Mojo
    (1:15:44) How big is Modular, as a company?
    (1:19:00) AI coding tools the Modular team uses
    (1:22:59) What kind of software engineers Modular hires
    (1:25:22) A programming language for LLMs? No thanks
    (1:29:06) Why you should study and understand compilers

    The Pragmatic Engineer deepdives relevant for this episode:
    • AI Engineering in the real world
    • The AI Engineering stack
    • Uber's crazy YOLO app rewrite, from the front seat
    • Python, Go, Rust, TypeScript and AI with Armin Ronacher
    • Microsoft’s developer tools roots

    Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email podcast@pragmaticengineer.com. Get full access to The Pragmatic Engineer at newsletter.pragmaticengineer.com/subscribe

    1h 32m
  6. 29 OCT

    Beyond Vibe Coding with Addy Osmani

    Brought to You By:
    • Statsig — The unified platform for flags, analytics, experiments, and more.
    • Linear – The system for modern product development.

    Addy Osmani is Head of Chrome Developer Experience at Google, where he leads teams focused on improving performance, tooling, and the overall developer experience for building on the web. If you’ve ever opened Chrome’s Developer Tools, you’ve definitely used features Addy has built. He’s also the author of several books, including his latest, Beyond Vibe Coding, which explores how AI is changing software development. In this episode of The Pragmatic Engineer, I sit down with Addy to discuss how AI is reshaping software engineering workflows, the tradeoffs between speed and quality, and why understanding generated code remains critical. We dive into his article The 70% Problem, which explains why AI tools accelerate development but struggle with the final 30% of software quality—and why this last 30% is tackled easily by software engineers who understand how the system actually works.

    Timestamps:
    (00:00) Intro
    (02:17) Vibe coding vs. AI-assisted engineering
    (06:07) How Addy uses AI tools
    (13:10) Addy’s learnings about applying AI for development
    (18:47) Addy’s favorite tools
    (22:15) The 70% Problem
    (28:15) Tactics for efficient LLM usage
    (32:58) How AI tools evolved
    (34:29) The case for keeping expectations low and control high
    (38:05) Autonomous agents and working with them
    (42:49) How the EM and PM role changes with AI
    (47:14) The rise of new roles and shifts in developer education
    (48:11) The importance of critical thinking when working with AI
    (54:08) LLMs as a tool for learning
    (1:03:50) Rapid questions

    The Pragmatic Engineer deepdives relevant for this episode:
    • Vibe Coding as a software engineer
    • How AI-assisted coding will change software engineering: hard truths
    • AI Engineering in the real world
    • The AI Engineering stack
    • How Claude Code is built

    Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email podcast@pragmaticengineer.com. Get full access to The Pragmatic Engineer at newsletter.pragmaticengineer.com/subscribe

    1h 8m
  7. 15 OCT

    Google’s engineering culture

    Brought to You By:
    • Statsig — The unified platform for flags, analytics, experiments, and more. Something interesting is happening with the latest generation of tech giants. Rather than building advanced experimentation tools themselves, companies like Anthropic, Figma, Notion, and a bunch of others are just using Statsig. Statsig has rebuilt the entire suite of data tools that, until now, was only available at maybe 10 or 15 giants. Check out Statsig.
    • Linear – The system for modern product development. Linear is just so fast to use – and it enables velocity in product workflows. Companies like Perplexity and OpenAI have already switched over, because simplicity scales. Go ahead and check out Linear and see why it feels like a breeze to use.

    What is it really like to be an engineer at Google? In this special deep dive episode, we unpack how engineering at Google actually works. We spent months researching the engineering culture of the search giant and talked with 20+ current and former Googlers to bring you this deepdive with Elin Nilsson, tech industry researcher for The Pragmatic Engineer and a former Google intern.

    Google has always been an engineering-driven organization. We talk about its custom stack and tools, the design-doc culture, and the performance and promotion systems that define career growth. We also explore the culture that feels built for engineers: generous perks, a surprisingly light on-call setup often considered the best in the industry, and a deep focus on solving technical problems at scale. If you are thinking about applying to Google or are curious about how the company’s engineering culture has evolved, this episode takes a clear look at what it was like to work at Google in the past versus today, and who is a good fit for today’s Google.

    Jump to interesting parts:
    (13:50) Tech stack
    (1:05:08) Performance reviews (GRAD)
    (2:07:03) The culture of continuously rewriting things

    Timestamps:
    (00:00) Intro
    (01:44) Stats about Google
    (11:41) The shared culture across Google
    (13:50) Tech stack
    (34:33) Internal developer tools and monorepo
    (43:17) The downsides of having so many internal tools at Google
    (45:29) Perks
    (55:37) Engineering roles
    (1:02:32) Levels at Google
    (1:05:08) Performance reviews (GRAD)
    (1:13:05) Readability
    (1:16:18) Promotions
    (1:25:46) Design docs
    (1:32:30) OKRs
    (1:44:43) Googlers, Nooglers, ReGooglers
    (1:57:27) Google Cloud
    (2:03:49) Internal transfers
    (2:07:03) Rewrites
    (2:10:19) Open source
    (2:14:57) Culture shift
    (2:31:10) Making the most of Google, as an engineer
    (2:39:25) Landing a job at Google

    The Pragmatic Engineer deepdives relevant for this episode:
    • Inside Google’s engineering culture
    • Oncall at Google
    • Performance calibrations at tech companies
    • Promotions and tooling at Google
    • How Kubernetes is built
    • The man behind the Big Tech comics: Google cartoonist Manu Cornet

    Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email podcast@pragmaticengineer.com. Get full access to The Pragmatic Engineer at newsletter.pragmaticengineer.com/subscribe

    2h 46m
  8. 8 OCT

    Python, Go, Rust, TypeScript and AI with Armin Ronacher

    Brought to You By:
    • Statsig — The unified platform for flags, analytics, experiments, and more. Most teams end up in this situation: ship a feature to 10% of users, wait a week, check three different tools, try to correlate the data, and you’re still unsure if it worked. The problem is that each tool has its own user identification and segmentation logic. Statsig solved this problem by building everything within a unified platform. Check out Statsig.
    • Linear – The system for modern product development. In the episode, Armin talks about how he uses an army of “AI interns” at his startup. With Linear, you can easily do the same: Linear’s Cursor integration lets you add Cursor as an agent to your workspace. This agent then works alongside you and your team to make code changes or answer questions. You’ve got to try it out: give Linear a spin and see how it integrates with Cursor.

    Armin Ronacher is the creator of the Flask framework for Python, was one of the first engineers hired at Sentry, and is now the co-founder of a new startup. He has spent his career thinking deeply about how tools shape the way we build software. In this episode of The Pragmatic Engineer Podcast, he joins me to talk about how programming languages compare, why Rust may not be ideal for early-stage startups, and how AI tools are transforming the way engineers work. Armin shares his view on what continues to make certain languages worth learning, and how agentic coding is driving people to work more, sometimes to their own detriment.

    We also discuss:
    • Why the Python 2 to 3 migration was more challenging than expected
    • How Python, Go, Rust, and TypeScript stack up for different kinds of work
    • How AI tools are changing the need for unified codebases
    • What Armin learned about error handling from his time at Sentry
    • And much more

    Jump to interesting parts:
    • (06:53) How Python, Go, and Rust stack up and when to use each one
    • (30:08) Why Armin has changed his mind about AI tools
    • (50:32) How important are language choices from an error-handling perspective?

    Timestamps:
    (00:00) Intro
    (01:34) Why the Python 2 to 3 migration created so many challenges
    (06:53) How Python, Go, and Rust stack up and when to use each one
    (08:35) The friction points that make Rust a bad fit for startups
    (12:28) How Armin thinks about choosing a language for building a startup
    (22:33) How AI is impacting the need for unified code bases
    (24:19) The use cases where AI coding tools excel
    (30:08) Why Armin has changed his mind about AI tools
    (38:04) Why different programming languages still matter but may not in an AI-driven future
    (42:13) Why agentic coding is driving people to work more and why that’s not always good
    (47:41) Armin’s error-handling takeaways from working at Sentry
    (50:32) How important is language choice from an error-handling perspective
    (56:02) Why the current SDLC still doesn’t prioritize error handling
    (1:04:18) The challenges language designers face
    (1:05:40) What Armin learned from working in startups and who thrives in that environment
    (1:11:39) Rapid fire round

    Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email podcast@pragmaticengineer.com. Get full access to The Pragmatic Engineer at newsletter.pragmaticengineer.com/subscribe

    1h 14m

