Semi Doped

Vikram Sekar and Austin Lyons

The business and technology of semiconductors. Alpha for engineers and investors alike.

  1. MicroLEDs Ain’t Dead, Micron Snags Vera Rubin

    5 days ago


    Austin and Vik break down a packed week in semiconductors, covering GTC, OFC, and Micron earnings. The conversation kicks off with Jensen Huang's bold claim that engineers should spend $250K/year on AI tokens, and whether companies will buy tokens or token generators (i.e., on-prem hardware like the Dell Pro Max with GB300). They dig into the CapEx vs. OpEx tradeoffs, data security concerns, and how sharing GPU resources might end up looking a lot like the old EDA license model.

    Next up: Micron crushed earnings and appears to be designed into Vera Rubin for HBM4, despite months of rumors saying otherwise. Austin and Vik unpack the nuance around HBM pin speeds, memory node base dies, and what Micron's massive new fab investments in Taiwan, Singapore, Idaho, and New York mean for the memory cycle.

    The back half of the episode dives into optical interconnects for AI scale-up. A new industry consortium (OCI-MSA) has formed with Meta, Broadcom, NVIDIA, and OpenAI to standardize optical components. Vik explains why traditional indium phosphide lasers might be overkill for short-reach scale-up, and makes the case for microLEDs, a "slow but wide" approach that could fill the gap between copper and conventional optics. They also touch on Credo's expanding product portfolio (and the infamous purple-to-orange cable saga), plus Lumentum's new VCSEL work for scale-up.

    Vik - https://www.viksnewsletter.com/
    Austin - https://www.chipstrat.com/

    Chapters:
    0:00 Intro & GTC/OFC Conference Overload
    2:09 Jensen's $250K Token Budget Per Engineer
    5:08 On-Prem Inference vs. Cloud Token Spending (Dell Pro Max, CapEx vs. OpEx)
    6:44 Sharing GPU Resources Like EDA Licenses
    8:16 Data Security & On-Prem Privacy Concerns
    9:53 Matthew Berman's Fine-Tuned Open Claw Agent
    10:35 Vik Sets Up Open Claw on a Home Server
    11:53 Always Be Clauden (ABC) – Managing Agents from Your Phone
    13:34 Micron Earnings & HBM4 in Vera Rubin
    16:39 HBM Pin Speeds & the Micron Design-In Debate
    20:17 Micron's New Fab Investments & Memory Cycle Fears
    23:49 Why AI Drives a Step Change in Memory Demand
    26:30 Optical Compute Interconnect MSA (OCI-MSA)
    29:48 Scale-Up Optics: Do We Need New Technology?
    30:58 MicroLEDs – The "Slow but Wide" Approach
    35:45 MicroLEDs vs. Copper vs. Traditional Optics
    36:55 Credo's Product Spectrum & the Purple Cable Story
    39:31 VCSELs & Lumentum's 1060nm Scale-Up Play

    43 min
  2. Meta's Inference Accelerator & Applied Optoelectronics (AAOI)

    March 13


    Austin recaps moderating an agentic AI panel at Synopsys Converge, then gives an in-depth technical breakdown of Meta's MTIA custom silicon: why they're building it, how chiplets let them ship a new chip every six months, and how the roadmap is shifting toward gen AI inference. Vik digs into Applied Optoelectronics (AAOI), the vertically integrated Texas laser shop whose stock went from $1.48 to $100+, and whether history is about to rhyme.

    Austin Lyons: https://www.chipstrat.com
    Vik Sekar: https://www.viksnewsletter.com/

    Topics covered:
    • Agentic AI in chip design – how it changes roles for junior and senior engineers
    • Optical circuit switching and what it means for Arista's business model
    • Meta's ad-serving pipeline: Andromeda, Lattice, and the GEM foundation model
    • Why custom silicon (MTIA) makes sense at Meta's scale
    • MTIA chiplet strategy – 4 generations in 2 years
    • AAOI's vertical integration, Amazon's $4B warrant deal, and the 2017 parallel

    Chapters:
    0:00 Intro
    1:26 Synopsys Converge – Agentic AI Panel
    9:44 Vik's Article: Optical Circuit Switching & Arista
    14:43 Meta MTIA – A New Chip Every 6 Months
    21:32 Why Custom Silicon Makes Sense for Meta
    27:22 MTIA Chiplet Strategy & Roadmap
    33:56 Gen AI Fits Meta's Business Model
    36:31 How Meta Ships Chips So Fast
    40:30 Applied Optoelectronics (AAOI) Deep Dive
    45:02 Amazon's $4B Warrant Deal
    48:54 Can AAOI's Lasers Compete with Lumentum?
    53:16 AAOI's Aggressive Capacity Buildout
    55:35 History Rhymes: AAOI's 2017 Boom & Bust
    1:00:55 Wrap-Up

    #semiconductors #chips #tech #meta #MTIA #AAOI #optics #inference #AI

    1 hr 2 min
  3. The Great Optics-Copper Crossroads

    March 7


    This week, Austin and Vik break down the optics vs. copper debate that rocked semis. Nvidia dropped $4 billion on Lumentum and Coherent, Credo posted a blowout quarter betting on copper, and then Hock Tan shocked everyone by claiming 400G per lane works over copper in Broadcom's labs, potentially pushing CPO out to 2030+. Plus, Vik's 4D chess conspiracy theory on why Hock Tan is talking up copper when Broadcom is a CPO company.

    Like, subscribe, and drop your thoughts on the copper vs. optics debate in the comments!

    Subscribe to our newsletters:
    * Chipstrat by Austin Lyons – chipstrat.com
    * Vik's Semiconductor Newsletter by Vik Sekar – viksnewsletter.com

    Chapters:
    (00:00) Newsletter Plugs: Groq LPUs & Broadcom's Laser Business
    (03:15) Dynamo & the Rise of Workload-Specific Hardware
    (08:04) Austin's Broadcom Laser Deep Dive
    (09:53) The Week's Whiplash: Optics Monday, Copper Wednesday
    (17:50) Why Nvidia Invested $4B: Geopolitics, Supply & the HBM Playbook
    (24:15) CPO Lasers & Optical Circuit Switches
    (26:16) Credo Earnings: 200% YoY Growth & the Copper Bull Case
    (31:09) Reliability, AECs & Oracle's GPU Cluster Problem
    (35:48) Credo's Optics Play: Micro-LED Active Cables & the CPO Timing Risk
    (38:45) Broadcom Earnings: Hock Tan's Copper Bombshell
    (43:34) Customer-Owned Tooling: Hock Tan Says "Good Luck"
    (44:25) Vik's 4D Chess Theory: Why Hock Tan Talks Up Copper
    (47:03) Wrap-Up: It's Both – The Real Question Is Timing

    48 min
  4. Optical Supply Chain: What would you buy?

    February 27


    This week, we move from optics technology to optics companies, walking the AI optical supply chain from bottom to top. The main debate: who has a moat, and who is already priced for perfection? *Not investment advice; do your own due diligence.*

    AXT ($AXTI) - Indium phosphide substrate supplier. Critical bottleneck in the laser stack. Major China export-control risk. Massive stock run vs. thin earnings.
    Tower Semiconductor - Leading silicon photonics foundry. 5x capacity expansion with customer prepayments. Strong process lock-in. Pure-play optics exposure.
    GlobalFoundries - 300mm monolithic photonics platform plus CHIPS Act support. Optics growing fast but still a small piece of the overall business.
    Lumentum - Dominant EML laser supplier. Explosive AI demand. Strong technical moat. Valuation and capex sensitivity are key risks.
    Coherent - Vertically integrated from substrate to module. 6-inch InP push could lower costs structurally. Execution and margin mix matter.
    Fabrinet - Optics assembly partner. High NVIDIA exposure. Scales with the industry, but dependent on upstream supply.
    Corning - AI data centers require far more fiber than traditional cloud. $6B Meta deal adds visibility. Timing of scale-up optics is the swing factor.

    Timestamps:
    00:01 Intro
    06:59 AXT $AXTI
    13:38 Tower Semiconductor $TSEM
    23:58 GlobalFoundries $GFS
    32:43 Lumentum $LITE
    39:38 Coherent $COHR
    47:09 Fabrinet $FN
    54:07 Corning $GLW

    Austin's Substack: https://www.chipstrat.com/
    Vik's Substack: https://www.viksnewsletter.com/

    1 hr 2 min
  5. Optical Networking Supercycle - ALL the Tech You NEED to know

    February 20


    Austin and Vik delve into the evolving landscape of optics and networking, particularly in relation to AI and data centers. The conversation covers the various scales of networking (scale-across, scale-out, and scale-up), while also addressing the demand-supply dynamics in laser manufacturing and the future of optical circuit switches. The episode highlights the technological advancements and market opportunities in the optics sector, emphasizing the significance of these developments for the future of AI.

    Takeaways:
    Silicon photonics is becoming crucial for data center connectivity.
    Optics is essential for overcoming copper's limitations in speed and distance.
    Scale-across technology is vital for connecting data centers.
    Scale-out optics is the standard for connecting GPUs between racks.
    Co-packaged optics can reduce energy consumption in data centers.
    The scale-up market for optics is emerging as a new opportunity.
    Indium phosphide wafers are a critical bottleneck in laser manufacturing.
    Optical circuit switches are gaining traction in data centers.
    2026 is anticipated to be a pivotal year for optical networking.

    Chapters:
    00:00 Introduction to AI and CPU Bottlenecks
    03:00 The Rise of Silicon Photonics
    06:01 Understanding Optical Networking and Data Centers
    08:49 Scale Across: Connecting Data Centers
    11:56 Scale Out: Optimizing Data Center Connectivity
    14:53 Scale Up: The Future of GPU Connectivity
    23:32 The Shift from Copper to Optical Connections
    26:13 Challenges and Reliability of Lasers
    30:47 Understanding Co-Packaged Optics
    34:17 Market Dynamics: Demand and Supply of Lasers
    40:46 Emerging Technologies: Optical Circuit Switches

    Check out Austin's Substack: https://www.chipstrat.com
    Check out Vik's Substack: https://www.viksnewsletter.com

    46 min
  6. Memory Mayhem & AI Capex Madness

    February 13


    In this episode of the Semi Doped podcast, Austin and Vik delve into the current state of the semiconductor industry, focusing on the memory crisis driven by increasing demand from AI applications. They discuss the implications of rising memory prices, the impact of hyperscaler spending on the market, and the strategic moves of major players like Google, Microsoft, Meta, and Amazon in the AI landscape.

    Takeaways:
    Memory prices are skyrocketing, impacting consumer electronics.
    The memory crisis is affecting the production of lower-end devices.
    DRAM prices have doubled in a single quarter, creating challenges for manufacturers.
    Nanya Tech's revenue growth indicates a booming memory market.
    AI applications are driving unprecedented demand for memory.
    Hyperscalers are significantly increasing their capital expenditures for AI infrastructure.
    The integration of AI into advertising is reshaping business models for companies like Google and Meta.

    Chapters:
    00:00 The State of Memory in Semiconductors
    03:08 Nvidia's GPU Dilemma and Market Dynamics
    06:13 The Impact of AI on Memory Demand
    09:08 NAND Flash and Context Memory Trends
    11:59 The Future of Memory Supply and Demand
    15:12 AI Infrastructure and CapEx Spending
    17:47 Google's Strategic Investments in AI
    20:58 The Advertising Business Model and AI Integration
    30:26 Revenue vs. Expenses: A Balancing Act
    31:08 The Future of TPUs vs. GPUs in Cloud Computing
    35:31 Microsoft vs. Google: AI Investments and Market Reactions
    38:22 AI Integration in Enterprises: Microsoft's Unique Position
    39:57 The Power of Microsoft's Reach in AI
    40:30 GitHub: A Hidden Gem for Microsoft's AI Strategy
    43:52 Meta's AI Strategy: Advertising and Revenue Growth
    51:18 Amazon's Massive CapEx: Implications for the Future
    54:00 Looking Ahead: Predictions for 2027 and Beyond

    Check out Austin's Substack: https://www.chipstrat.com/
    Check out Vik's Substack: https://www.viksnewsletter.com/

    59 min
  7. The future of financing AI infrastructure with Wayne Nelms, CTO of Ornn

    February 10


    In this episode, Vik and Wayne Nelms discuss an emerging financial exchange for GPU compute and its implications for the AI infrastructure market, covering the value of compute, pricing dynamics, hedging strategies, and the future of GPU and memory trading. Wayne shares insights on partnerships, the depreciation of GPUs, and how inference demand may reshape hardware utilization. The conversation highlights the role of financial products in facilitating data center development and optimizing profitability in the evolving landscape of compute resources.

    Takeaways:
    Wayne Nelms is the CTO of Ornn, focusing on GPU compute as a commodity.
    The value of compute is still being defined in the market.
    Hedging strategies are essential for managing compute costs.
    The pricing of GPUs varies significantly across providers.
    Memory trading is becoming a crucial aspect of the compute market.
    Partnerships can enhance trading platforms and market efficiency.
    Depreciation of GPUs is not linear and varies by use case.
    Inference demand may change how GPUs are utilized in the future.
    Transparency in pricing benefits smaller players in the market.
    Financial products can facilitate data center development and profitability.

    Chapters:
    00:00 Introduction to GPU Compute Futures
    03:13 The Value of Compute in Today's Market
    05:59 Understanding GPU Pricing Dynamics
    08:46 Hedging and Futures in Compute
    11:52 The Role of Memory in AI Infrastructure
    15:14 Partnerships and Market Expansion
    17:46 Depreciation and Residual Value of GPUs
    20:57 Future of Data Centers and Compute Demand
    24:01 The Impact of Financialization on AI Infrastructure
    27:04 Looking Ahead: The Future of Compute Markets

    Keywords: GPU compute, financial exchange, futures market, data centers, AI infrastructure, pricing strategies, hedging, memory trading, Ornn

    Follow Wayne Nelms (@wayne_nelmz on X)
    Check out Ornn's website: https://www.ornnai.com/
    Check out Vik's Substack: https://www.viksnewsletter.com/
    Check out Austin's Substack: https://www.chipstrat.com/

    41 min
