AI doesn't just chew up compute; it eats your network for breakfast. In this episode of Pop Goes the Stack, F5's Lori MacVittie, Joel Moses, and Ken Arora dig into the pressing issues AI workloads create for networking. Everyone is worried about GPUs and cooling, but few are talking about the explosion in east-west traffic, the rise of inter-agent communication, or the operational strain on data center network fabrics and interconnects. Our experts discuss why data center networks must be upgraded to meet AI demands, examining how training and inference workloads differ. The conversation also covers the need for high-performance networking, the role of latency and data gravity, and the potential expansion of data centers. Tune in for valuable insights into the challenges and solutions shaping the future of AI-driven applications.
Information
- Show: Pop Goes the Stack
- Frequency: Updated weekly
- Published: 19 August 2025 at 11:00 UTC
- Length: 24 min
- Episode: 7
- Rating: Clean