16 min

The Evolution of Inference with Custom-Built Accelerators - Intel® Chip Chat episode 678


Academic research and theoretical work have long predicted an exciting future for deep learning, and real-world technology is now catching up with some of AI's most exciting potential. Inference is a particularly fascinating part of AI: it powers a neural network's ability to "predict" what certain data looks or sounds like. The Intel Nervana Neural Network Processor for Inference (NNP-I) is purpose-built to accelerate this crucial part of artificial intelligence for intensive inference workloads.

Gadi Singer is the VP and General Manager of the Artificial Intelligence Products Group at Intel and a 29-year veteran of the company. In this interview, Gadi discusses both the high-level design philosophy behind the NNP-I and the finer details of its design, including power efficiency, optimized data movement, and software support. He also talks about the industries and applications that better inference could transform, such as image analysis, automated recommendation systems, and natural language processing.

To learn more about the Intel Nervana Neural Network Processor for Inference go to: https://www.intel.ai/nervana-nnp/

Intel and the Intel logo are trademarks of Intel Corporation or its subsidiaries in the U.S. and/or other countries. © Intel Corporation


