22 min

Compressing deep learning models: distillation (Ep. 104) | Data Science at Home

    • Technology

Running large deep learning models on limited hardware or edge devices is often prohibitive. However, there are methods that compress large models by orders of magnitude while maintaining similar accuracy at inference time.
In this episode I explain one of the earliest such methods: knowledge distillation.
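As a rough illustration of the idea discussed in the episode, here is a minimal sketch of the distillation loss from the Hinton et al. paper referenced below, written in PyTorch. The temperature T and mixing weight alpha are illustrative hyperparameters of my own choosing, not values from the episode.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soften the teacher's output with temperature T to produce "soft targets".
    soft_targets = F.softmax(teacher_logits.detach() / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    # KL divergence between the softened distributions; the T*T factor keeps
    # gradient magnitudes comparable to the hard-label term (Hinton et al., 2015).
    soft_loss = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    # Ordinary cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    # Weighted mix of the two objectives.
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Example usage with random tensors (batch of 8, 10 classes):
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()

The small student network is trained to match the teacher's softened output distribution in addition to the true labels, which is how the compressed model can approach the large model's accuracy.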
Come join us on Slack.
References
Distilling the Knowledge in a Neural Network https://arxiv.org/abs/1503.02531
Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks https://arxiv.org/abs/2004.05937
