
Activate deep learning neurons faster with Dynamic ReLU (Ep. 101) - Data Science at Home


In this episode I briefly explain the concept behind activation functions in deep learning. One of the most widely used activation functions is the rectified linear unit (ReLU). While several flavors of ReLU exist in the literature, in this episode I speak about a very interesting approach, Dynamic ReLU, that keeps computational complexity low while improving performance quite consistently.
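For the curious, here is a minimal PyTorch sketch of the idea from the paper: instead of applying a fixed max(0, x), Dynamic ReLU runs a small hyper-function (global average pooling followed by two fully connected layers) that maps the input to slopes and intercepts, and the activation becomes the max over K such linear pieces. This follows the channel-wise variant ("DY-ReLU-B") of the paper; the class name and tensor layout are my own illustrative choices, and the defaults (K = 2, reduction ratio 8, lambda_a = 1.0, lambda_b = 0.5, initialization to a plain ReLU) are my reading of the paper's settings, so treat this as a sketch rather than the authors' implementation.

```python
import torch
import torch.nn as nn

class DyReLUB(nn.Module):
    """Sketch of channel-wise Dynamic ReLU ("DY-ReLU-B", arXiv:2003.10027):
    y_c = max_k(a_k(x) * x_c + b_k(x)), with coefficients predicted per
    channel by a lightweight hyper-function over the pooled input."""

    def __init__(self, channels: int, reduction: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        # Hyper-function: global average pool -> FC -> ReLU -> FC,
        # emitting 2*K coefficients (K slopes, K intercepts) per channel.
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, 2 * k * channels),
        )
        # Initialized so the module starts out as a plain ReLU:
        # slopes (1, 0, ...), intercepts all zero.
        self.register_buffer("init_a", torch.tensor([1.0] + [0.0] * (k - 1)))
        self.register_buffer("init_b", torch.zeros(k))
        self.lambda_a, self.lambda_b = 1.0, 0.5  # residual ranges from the paper

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (N, C, H, W)
        n, c, _, _ = x.shape
        # Squeeze the spatial dimensions, then predict the coefficients.
        theta = self.fc(x.mean(dim=(2, 3)))          # (N, 2*K*C)
        theta = 2 * torch.sigmoid(theta) - 1         # normalize to [-1, 1]
        theta = theta.view(n, c, 2 * self.k)
        a = self.init_a + self.lambda_a * theta[..., : self.k]  # slopes
        b = self.init_b + self.lambda_b * theta[..., self.k :]  # intercepts
        # Element-wise max over the K linear pieces, broadcast spatially.
        y = x.unsqueeze(-1) * a.view(n, c, 1, 1, self.k) + b.view(n, c, 1, 1, self.k)
        return y.max(dim=-1).values

# Drop-in replacement for nn.ReLU after a 64-channel conv layer:
act = DyReLUB(channels=64)
print(act(torch.randn(2, 64, 32, 32)).shape)  # torch.Size([2, 64, 32, 32])
```

Note the appeal discussed in the episode: the hyper-function works on a pooled vector, so the extra compute is tiny compared to the convolutions it sits between.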
This episode is supported by pryml.io. At pryml we let companies share confidential data. Visit our website.
Don't forget to join us on our Discord channel to propose new episodes or discuss previous ones.
References
Dynamic ReLU: https://arxiv.org/abs/2003.10027

