This week we take a look at the need for pooling layers in CNNs and discuss the regularization of CNNs using large-scale neuroscience data.
We are also very pleased to have Rosanne Liu join us on the show. Rosanne is a senior research scientist and a founding member of Uber AI. She is interested in making neural networks a better place and currently runs a deep learning reading group called "Deep Learning: Classics and Trends".
Rosanne Liu Twitter: https://twitter.com/savvyrl?lang=en
Please let us know who you thought presented the most underrated paper using the form below:
Also let us know any suggestions for future papers or guests:
Links to the papers:
"Learning From Brains How to Regularize Machines" - https://arxiv.org/pdf/1911.05072.pdf
"Pooling is neither necessary nor sufficient for appropriate deformation stability in CNNs" - https://arxiv.org/pdf/1804.04438.pdf
"Plug and Play Language Models: A Simple Approach to Controlled Text Generation" - https://arxiv.org/pdf/1912.02164.pdf