Underrated ML

Sara Hooker & Sean Hooker

Listen along as we try and dissect various Machine Learning papers that just haven't got the love and attention they deserve.

Twitter: https://twitter.com/underrated_ml
Voting Page: https://forms.gle/97MgHvTkXgdB41TC8

Episodes

  1. 07/26/2022

    Strongly typed RNNs and morphogenesis

    We conclude season one of Underrated ML by having Stephen Merity on as our guest. Stephen has worked at various institutions such as MetaMind and the Salesforce ohana, Google Sydney, Freelancer.com, the Schwa Lab at the University of Sydney, the team at Grok Learning, the non-profit Common Crawl, and IACS @ Harvard. He also holds a Bachelor of Information Technology from the University of Sydney and a Master of Science in Computational Science and Engineering from Harvard University. In this week's episode we talk about the current influence of hardware on Deep Learning research, baseline models, strongly typed RNNs, and Alan Turing's paper on the chemical basis of morphogenesis.

    Underrated ML Twitter: https://twitter.com/underrated_ml
    Stephen Merity Twitter: https://twitter.com/Smerity

    Please let us know who you thought presented the most underrated paper in the form below: https://forms.gle/97MgHvTkXgdB41TC8

    Links to the papers:
    "The Chemical Basis of Morphogenesis" - https://www.dna.caltech.edu/courses/cs191/paperscs191/turing.pdf
    "Strongly-Typed Recurrent Neural Networks" - https://arxiv.org/abs/1602.02218
    "Quasi-Recurrent Neural Networks" - https://arxiv.org/abs/1611.01576
    "An Analysis of Neural Language Modelling at Multiple Scales" - https://arxiv.org/abs/1803.08240

    Additional Links:
    Aleatory architecture / hysteresis: Why Birds Are The World's Best Engineers
    Near decomposability: Near decomposability and the speed of evolution / The Architecture of Complexity
    Google's All Our N-gram are Belong to You from 2006

    1h 34m
  2. 07/26/2022

    Language independence and material properties

    This week we are joined by Sebastian Ruder. He is a research scientist at DeepMind, London. He has also worked at a variety of institutions such as AYLIEN, Microsoft, IBM's Extreme Blue, Google Summer of Code, and SAP. These roles ran in tandem with his studies, which included Computational Linguistics at the University of Heidelberg, Germany, and at Trinity College, Dublin, before undertaking a PhD in Natural Language Processing and Deep Learning at the Insight Research Centre for Data Analytics. This week we discuss language independence and diversity in natural language processing, while also taking a look at attempts to identify material properties from images.

    As discussed in the podcast, if you would like to donate to the current campaign of "CREATE DONATE EDUCATE", which supports Stop Hate UK, then please find the link below: https://www.shorturl.at/glmsz

    Please also find additional links to help support Black colleagues in the area of research:
    Black in AI Twitter account: https://twitter.com/black_in_ai
    Mentoring and proofreading sign-up to support our Black colleagues in research: https://twitter.com/le_roux_nicolas/status/1267896907621433344?s=20

    Underrated ML Twitter: https://twitter.com/underrated_ml
    Sebastian Ruder Twitter: https://twitter.com/seb_ruder

    Please let us know who you thought presented the most underrated paper in the form below: https://forms.gle/97MgHvTkXgdB41TC8

    Links to the papers:
    "On Achieving and Evaluating Language-Independence in NLP" - https://journals.linguisticsociety.org/elanguage/lilt/article/view/2624.html
    "The State and Fate of Linguistic Diversity and Inclusion in the NLP World" - https://arxiv.org/abs/2004.09095
    "Recognizing Material Properties from Images" - https://arxiv.org/pdf/1801.03127.pdf

    Additional Links:
    Student perspectives on applying to NLP PhD programs: https://blog.nelsonliu.me/2019/10/24/student-perspectives-on-applying-to-nlp-phd-programs/
    Tim Dettmers' post on how to pick your grad school: https://timdettmers.com/2020/03/10/how-to-pick-your-grad-school/
    Rachel Thomas' blog post on why you should blog: https://medium.com/@racheltho/why-you-yes-you-should-blog-7d2544ac1045
    Emily Bender's The Gradient article: https://thegradient.pub/the-benderrule-on-naming-the-languages-we-study-and-why-it-matters/
    Paper on order-sensitive vs order-free methods: https://www.aclweb.org/anthology/N19-1253.pdf
    "Exploring the Origins and Prevalence of Texture Bias in Convolutional Neural Networks": https://arxiv.org/abs/1911.09071
    Sebastian's website, where you can find all his blog posts: https://ruder.io/

    1h 34m
  3. 07/26/2022

    Energy functions and shortcut learning

    This week we are joined by Kyunghyun Cho. He is an associate professor of computer science and data science at New York University, a research scientist at Facebook AI Research, and a CIFAR Associate Fellow. On top of this, he also co-chaired the recent ICLR 2020 virtual conference. We talk about a variety of topics in this week's episode, including the recent ICLR conference, energy functions, shortcut learning, and the roles popularized Deep Learning research areas play in answering the question "What is intelligence?".

    Underrated ML Twitter: https://twitter.com/underrated_ml
    Kyunghyun Cho Twitter: https://twitter.com/kchonyc

    Please let us know who you thought presented the most underrated paper in the form below: https://forms.gle/97MgHvTkXgdB41TC8

    Links to the papers:
    "Shortcut Learning in Deep Neural Networks" - https://arxiv.org/pdf/2004.07780.pdf
    "Bayesian Deep Learning and a Probabilistic Perspective of Generalization" - https://arxiv.org/abs/2002.08791
    "Classifier-agnostic saliency map extraction" - https://arxiv.org/abs/1805.08249
    "Deep Energy Estimator Networks" - https://arxiv.org/abs/1805.08306
    "End-to-End Learning for Structured Prediction Energy Networks" - https://arxiv.org/abs/1703.05667
    "On approximating nabla f with neural networks" - https://arxiv.org/abs/1910.12744
    "Adversarial NLI: A New Benchmark for Natural Language Understanding" - https://arxiv.org/abs/1910.14599
    "Learning the Difference that Makes a Difference with Counterfactually-Augmented Data" - https://arxiv.org/abs/1909.12434
    "Learning Concepts with Energy Functions" - https://openai.com/blog/learning-concepts-with-energy-functions/

    1h 29m
  4. 07/26/2022

    Metaphor generation and ML for child welfare

    We open season two of Underrated ML with Anna Huang on the show. Anna Huang is a Research Scientist at Google Brain, working on the Magenta project. Her research focuses on designing generative models to make creating music more approachable. She is the creator of Music Transformer and also of Coconet, the ML model that powered Google's first AI Doodle, the Bach Doodle. She holds a PhD in computer science from Harvard University and was a recipient of the NSF Graduate Research Fellowship. She spent the later part of her PhD as a visiting research student at the Montreal Institute for Learning Algorithms (Mila). She publishes in machine learning, human-computer interaction, and music, at conferences such as ICLR, IUI, CHI, and ISMIR. She has been a judge on the Eurovision AI Song Contest, and her compositions have won awards including first place in the San Francisco Choral Artists' a cappella composition contest. She holds a master's in media arts and sciences from the MIT Media Lab, and a B.S. in computer science and a B.M. in music composition, both from the University of Southern California. She grew up in Hong Kong, where she learned to play the guzheng.

    On the episode we discuss Metaphoria by Katy Gero and Lydia Chilton, a fascinating tool that allows users to generate metaphors from only a select number of words. We also discuss current trends regarding the dangers of AI, with a case study on child welfare.

    Underrated ML Twitter: https://twitter.com/underrated_ml
    Anna Huang Twitter: https://twitter.com/huangcza

    Please let us know who you thought presented the most underrated paper in the form below: https://forms.gle/97MgHvTkXgdB41TC8

    Links to the papers:
    Gero, Katy Ilonka, and Lydia B. Chilton. "Metaphoria: An Algorithmic Companion for Metaphor Creation." CHI 2019. [paper][online paper][talk][demo]
    "A case study of algorithm-assisted decision making in child maltreatment hotline screening decisions" - [paper]

    Additional Links:
    Compton, Kate, and Michael Mateas. "Casual Creators." ICCC 2015. [paper]
    Fiebrink, Rebecca, Dan Trueman, and Perry R. Cook. "A Meta-Instrument for Interactive, On-the-Fly Machine Learning." NIME 2009. [paper][talk][tool]
    Huang, Cheng-Zhi Anna, et al. "The Bach Doodle: Approachable music composition with machine learning at scale." ISMIR 2019. [paper][blog][doodle]

    1h 14m

Ratings & Reviews

4.2 out of 5 (5 Ratings)
