
16 Episodes

Argmax, by Vahe Hagopian, Taka Hasegawa, and Farrukh Rahman
-
- Mathematics
-
5.0 • 1 Rating
-
A show where three machine learning enthusiasts talk about recent papers and developments in machine learning. Watch our video on YouTube https://www.youtube.com/@argmaxfm
-
16: LoRA
We talk about LoRA (Low-Rank Adaptation) for fine-tuning Transformers. We are also on YouTube now! Check out the video here: https://youtu.be/lLzHr0VFi3Y
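As a rough illustration of the idea behind the episode, here is a minimal sketch of a LoRA-style layer in PyTorch. The layer sizes, rank, scaling, and initialization are illustrative assumptions, not the paper's reference implementation.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Linear layer with a frozen weight plus a trainable low-rank update (illustrative sketch)."""
    def __init__(self, in_features, out_features, rank=8, alpha=16.0):
        super().__init__()
        # Frozen pretrained weight W (would normally be loaded from the base model).
        self.weight = nn.Parameter(torch.empty(out_features, in_features), requires_grad=False)
        nn.init.normal_(self.weight, std=0.02)
        # Trainable low-rank factors A and B; only these are updated during fine-tuning.
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x):
        # y = x W^T + scaling * (x A^T) B^T  (W stays fixed, B A is the low-rank delta)
        base = x @ self.weight.T
        delta = (x @ self.lora_A.T) @ self.lora_B.T
        return base + self.scaling * delta

layer = LoRALinear(768, 768, rank=8)
y = layer(torch.randn(4, 768))  # toy forward pass
```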
-
15: InstructGPT
In this episode we discuss the paper "Training language models to follow instructions with human feedback" by Ouyang et al. (2022). We discuss the RLHF paradigm and how important RL is for tuning GPT.
-
14: Whisper
This week we talk about Whisper, a weakly supervised speech recognition model.
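For context, a minimal transcription call with the open-source whisper package might look like the sketch below; the model size and audio file name are placeholder assumptions, not something from the episode.

```python
# Minimal usage sketch of the open-source `whisper` package (pip install openai-whisper).
import whisper

model = whisper.load_model("base")              # load a pretrained checkpoint ("base" is a placeholder choice)
result = model.transcribe("episode_audio.mp3")  # run speech-to-text on an audio file (hypothetical path)
print(result["text"])                           # transcript as a single string
```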
-
13: AlphaTensor
We talk about AlphaTensor and how researchers were able to discover new algorithms for matrix multiplication.
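To give a flavor of what "a new algorithm for matrix multiplication" means, here is Strassen's classic 2x2 scheme, which uses 7 scalar multiplications instead of the naive 8. AlphaTensor searches for decompositions of this kind; this sketch is only an illustration of the problem, not AlphaTensor's output.

```python
import numpy as np

def strassen_2x2(A, B):
    """Multiply two 2x2 matrices with 7 scalar multiplications (Strassen's scheme)."""
    a, b, c, d = A[0, 0], A[0, 1], A[1, 0], A[1, 1]
    e, f, g, h = B[0, 0], B[0, 1], B[1, 0], B[1, 1]
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return np.array([[m1 + m4 - m5 + m7, m3 + m5],
                     [m2 + m4,           m1 - m2 + m3 + m6]])

A, B = np.random.rand(2, 2), np.random.rand(2, 2)
assert np.allclose(strassen_2x2(A, B), A @ B)  # matches the standard product
```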
-
12: SIRENs
In this episode we talk about "Implicit Neural Representations with Periodic Activation Functions" and the strength of periodic non-linearities.
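A minimal sketch of the periodic non-linearity the episode refers to is shown below: an MLP whose layers apply sin() to a scaled linear map and fit a signal from coordinates. The layer sizes and the simplified initialization are assumptions, not the paper's full recipe.

```python
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """Linear layer followed by a sin() non-linearity, as in SIREN (illustrative sketch).
    omega_0 = 30 follows the paper's default; the initialization here is simplified."""
    def __init__(self, in_features, out_features, omega_0=30.0):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        return torch.sin(self.omega_0 * self.linear(x))

# Tiny implicit representation: map 2D coordinates to a scalar value (e.g. pixel intensity).
siren = nn.Sequential(SineLayer(2, 64), SineLayer(64, 64), nn.Linear(64, 1))
coords = torch.rand(1024, 2)   # sampled (x, y) coordinates in [0, 1]
values = siren(coords)         # predicted signal values at those coordinates
```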
-
11: CVPR Workshop on Autonomous Driving Keynote by Ashok Elluswamy, a Tesla engineer
In this episode we discuss this video (https://youtu.be/jPCV4GKX9Dw) and how Tesla approaches collision detection with novel methods.