53 min

DT6. AI Intro and Intuitions Paul, Deon & Tong

    • Tech News

OK, we are not experts or PhDs, so some of this is probably not technically precise. But the math and the concepts are complicated enough that we thought it would be useful to build some intuition about what is happening. This is a quick talk summarizing readings from many different sources on the history of AI, from the 1950s all the way to January 2024 or so.

It is really hard to keep up, and much harder without some intuition about what is happening. We cover expert systems, the emergence of neural networks, Convolutional Neural Networks, and Recurrent Neural Networks. The "Attention Is All You Need" paper led to Transformers. We then offer some intuition on how such a simple idea, just training on lots of data, can lead to emergent and unexpected behaviors, and finally some intuition on how generative image models work.

You can see the slides we are using on YouTube, with more information at Tongfamily.com.

Chapters:
23:25 "Attention Is All You Need" and Transformers
27:36 But it's too Simple! Emergence is surprising
33:04 What emerges inside a Transformer?
43:01 One Model to Rule Them All
47:54 Works for image generation too




---

Send in a voice message: https://podcasters.spotify.com/pod/show/richt/message
