17 min

Transformer FAM: Feedback attention is working memory
Arxiv Papers


Feedback Attention Memory (FAM) is a Transformer architecture that uses a feedback loop to let the network attend to its own latent representations. This fosters working memory within the Transformer, allowing it to process indefinitely long inputs and improving performance on long-context tasks.
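As a rough illustration of the idea (not the paper's exact architecture), here is a minimal NumPy sketch of block-wise attention with a feedback memory: each block attends to the current tokens plus a few memory vectors, and the memory is updated by attending over the same keys and values, so information can persist across blocks. The block size, number of memory slots, and the feedback_step helper are illustrative assumptions.

```python
# Minimal sketch of feedback attention as working memory.
# BLOCK, MEM_SLOTS, and feedback_step are illustrative, not from the paper.
import numpy as np

D = 16          # model width (assumption)
BLOCK = 8       # tokens processed per step (assumption)
MEM_SLOTS = 4   # number of feedback-memory vectors (assumption)

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, k, v):
    # Standard scaled dot-product attention.
    w = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    return w @ v

def feedback_step(block, memory):
    # Keys/values are the current block plus the memory carried over
    # from earlier blocks, so each block sees a compressed past.
    kv = np.concatenate([memory, block], axis=0)
    out = attend(block, kv, kv)          # token outputs for this block
    new_memory = attend(memory, kv, kv)  # memory attends to itself + block:
    return out, new_memory               # the feedback loop

# Process an arbitrarily long input block by block with per-step cost
# bounded by BLOCK + MEM_SLOTS rather than the full sequence length.
tokens = rng.normal(size=(10 * BLOCK, D))
memory = np.zeros((MEM_SLOTS, D))
outputs = []
for i in range(0, len(tokens), BLOCK):
    out, memory = feedback_step(tokens[i:i + BLOCK], memory)
    outputs.append(out)
outputs = np.concatenate(outputs)
print(outputs.shape)  # (80, 16)
```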



https://arxiv.org/abs/2404.09173



YouTube: https://www.youtube.com/@ArxivPapers



TikTok: https://www.tiktok.com/@arxiv_papers



Apple Podcasts: https://podcasts.apple.com/us/podcast/arxiv-papers/id1692476016



Spotify: https://podcasters.spotify.com/pod/show/arxiv-papers




---

Support this podcast: https://podcasters.spotify.com/pod/show/arxiv-papers/support

