TransformerFAM: Feedback attention is working memory
Arxiv Papers
- Science
The paper proposes Feedback Attention Memory (FAM), an attention mechanism that lets Transformers process arbitrarily long inputs by feeding the model's own latent representations back into attention, giving the network a working memory and improving performance on long-context tasks.
https://arxiv.org/abs/2404.09173
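For a concrete picture of the mechanism, here is a minimal single-head PyTorch sketch of that feedback loop, assuming a block-wise pass in which a small memory segment (FAM) is prepended as extra attention context for each block and is then updated from that block's activations. The function names, shapes, and the exact update rule below are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def attend(q, k, v):
    # plain scaled dot-product attention: softmax(q k^T / sqrt(d)) v
    scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
    return F.softmax(scores, dim=-1) @ v

def fam_forward(x, fam, block_len=64):
    # x:   (seq_len, d) input activations, processed block by block
    # fam: (fam_len, d) feedback-memory tokens (e.g. a learned initial state)
    outputs = []
    for start in range(0, x.size(0), block_len):
        block = x[start:start + block_len]
        # each block attends over [FAM, block]: compressed past + local tokens
        ctx = torch.cat([fam, block], dim=0)
        outputs.append(attend(block, ctx, ctx))
        # feedback step: FAM queries attend over [block, FAM], folding the new
        # block into the memory that the *next* block will see
        fam_ctx = torch.cat([block, fam], dim=0)
        fam = attend(fam, fam_ctx, fam_ctx)
    return torch.cat(outputs, dim=0), fam

# usage: out, fam = fam_forward(torch.randn(1024, 128), torch.randn(8, 128))
```

Because each block only ever attends to block_len + fam_len tokens, compute and memory per step stay constant however long the input grows, which is what lets the loop run over unbounded sequences.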
YouTube: https://www.youtube.com/@ArxivPapers
TikTok: https://www.tiktok.com/@arxiv_papers
Apple Podcasts: https://podcasts.apple.com/us/podcast/arxiv-papers/id1692476016
Spotify: https://podcasters.spotify.com/pod/show/arxiv-papers
---
Support this podcast: https://podcasters.spotify.com/pod/show/arxiv-papers/support
17 min