126 - Optimizing Continuous Prompts for Generation, with Lisa Li
NLP Highlights

We invited Lisa Li to talk about her recent work, Prefix-Tuning: Optimizing Continuous Prompts for Generation. Prefix tuning is a lightweight alternative to finetuning: instead of updating all of the pretrained transformer's parameters, it tunes only a small, fixed-length, task-specific continuous vector (the prefix) and keeps the pretrained parameters frozen. We discussed how prefix tuning compares with finetuning and other parameter-efficient alternatives on two generation tasks (table-to-text and summarization) in various experimental settings, and in which scenarios prefix tuning is preferable.
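
For readers who want to see the mechanics, here is a minimal PyTorch sketch of the idea. It is not the authors' implementation: the paper injects trainable prefix activations at every transformer layer (as past key-value pairs), whereas this toy version only prepends trainable vectors at the input embedding layer, and all names (PrefixTuned, etc.) are illustrative.

    import torch
    import torch.nn as nn

    class PrefixTuned(nn.Module):
        """Wraps a frozen model; trains only a fixed-length continuous prefix."""

        def __init__(self, base_model: nn.Module, embed: nn.Embedding,
                     prefix_len: int = 10):
            super().__init__()
            self.base_model = base_model
            self.embed = embed
            d_model = embed.embedding_dim
            # The only trainable parameters: a fixed-length continuous prefix.
            self.prefix = nn.Parameter(torch.randn(prefix_len, d_model) * 0.02)
            # Freeze the pretrained weights; only the prefix receives gradients.
            for p in self.base_model.parameters():
                p.requires_grad = False
            for p in self.embed.parameters():
                p.requires_grad = False

        def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
            batch = input_ids.size(0)
            tok = self.embed(input_ids)                           # (B, T, d)
            pre = self.prefix.unsqueeze(0).expand(batch, -1, -1)  # (B, P, d)
            # Prepend the prefix so the frozen model attends to it.
            return self.base_model(torch.cat([pre, tok], dim=1))

    # Toy usage: a small TransformerEncoder stands in for the pretrained LM.
    embed = nn.Embedding(1000, 64)
    body = nn.TransformerEncoder(
        nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
        num_layers=2)
    model = PrefixTuned(body, embed, prefix_len=10)
    opt = torch.optim.Adam([model.prefix], lr=1e-3)   # prefix-only optimizer
    out = model(torch.randint(0, 1000, (2, 16)))      # shape (2, 26, 64)

Only the prefix receives gradients, which is what makes the approach lightweight: each new task adds one small vector rather than a full copy of the model.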

Lisa is a PhD student at Stanford University. Lisa's webpage: https://xiangli1999.github.io/

The hosts for this episode are Pradeep Dasigi and Ana Marasović.

47 min