YuLan-Mini: A Data-Efficient Language Model

Neural intel Pod

YuLan-Mini is a data-efficient large language model (LLM) developed by researchers at the Gaoling School of Artificial Intelligence, Renmin University of China. It is designed to achieve high performance while using significantly fewer computational and data resources than other large-scale models.
