32 min

OLMo: Everything You Need to Train an Open Source LLM with Akshita Bhagia
The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

    • Technology

Today we’re joined by Akshita Bhagia, a senior research engineer at the Allen Institute for AI. Akshita joins us to discuss OLMo, a new open source language model released in 7-billion- and 1-billion-parameter variants, with a key difference from similar models offered by Meta, Mistral, and others: AI2 has also published the dataset and the key tools used to train the model. In our chat with Akshita, we dig into the OLMo models and the various projects falling under the OLMo umbrella, including Dolma, an open three-trillion-token corpus for language model pretraining, and Paloma, a benchmark and tooling for evaluating language model performance across a variety of domains.

The complete show notes for this episode can be found at twimlai.com/go/674.
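As a minimal sketch of what "open weights" means in practice, the snippet below loads an OLMo checkpoint for inference with Hugging Face Transformers. This example is not from the episode; the model ID "allenai/OLMo-1B" and the need for trust_remote_code are assumptions, so check AI2's repositories and the show notes for the exact identifiers and loading instructions.

```python
# Hypothetical sketch: load an OLMo checkpoint and generate text.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-1B"  # assumed Hub ID; the 7B variant would be analogous
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Language modeling is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```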
