50 min

The importance of open source in GenAI
Targeting AI


The rise of generative AI has also brought renewed interest and growth in open source technology. But the question of open source is still "open" in generative AI.
Sometimes, the code is open -- other times, the training data and weights are open.
A leader in the open source large language model arena is Meta. However, despite the popularity of the social media giant's Llama family of large language models (LLMs), some say Meta's LLMs are not fully open source.
One vendor that built on top of Llama is Lightning AI.
Lightning AI is known for PyTorch Lightning, an open source Python library that provides a high-level interface for PyTorch, the deep learning framework.
In March, Lightning rolled out Thunder, a source-to-source compiler for PyTorch that speeds up the training and serving of generative AI (GenAI) models across multiple GPUs.
In April 2023, Lightning introduced Lit-Llama.
The vendor built Lit-Llama starting with code from nanoGPT, a small-scale GPT for text generation created by Andrej Karpathy, a co-founder of OpenAI and former director of AI at Tesla. Lit-Llama is a fully open implementation of Llama's source code, according to Lightning.
Being able to create on top of Llama highlights the importance of "hackable" technology, Lightning AI CTO Luca Antiga said on the Targeting AI podcast from TechTarget Editorial.
"The moment it's hackable is the moment people can build on top of it," Antiga said.
However, the mechanisms of open source have yet to be fully developed in GenAI technology, Antiga continued.
He also said it's unlikely that open source models will outperform proprietary ones.
"Open source will tend to keep model size low and more and more capable, which is really enabling and really groundbreaking, and closed source will try to win out by scaling out, probably," Antiga said. "It's a very nice race."
Esther Ajao is a TechTarget Editorial news writer and podcast host covering artificial intelligence software and systems. Shaun Sutner is senior news director for TechTarget Editorial's information management team, driving coverage of artificial intelligence, unified communications, analytics and data management technologies. Together, they host the Targeting AI podcast series.
 

