11 min

Episode 4 - Ensemble models: bagging and boosting | Your Data Teacher Podcast

    • Technology

In this episode, I'm going to talk about ensemble models, particularly bagging and boosting. Bagging is very useful for reducing variance, while boosting is used for reducing bias. The most common bagging algorithm is Random Forest, and the most common boosting algorithm is Gradient Boosting, whose most popular implementations are XGBoost, LightGBM and CatBoost.

Home Page: https://www.yourdatateacher.com
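To make the bagging idea concrete, here is a minimal pure-Python sketch (not from the episode; all names and the toy dataset are illustrative): many weak models are each trained on a bootstrap sample of the data, and their predictions are combined by majority vote. Averaging many high-variance learners is what smooths out the variance.

```python
import random
random.seed(0)

# Toy 1-D dataset: the true rule is "class 1 when x > 5" (illustrative)
data = [(x, int(x > 5)) for x in range(11)]

def train_stump(sample):
    # A decision stump: pick the threshold that best fits the bootstrap sample
    best_t, best_acc = 0, 0.0
    for t in range(11):
        acc = sum(int(x > t) == y for x, y in sample) / len(sample)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def bagging_predict(x, stumps):
    # The "aggregating" in bagging: majority vote over the ensemble
    votes = sum(int(x > t) for t in stumps)
    return int(votes * 2 >= len(stumps))

# The "bootstrap" in bagging: sample with replacement, one stump per sample
stumps = [train_stump(random.choices(data, k=len(data))) for _ in range(25)]

# Individual stumps vary, but the vote is stable at the extremes
print([bagging_predict(x, stumps) for x, _ in data])
```

Boosting takes the opposite route: instead of training the weak models independently and in parallel, it trains them sequentially, with each new model focusing on the examples (or residual errors) the previous ones got wrong, which is why it attacks bias rather than variance.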
