In this Machine Learning 101 episode, we explain ensemble modelling—how combining multiple models can create one powerful predictor. You’ll learn the difference between bagging and boosting, then dive into two of the most popular tree-based ensembles: Random Forests (many “randomised” decision trees voting or averaging together to reduce overfitting) and Gradient Boosted Trees (trees built sequentially, each one correcting the mistakes of the ensemble so far). We use simple, real-world examples, then add an advanced section on key concepts such as out-of-bag (OOB) error. We finish with evaluation tips, common pitfalls, and a quick note on bias and responsible use.
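For listeners who want to try the two ensembles discussed in the episode, here is a minimal sketch using scikit-learn. The dataset, model names, and parameters are illustrative choices, not something prescribed by the episode:

```python
# Illustrative sketch of the two tree ensembles discussed in the episode.
# Dataset and hyperparameters are arbitrary examples.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging-style ensemble: many randomised trees vote on each prediction.
# oob_score=True computes the out-of-bag (OOB) score -- accuracy measured
# on the samples each tree never saw during its bootstrap draw.
rf = RandomForestClassifier(n_estimators=100, oob_score=True, random_state=0)
rf.fit(X_train, y_train)
print("Random forest OOB accuracy:", rf.oob_score_)

# Boosting-style ensemble: shallow trees built sequentially, each fitted
# to the errors of the ensemble built so far.
gb = GradientBoostingClassifier(n_estimators=100, random_state=0)
gb.fit(X_train, y_train)
print("Gradient boosting test accuracy:", gb.score(X_test, y_test))
```

Note that the OOB score gives the random forest a built-in validation estimate without a separate hold-out set, which is why the episode singles it out.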
Information
- Show
- Published: 15 February 2026 at 5:02 p.m. UTC
- Duration: 4 min
- Season: 1
- Episode: 4
- Rating: Clean
