Summary

Ensemble learning, also called committee-based learning, makes decisions by combining multiple weak (individual) learners. Basic tree-based ensemble methods such as bagging, boosting, and random forests are composites of many small trees. The boosting family (gradient boosting decision trees, AdaBoost, XGBoost, etc.) is currently among the most popular machine learning models because it can achieve both high accuracy and low variance. Ensemble learning also distinguishes itself from individual models by combining diverse weak learners in novel ways, for example with methods like stacking.
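The intuition behind committee-based learning can be made concrete with a small probability calculation: if each of n independent weak learners is correct with probability p > 0.5, the majority vote is correct more often than any single learner, and its accuracy grows with n. A minimal sketch in plain Python (the function name and the choice of p = 0.6 are illustrative assumptions, not from the notebooks):

```python
from math import comb

def majority_vote_accuracy(n_learners: int, p: float) -> float:
    """Probability that a majority vote over n independent binary
    classifiers, each correct with probability p, is correct.
    Assumes n_learners is odd so there are no ties."""
    k_min = n_learners // 2 + 1  # smallest number of correct votes that wins
    return sum(
        comb(n_learners, k) * p**k * (1 - p) ** (n_learners - k)
        for k in range(k_min, n_learners + 1)
    )

# Accuracy of the committee grows as weak learners are added:
for n in (1, 5, 25, 101):
    print(n, round(majority_vote_accuracy(n, 0.6), 3))
```

With p = 0.6, a single learner is right 60% of the time, but a committee of 5 already reaches about 68%, and larger committees approach certainty. This is exactly the effect that bagging and random forests exploit, provided the individual learners' errors are not strongly correlated.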

References

  • [Machine Learning, by Zhihua Zhou] - Chapter 8: Ensemble Learning
  • [Pattern Recognition and Machine Learning, by Christopher Bishop] - Chapter 14: Combining Models
  • [An Introduction to Statistical Learning, by Gareth James, Daniela Witten, Trevor Hastie, and Robert Tibshirani] - Chapter 8: Tree-Based Methods
  • [CMU 10-601] - Fall 2018

Exercises

  • Course Notes
  • XGboost.ipynb
  • RandomForest.ipynb
  • Stacking.ipynb