AdaBoost: bootstrapping stumps
There are three main ideas behind the AdaBoost algorithm (in the case where the weak learners are trees):
- AdaBoost combines a lot of "weak learners" to make its classifications. These weak learners are almost always "stumps" (i.e. trees with a single root node and two leaves, so each stump splits on just one feature).
- Some stumps get more say in the final classification than others: each stump's vote is weighted by how accurately it classifies the training data.
- Each stump is made by taking the previous stump's mistakes into account: the training set is resampled (bootstrapped) from a distribution that puts more weight on the data the previous stump misclassified. Equivalently, and more commonly in practice, the sample weights are updated directly without resampling.
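The three ideas above can be sketched in code. This is a minimal illustration, not a production implementation: it uses the direct reweighting form of AdaBoost (equivalent in expectation to the weighted bootstrapping described above), labels in {-1, +1}, and hypothetical helper names like `fit_stump`.

```python
import numpy as np

def fit_stump(X, y, w):
    """Weak learner: find the (feature, threshold, polarity) split
    minimizing the weighted error -- a tree with one node, two leaves."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - t) > 0, 1, -1)
                err = np.sum(w[pred != y])  # weighted misclassification
                if err < best_err:
                    best_err, best = err, (j, t, polarity)
    return best, best_err

def stump_predict(stump, X):
    j, t, polarity = stump
    return np.where(polarity * (X[:, j] - t) > 0, 1, -1)

def adaboost(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1 / n)               # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump, err = fit_stump(X, y, w)
        err = max(err, 1e-10)           # guard against division by zero
        # "amount of say": accurate stumps get a larger alpha
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(stump, X)
        # up-weight misclassified points, down-weight correct ones,
        # so the next stump focuses on the previous stump's mistakes
        w = w * np.exp(-alpha * y * pred)
        w = w / w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict(stumps, alphas, X):
    # weighted majority vote of all stumps
    votes = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
    return np.sign(votes)
```

Each round trains a stump on the current weights, gives it a say `alpha` based on its error, and reweights the data so the next stump concentrates on what was misclassified.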