Artificial Intelligence : Notes
  • Supervised Learning
    • Trees
      • AdaBoost
      • ID3
      • Random Forests
    • Convolutional Neural Networks
    • DNN for Classification
    • K-Nearest Neighbors
    • LDA
    • Linear Regression
    • Logistic Regression
    • Perceptron
    • QDA
    • SVM
  • Unsupervised Learning
    • DBSCAN
    • Deep Autoencoder
    • Generative Adversarial Networks (GAN)
    • K-Means Clustering
    • Principal Component Analysis (PCA)
    • Restricted Boltzmann Machines (RBM)
  • Reinforcement Learning
    • Markov Decision Process
    • Q-Learning
    • Deep Q-Learning
  • Ensemble Strategies
    • Ensemble Learning
    • Fine-tuning and resampling
  • Other Techniques
    • Expectation-Maximization
    • Recurrent Neural Networks

AdaBoost : boosting stumps

There are three main ideas behind the AdaBoost algorithm (in the case of trees):

  1. AdaBoost combines many "weak learners" to make a classification. These weak learners are almost always "stumps" (i.e. trees with one root node and two leaves).
  2. Some stumps get more say in the final classification than others.
  3. Each stump is built by taking the previous stump's mistakes into account: the sample weights are increased on incorrectly classified data, either by reweighting the samples directly or by resampling the data according to that weight distribution (a weighted bootstrap).
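The three ideas above can be sketched in a minimal, illustrative implementation (not an optimized one) using the direct-reweighting variant; the function names and the exhaustive threshold search are my own choices, not from any particular library:

```python
import numpy as np

def fit_adaboost(X, y, n_stumps=10):
    """Minimal AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)      # sample weights, initially uniform
    stumps = []                  # each entry: (feature, threshold, polarity, alpha)
    for _ in range(n_stumps):
        # Idea 1: the weak learner is a stump (one feature, one threshold).
        best, best_err = None, np.inf
        for j in range(d):
            for t in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                    err = w[pred != y].sum()   # weighted error
                    if err < best_err:
                        best_err, best = err, (j, t, pol)
        j, t, pol = best
        # Idea 2: "amount of say" -- low-error stumps get a larger alpha.
        eps = 1e-10  # guards against log(0) when a stump is perfect
        alpha = 0.5 * np.log((1 - best_err + eps) / (best_err + eps))
        pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
        # Idea 3: reweight so misclassified points matter more next round.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((j, t, pol, alpha))
    return stumps

def predict_adaboost(stumps, X):
    """Weighted vote of the stumps: sign of the alpha-weighted sum."""
    score = np.zeros(len(X))
    for j, t, pol, alpha in stumps:
        score += alpha * np.where(pol * (X[:, j] - t) >= 0, 1, -1)
    return np.where(score >= 0, 1, -1)
```

The resampling variant mentioned in point 3 would instead draw a new bootstrap sample from the training set with probabilities proportional to `w` each round and fit an unweighted stump to it; the reweighting form shown here is the more common textbook presentation.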