Developer Guide and Reference

  • 2021.4
  • 09/27/2021
  • Public Content
Boosting

Boosting is a set of algorithms that build a strong classifier from an ensemble of weighted weak learners by iteratively re-weighting the training samples according to some accuracy measure for the weak learners. A weak learner is a classification or regression algorithm that performs only slightly better than random guessing. Weak learners are usually very simple and fast, and each focuses on classifying a very specific feature.
Boosting algorithms include LogitBoost, BrownBoost, AdaBoost, and others. A Decision Stump classifier is one of the most popular weak learners.
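The interplay described above can be sketched in a few lines. The following is an illustrative sketch only, not the oneDAL API: classic AdaBoost over one-dimensional decision stumps, where each round trains a stump on the weighted dataset and then increases the weight of misclassified samples. Labels are assumed to be in {-1, +1}.

```python
import math

def train_stump(xs, ys, weights):
    """Pick the threshold/polarity stump with the lowest weighted error."""
    best = None
    for thr in sorted(set(xs)):
        for polarity in (+1, -1):
            preds = [polarity if x >= thr else -polarity for x in xs]
            err = sum(w for p, y, w in zip(preds, ys, weights) if p != y)
            if best is None or err < best[0]:
                best = (err, thr, polarity)
    return best  # (weighted error, threshold, polarity)

def adaboost(xs, ys, rounds=5):
    """Train an ensemble of weighted stumps by iterative re-weighting."""
    n = len(xs)
    weights = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        err, thr, pol = train_stump(xs, ys, weights)
        err = max(err, 1e-10)                       # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)     # learner's vote weight
        ensemble.append((alpha, thr, pol))
        # Re-weight: misclassified samples gain weight for the next round.
        preds = [pol if x >= thr else -pol for x in xs]
        weights = [w * math.exp(-alpha * y * p)
                   for w, y, p in zip(weights, ys, preds)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def predict(ensemble, x):
    """Strong classifier: sign of the weighted vote of all weak learners."""
    score = sum(alpha * (pol if x >= thr else -pol)
                for alpha, thr, pol in ensemble)
    return 1 if score >= 0 else -1
```

Note that `train_stump` accepts per-sample weights, which mirrors the requirement below that weak learners support training on weighted datasets.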
In oneDAL, a weak learner is:
  • A classification algorithm for AdaBoost and BrownBoost
  • A regression algorithm for LogitBoost
Weak learners support training the boosting model on weighted datasets.
oneDAL boosting algorithms pass pointers to weak learner training and prediction objects through the parameters of the boosting algorithms. Use the getNumberOfWeakLearners() method to determine the number of weak learners trained.
You can implement your own weak learners by deriving from the appropriate interface classes:
  • Classification for AdaBoost and BrownBoost
  • Regression for LogitBoost
When defining your own weak learners to use with boosting classifiers, make sure the prediction component of your weak learner returns:
  • A number from {-1, 1} in the case of binary classification.
  • A class label from {0, ..., nClasses - 1} for
    nClasses
    > 2.
  • Class probabilities for boosting algorithms that require them, such as SAMME.R AdaBoost. For a description of each boosting algorithm, refer to the corresponding section in this document.
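The three return-value conventions above can be illustrated with minimal stand-ins. These classes are hypothetical sketches for illustration only; they are not the oneDAL interface classes, and a real custom weak learner would derive from the appropriate oneDAL classification or regression interfaces.

```python
class BinaryWeakLearner:
    """Binary boosting (e.g. AdaBoost, BrownBoost): predict in {-1, +1}."""
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, x):
        return 1 if x >= self.threshold else -1

class MulticlassWeakLearner:
    """nClasses > 2: predict a class label from {0, ..., nClasses - 1}."""
    def __init__(self, boundaries):
        self.boundaries = boundaries  # sorted cut points between classes

    def predict(self, x):
        label = 0
        for b in self.boundaries:
            if x >= b:
                label += 1
        return label

class ProbabilisticWeakLearner:
    """Algorithms like SAMME.R AdaBoost: predict class probabilities."""
    def __init__(self, n_classes):
        self.n_classes = n_classes

    def predict_proba(self, x):
        # Placeholder uniform distribution; a real learner would return
        # probabilities that sum to 1 and depend on x.
        return [1.0 / self.n_classes] * self.n_classes
```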

Product and Performance Information

Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.