
Table 1 Overview of the boosting approaches

From: Boosting methods for multi-class imbalanced data classification: an experimental review

Approach | Brief description | Year
AdaBoost.M1 [43] A multiclass variation of AdaBoost which uses a multiclass base classifier
The weight of each base classifier is a function of its error rate (a minimal weight-update sketch follows the table)
1997
AdaBoost.M2 [43] A multiclass variation of AdaBoost
The weight of each base classifier is a function of its pseudo-loss
1997
GentleBoost [45] An extended version of AdaBoost which uses Newton steps
Using weighted least-squares regression for fitting the base classifiers
2000
CSB1 [58] A cost-sensitive variation of AdaBoost proposed for handling imbalanced data
Adding a cost term to the weight update formula of AdaBoost
Removing the step-size coefficient from the weight update formula of AdaBoost
2000
CSB2 [58] A cost-sensitive variation of AdaBoost proposed for handling imbalanced data
Adding a cost term to the weight update formula of AdaBoost
Retaining the step size in the weight update formula, as in AdaBoost
2000
MAdaBoost [59] Proposed with the goal of reducing AdaBoost's sensitivity to noise
Modifying the weight update formula of AdaBoost
2000
RareBoost [60, 61] An improvement for AdaBoost
Using different weight update schemes for positive and negative predictions
Considering false positives, true positives, true negatives, and false negatives in the step-size calculation
2001
Modest AdaBoost [62] An improvement of GentleBoost
Using different weight update formulas for misclassified and correctly classified samples
Using an inverted distribution to assign larger weights to correctly classified samples
2005
JOUSBoost [63] Proposed with the goal of handling imbalanced data in the AdaBoost algorithm
Combining jittering of the data and sampling techniques with AdaBoost
2007
ABC-LogitBoost [47] An improvement of LogitBoost for multiclass classification
Solving the difficulties of the dense Hessian matrix in the logistic loss
2009
AdaBoost.HM [64] A multiclass variation of AdaBoost which uses the hypothesis margin
Using multiclass base classifiers instead of decomposing the multiclass classification problem into multiple binary problems
2010
RAMOBoost [65] Proposed with the goal of handling imbalanced data
Combining Ranked Minority Oversampling with AdaBoost.M2
Using a sampling probability distribution for ranking the minority-class samples
2010
AOSO-LogitBoost [48] A one-versus-one version of LogitBoost for multiclass classification
Solving the difficulties of the dense Hessian matrix in the logistic loss by utilizing vector-tree and adaptive block coordinate descent techniques
2011
CD-MCBoost [66] Performing coordinate descent on a multiclass loss function
Concentrating each base classifier on margin maximization for a single class
2011
EUSBoost [67] An improvement of RUSBoost which uses evolutionary undersampling
Using different subsets of majority class samples in the training phase of each base classifier to ensure diversity
2013
RB-Boost [68] Combining Random Balance with AdaBoost.M2
Using SMOTE sampling to deal with the imbalanced data problem
The difference from SMOTEBoost is using random class proportions in each iteration of boosting to ensure the diversity of the base classifiers
2015
LIUBoost [69] Proposed with the goal of handling imbalanced data
Using undersampling to solve the imbalanced data problem
Adding a cost term to the weight update formula of the samples
2019
TLUSBoost [70] Proposed with the goal of handling imbalanced data
Using Tomek link-based and redundancy-based undersampling for removing outlier samples
2019
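
Many rows above refer to "the weight update formula of AdaBoost" and to a step size derived from the error rate. The following minimal sketch, assuming scikit-learn decision stumps as base learners (an assumption for illustration, not part of the reviewed paper), shows an AdaBoost.M1-style loop: the weighted error of each round determines both the sample-weight update and the base classifier's voting weight. A comment marks where cost-sensitive variants such as CSB1/CSB2 would insert their per-sample cost term.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_m1(X, y, n_rounds=10):
    """Illustrative AdaBoost.M1-style training loop (sketch, not a reference implementation)."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # uniform initial sample weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        miss = stump.predict(X) != y
        eps = w[miss].sum()                  # weighted error rate of this round
        if eps <= 0.0 or eps >= 0.5:         # AdaBoost.M1 requires error below 0.5
            break
        beta = eps / (1.0 - eps)
        # Weight update: correctly classified samples are down-weighted by beta;
        # cost-sensitive variants such as CSB1/CSB2 add a per-sample cost term here.
        w = w * np.where(miss, 1.0, beta)
        w = w / w.sum()                      # renormalize to a distribution
        learners.append(stump)
        alphas.append(np.log(1.0 / beta))    # voting weight is a function of the error rate
    return learners, alphas

def predict(learners, alphas, X, classes):
    """Weighted vote over the base classifiers."""
    votes = np.zeros((len(X), len(classes)))
    for clf, a in zip(learners, alphas):
        pred = clf.predict(X)
        for k, c in enumerate(classes):
            votes[:, k] += a * (pred == c)
    return np.asarray(classes)[votes.argmax(axis=1)]
```

Pseudo-loss-based variants (AdaBoost.M2, RAMOBoost) and the resampling-based methods in the table replace the error computation or the training sample of each round, but keep this same overall weighting-and-voting structure.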