Package | Description
---|---
jsat.classifiers.boosting |

Class | Description
---|---
AdaBoostM1 | Implementation of *Experiments with a New Boosting Algorithm*, by Yoav Freund & Robert E. Schapire.
AdaBoostM1PL | An extension to the original AdaBoostM1 algorithm for parallel training.
ArcX4 | Arc-x4 is an ensemble classifier that re-weights the data points based on the total number of errors that have occurred for each data point.
Bagging | An implementation of Bootstrap Aggregating, as described by Leo Breiman in "Bagging Predictors".
EmphasisBoost | Emphasis Boost is a generalization of the Real AdaBoost algorithm, expanding the update term and providing a λ term to control the trade-off.
LogitBoost | An implementation of the original two-class LogitBoost algorithm.
LogitBoostPL | An extension to the original LogitBoost algorithm for parallel training.
ModestAdaBoost | Modest AdaBoost is a generalization of Discrete AdaBoost that attempts to reduce the generalization error and avoid over-fitting.
SAMME | An implementation of the multi-class AdaBoost method SAMME (Stagewise Additive Modeling using a Multi-class Exponential loss function), presented in *Multi-class AdaBoost* by Ji Zhu, Saharon Rosset, Hui Zou, & Trevor Hastie. This algorithm reduces to AdaBoostM1 for binary classification problems.
Stacking | An implementation of the Stacking ensemble method.
UpdatableStacking | An implementation of the Stacking ensemble method meant for updatable models.
Wagging | Wagging is a meta-classifier that is related to Bagging.
WaggingNormal | Wagging using the Normal distribution.
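As a quick orientation to this package, the sketch below shows how one of the listed boosting classes might be constructed and trained. It is a minimal, hypothetical example assuming JSAT's usual `Classifier` interface (`trainC`/`classify`), an `AdaBoostM1(Classifier, int)` constructor, and `DecisionStump` as the weak learner; the 50-round budget is an illustrative choice. Consult each class's Javadoc for the exact signatures.

```java
import jsat.classifiers.CategoricalResults;
import jsat.classifiers.ClassificationDataSet;
import jsat.classifiers.Classifier;
import jsat.classifiers.DataPoint;
import jsat.classifiers.boosting.AdaBoostM1;
import jsat.classifiers.trees.DecisionStump;

public class BoostingExample
{
    /**
     * Trains an AdaBoostM1 ensemble of decision stumps on the given data set
     * and returns the trained classifier.
     */
    public static Classifier trainBoostedStumps(ClassificationDataSet train)
    {
        // Assumed constructor: a weak learner plus a maximum number of
        // boosting rounds (50 here); check the AdaBoostM1 Javadoc for the
        // exact arguments.
        AdaBoostM1 booster = new AdaBoostM1(new DecisionStump(), 50);
        booster.trainC(train); // trainC is JSAT's classification-training method
        return booster;
    }

    /** Returns the most likely class index for a single data point. */
    public static int predict(Classifier trained, DataPoint dp)
    {
        CategoricalResults cr = trained.classify(dp);
        return cr.mostLikely();
    }
}
```

Any of the other ensembles in this package (e.g. Bagging, SAMME, Wagging) could be substituted for AdaBoostM1 in the same way, since they all wrap a base Classifier.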