Package | Description
---|---
jsat.classifiers.linear |

Class | Description
---|---
ALMA2 | Provides a linear implementation of the ALMAp algorithm for p = 2, which is considerably more efficient to compute.
AROW | An implementation of Adaptive Regularization of Weight Vectors (AROW), which uses second-order information to learn a large-margin binary classifier.
BBR | An implementation of Bayesian Binary Regression for L1 and L2 regularized logistic regression.
BBR.Prior | Valid priors that control which type of regularization is applied.
LinearBatch | LinearBatch learns either a classification or regression problem, depending on the loss function ℓ(w,x) used.
LinearL1SCD | Implements an iterative, single-threaded form of fast Stochastic Coordinate Descent for optimizing L1 regularized linear regression problems.
LinearSGD | LinearSGD learns either a classification or regression problem, depending on the loss function ℓ(w,x) used.
LogisticRegressionDCD | An implementation of regularized logistic regression using Dual Coordinate Descent.
NewGLMNET | NewGLMNET is a batch method for solving Elastic Net regularized Logistic Regression problems of the form 0.5·(1−α)·‖w‖₂ + α·‖w‖₁ + C·∑_{i=1}^N ℓ(wᵀxᵢ + b, yᵢ).
NHERD | Implementation of the Normal Herd (NHERD) algorithm for learning a linear binary classifier.
NHERD.CovMode | Sets which form of covariance matrix to use.
PassiveAggressive | An implementation of the three versions of the Passive Aggressive algorithm for binary classification and regression.
PassiveAggressive.Mode | Controls which version of the Passive Aggressive update is used.
ROMMA | Provides an implementation of the linear Relaxed Online Maximum Margin Algorithm (ROMMA), which finds a solution similar to that of an SVM.
SCD | Implementation of Stochastic Coordinate Descent for L1 regularized classification and regression.
SCW | Provides an implementation of Confidence-Weighted (CW) learning and Soft Confidence-Weighted (SCW) learning, both of which are binary linear classifiers inspired by PassiveAggressive.
SCW.Mode | Controls which version of the algorithm should be used.
SMIDAS | Implements SMIDAS (Stochastic Mirror Descent Algorithm mAde Sparse), an iterative, single-threaded stochastic solver for L1 regularized linear regression problems.
SPA | Support-class Passive Aggressive (SPA) is a multi-class generalization of PassiveAggressive.
STGD | An implementation of Sparse Truncated Gradient Descent for L1 regularized linear classification and regression on sparse data sets.
StochasticMultinomialLogisticRegression | A stochastic implementation of Multinomial Logistic Regression.
StochasticMultinomialLogisticRegression.Prior | Represents a prior on the coefficients that can be applied to perform regularization.
StochasticSTLinearL1 | This base class provides shared functionality and variables used by two different training algorithms for L1 regularized linear models.
StochasticSTLinearL1.Loss |
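
The classes listed above live in jsat.classifiers.linear and share JSAT's common Classifier interface, so they can generally be swapped for one another once a ClassificationDataSet has been built. Below is a minimal usage sketch assuming the 2017-era JSAT API, in which classifiers are trained with trainC(ClassificationDataSet) and queried with classify(DataPoint); the toy data and class choice are illustrative, and exact constructor and method signatures may differ in other JSAT versions.

```java
import jsat.classifiers.CategoricalData;
import jsat.classifiers.CategoricalResults;
import jsat.classifiers.ClassificationDataSet;
import jsat.classifiers.DataPoint;
import jsat.classifiers.linear.LogisticRegressionDCD;
import jsat.linear.DenseVector;

public class LinearClassifierSketch
{
    public static void main(String[] args)
    {
        // Two numeric features, no categorical features, and a binary target.
        ClassificationDataSet data =
                new ClassificationDataSet(2, new CategoricalData[0], new CategoricalData(2));

        // Tiny, linearly separable toy set: class 0 near the origin, class 1 farther out.
        data.addDataPoint(new DenseVector(new double[]{0.1, 0.2}), new int[0], 0);
        data.addDataPoint(new DenseVector(new double[]{0.3, 0.1}), new int[0], 0);
        data.addDataPoint(new DenseVector(new double[]{2.5, 2.7}), new int[0], 1);
        data.addDataPoint(new DenseVector(new double[]{3.0, 2.2}), new int[0], 1);

        // LogisticRegressionDCD is used here; any binary linear classifier from the
        // table above (AROW, SCW, PassiveAggressive, ...) could be dropped in instead.
        LogisticRegressionDCD model = new LogisticRegressionDCD();
        model.trainC(data); // classification training; newer JSAT releases may name this train(...)

        // Classify a previously unseen point.
        DataPoint query = new DataPoint(new DenseVector(new double[]{2.8, 2.5}),
                new int[0], new CategoricalData[0]);
        CategoricalResults result = model.classify(query);
        System.out.println("Predicted class: " + result.mostLikely());
    }
}
```

The same pattern applies to the other classifiers in the table; classes whose descriptions cover both classification and regression (LinearBatch, LinearSGD) additionally handle regression problems, depending on the loss function chosen.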
Copyright © 2017. All rights reserved.