Package: `jsat.classifiers.neuralnetwork`

Classes:
- **BackPropagationNet**: An implementation of a feed-forward neural network (NN) trained by backpropagation.
- **BackPropagationNet.ActivationFunction**: The activation function applied to the network's neurons; it is used both to predict from inputs and to train the network by propagating errors back through it.
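For example, the logistic sigmoid is a common activation choice because its derivative is cheap to compute during backpropagation. A minimal standalone sketch of the function and its derivative (illustrative only, not JSAT's `ActivationFunction` interface):

```java
// Sketch of a sigmoid activation and its derivative, as used when
// propagating errors backward through a network. Illustrative only;
// JSAT ships its own BackPropagationNet.ActivationFunction implementations.
public class SigmoidSketch {
    // f(x) = 1 / (1 + e^{-x})
    public static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // The derivative expressed in terms of the output y = f(x): y * (1 - y),
    // which is why the forward pass's outputs are reused in the backward pass.
    public static double sigmoidPrime(double y) {
        return y * (1.0 - y);
    }

    public static void main(String[] args) {
        double y = sigmoid(0.0);
        System.out.println(y);               // 0.5
        System.out.println(sigmoidPrime(y)); // 0.25
    }
}
```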
- **BackPropagationNet.WeightInitialization**: Different methods of initializing the weight values before training.
- **DReDNetSimple**: A neural network based on Geoffrey Hinton's deep rectified dropout nets.
- **LVQ**: Learning Vector Quantization (LVQ) is an algorithm that extends SOM to take advantage of label information to perform classification.
- **LVQ.LVQVersion**: There are several LVQ versions, each adding a case in which two learning vectors (LVs) instead of one can be updated.
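The basic LVQ1 update moves the learning vector nearest a training point toward it when their class labels agree, and away when they disagree; the later versions mentioned above add cases that move two LVs. A self-contained sketch of the LVQ1 rule (not JSAT's `LVQ` class):

```java
// Sketch of the LVQ1 update: the learning vector (LV) nearest to a sample
// is pulled toward it when the class labels match, and pushed away otherwise.
public class Lvq1Sketch {
    // Squared Euclidean distance between two vectors.
    public static double dist2(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return s;
    }

    // protos: the learning vectors; protoLabels: their class labels.
    // x, label: one training sample; lr: learning rate.
    public static void update(double[][] protos, int[] protoLabels,
                              double[] x, int label, double lr) {
        int win = 0;
        for (int i = 1; i < protos.length; i++)
            if (dist2(protos[i], x) < dist2(protos[win], x)) win = i;
        double sign = protoLabels[win] == label ? 1.0 : -1.0;
        for (int d = 0; d < x.length; d++)
            protos[win][d] += sign * lr * (x[d] - protos[win][d]);
    }
}
```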
- **LVQLLC**: LVQ with Locally Learned Classifier (LVQ-LLC), an adaptation of the LVQ algorithm devised by the library's author.
- **Perceptron**: A simple algorithm that attempts to find a hyperplane separating two classes.
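The classic perceptron learns that hyperplane by correcting mistakes: whenever a point is misclassified, the weights are nudged toward classifying it correctly, which converges when the classes are linearly separable. A minimal sketch of that rule (unrelated to JSAT's exact API):

```java
// Minimal perceptron: for each misclassified point (labels in {-1, +1}),
// nudge the weight vector and bias toward classifying it correctly.
// Converges when the two classes are linearly separable.
public class PerceptronSketch {
    double[] w;
    double b;

    PerceptronSketch(int dims) { w = new double[dims]; }

    int predict(double[] x) {
        double s = b;
        for (int i = 0; i < x.length; i++) s += w[i] * x[i];
        return s >= 0 ? 1 : -1;
    }

    void train(double[][] xs, int[] ys, int epochs, double lr) {
        for (int e = 0; e < epochs; e++)
            for (int n = 0; n < xs.length; n++)
                if (predict(xs[n]) != ys[n]) {
                    for (int i = 0; i < w.length; i++)
                        w[i] += lr * ys[n] * xs[n][i];
                    b += lr * ys[n];
                }
    }

    public static void main(String[] args) {
        // A tiny linearly separable problem.
        double[][] xs = {{0, 0}, {0, 1}, {2, 2}, {3, 2}};
        int[] ys = {-1, -1, 1, 1};
        PerceptronSketch p = new PerceptronSketch(2);
        p.train(xs, ys, 1000, 0.1);
        System.out.println(p.predict(new double[]{2, 2})); // 1
        System.out.println(p.predict(new double[]{0, 0})); // -1
    }
}
```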
- **RBFNet**: A highly configurable implementation of a Radial Basis Function (RBF) neural network.
- **RBFNet.Phase1Learner**: The first phase of learning an RBF neural network: determining the neuron locations.
- **RBFNet.Phase2Learner**: The second phase of learning an RBF neural network: determining how the neurons are activated to produce the output of the hidden layer.
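Once the neuron locations (phase 1) and their activation widths (phase 2) are fixed, the hidden layer's output is simply the radial basis function evaluated at each center. A standalone sketch using Gaussian activations, which is one common RBF choice (illustrative, not `RBFNet`'s API):

```java
// Sketch of an RBF network's hidden layer: given neuron centers c_i
// (phase 1) and bandwidths sigma_i (phase 2), each hidden unit outputs
// exp(-||x - c_i||^2 / (2 * sigma_i^2)).
public class RbfLayerSketch {
    public static double[] hidden(double[] x, double[][] centers, double[] sigmas) {
        double[] h = new double[centers.length];
        for (int i = 0; i < centers.length; i++) {
            double d2 = 0;
            for (int j = 0; j < x.length; j++)
                d2 += (x[j] - centers[i][j]) * (x[j] - centers[i][j]);
            h[i] = Math.exp(-d2 / (2 * sigmas[i] * sigmas[i]));
        }
        return h;
    }
}
```

An output layer (often a linear model) is then trained on these hidden activations to produce the final prediction.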
- **SGDNetworkTrainer**: A highly configurable and generalized method of training a neural network using Stochastic Gradient Descent. Note: the API of this class may change in the future.
- **SOM**: An implementation of a Self-Organizing Map, also called a Kohonen map.
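A Kohonen map trains by pulling the best-matching unit, and its grid neighbors to a lesser degree, toward each input. A one-dimensional sketch of that update (illustrative only, not JSAT's `SOM` class):

```java
// Sketch of one SOM training step on a 1-D grid of units: find the best
// matching unit (BMU) for the input, then move every unit toward the input,
// weighted by a Gaussian neighborhood function of its grid distance to the BMU.
public class SomSketch {
    public static void step(double[][] units, double[] x, double lr, double radius) {
        int bmu = 0;
        double best = Double.MAX_VALUE;
        for (int i = 0; i < units.length; i++) {
            double d2 = 0;
            for (int j = 0; j < x.length; j++)
                d2 += (x[j] - units[i][j]) * (x[j] - units[i][j]);
            if (d2 < best) { best = d2; bmu = i; }
        }
        for (int i = 0; i < units.length; i++) {
            double g = i - bmu;
            // Neighborhood weight: 1 at the BMU, decaying with grid distance.
            double h = Math.exp(-g * g / (2 * radius * radius));
            for (int j = 0; j < x.length; j++)
                units[i][j] += lr * h * (x[j] - units[i][j]);
        }
    }
}
```

In practice both the learning rate and the neighborhood radius are decayed over the course of training.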
Copyright © 2017. All rights reserved.