Class | Description |
---|---|
BackPropagationNet | An implementation of a feed-forward neural network trained by back-propagation. |
BackPropagationNet.ActivationFunction | The activation function used by the network's neurons, both to predict from inputs and to train the network by propagating errors back through it. |
DReDNetSimple | A neural network based on Geoffrey Hinton's deep rectified dropout nets. |
LVQ | Learning Vector Quantization (LVQ) is an algorithm that extends SOM to take advantage of label information and perform classification. |
LVQLLC | LVQ with Locally Learned Classifier (LVQ-LLC), an adaptation of the LVQ algorithm I have come up with. |
Perceptron | The perceptron is a simple algorithm that attempts to find a hyperplane separating two classes. |
RBFNet | A highly configurable implementation of a Radial Basis Function neural network. |
SGDNetworkTrainer | A highly configurable, generalized method of training a neural network using Stochastic Gradient Descent. Note: the API of this class may change in the future. |
SOM | An implementation of a Self-Organizing Map, also called a Kohonen map. |
Enum | Description |
---|---|
BackPropagationNet.WeightInitialization | Different methods of initializing the weight values before training. |
LVQ.LVQVersion | There are several LVQ versions, each adding a further case in which two LVs, instead of one, can be updated. |
RBFNet.Phase1Learner | The first phase of learning an RBF neural network: determining the neuron locations. |
RBFNet.Phase2Learner | The second phase of learning an RBF neural network: determining how the neurons are activated to produce the output of the hidden layer. |
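The LVQ update that the versions above build on (LVQ1, the simplest case, where only the single nearest learning vector moves) can be sketched as follows. This is an illustrative standalone sketch, not the library's `LVQ` class; the prototypes, data, and learning rate are assumptions for the example.

```java
// Hypothetical LVQ1 sketch: each class keeps one prototype ("learning vector").
// The nearest prototype is attracted toward a sample of its own class and
// repelled from a sample of another class.
public class LVQ1Sketch {
    public static void main(String[] args) {
        double[][] protos = {{0, 0}, {5, 5}};   // one prototype per class (0 and 1)
        double[][] xs = {{1, 1}, {0, 1}, {4, 5}, {6, 4}};
        int[] ys = {0, 0, 1, 1};
        double rate = 0.2;
        for (int epoch = 0; epoch < 20; epoch++) {
            for (int i = 0; i < xs.length; i++) {
                int best = nearest(protos, xs[i]);
                double sign = (best == ys[i]) ? 1 : -1;   // attract or repel
                for (int d = 0; d < 2; d++)
                    protos[best][d] += sign * rate * (xs[i][d] - protos[best][d]);
            }
        }
        // Classification is nearest-prototype lookup
        for (int i = 0; i < xs.length; i++)
            System.out.println(nearest(protos, xs[i]) == ys[i]);
    }

    static int nearest(double[][] protos, double[] x) {
        int best = 0;
        double bestDist = Double.MAX_VALUE;
        for (int p = 0; p < protos.length; p++) {
            double dist = 0;
            for (int d = 0; d < x.length; d++)
                dist += (x[d] - protos[p][d]) * (x[d] - protos[p][d]);
            if (dist < bestDist) { bestDist = dist; best = p; }
        }
        return best;
    }
}
```

Later versions (selectable via `LVQ.LVQVersion`) extend this rule with cases where the two nearest learning vectors are updated together.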
Copyright © 2017. All rights reserved.