Packages that use ActivationLayer

Package | Description
---|---
jsat.classifiers.neuralnetwork |
jsat.classifiers.neuralnetwork.activations |
Methods in jsat.classifiers.neuralnetwork with parameters of type ActivationLayer

Modifier and Type | Method and Description
---|---
void | SGDNetworkTrainer.setLayersActivation(List<ActivationLayer> layersActivation). Sets the list of layer activations for all layers other than the input layer.
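As a usage illustration, here is a minimal sketch of wiring activations into a trainer. Only setLayersActivation itself is taken from the table above; the no-argument constructors for SGDNetworkTrainer, ReLU, and SoftmaxLayer (the latter two are listed in the next table) are assumptions.

```java
import java.util.Arrays;
import java.util.List;

import jsat.classifiers.neuralnetwork.SGDNetworkTrainer;
import jsat.classifiers.neuralnetwork.activations.ActivationLayer;
import jsat.classifiers.neuralnetwork.activations.ReLU;
import jsat.classifiers.neuralnetwork.activations.SoftmaxLayer;

public class ActivationSetupSketch
{
    public static void main(String[] args)
    {
        // Assumption: SGDNetworkTrainer has a no-argument constructor.
        SGDNetworkTrainer trainer = new SGDNetworkTrainer();

        // One activation per non-input layer: two ReLU hidden layers,
        // then a SoftmaxLayer on top for classification.
        List<ActivationLayer> activations =
                Arrays.asList(new ReLU(), new ReLU(), new SoftmaxLayer());

        trainer.setLayersActivation(activations);
    }
}
```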
Classes in jsat.classifiers.neuralnetwork.activations that implement ActivationLayer

Modifier and Type | Class and Description
---|---
class | LinearLayer
class | ReLU. This Activation Layer is for Rectified Linear Units.
class | SigmoidLayer. This layer provides the standard Sigmoid activation f(x) = 1/(1+exp(-x)).
class | SoftmaxLayer. This activation layer is meant to be used as the top-most layer for classification problems; it uses the softmax function to convert the inputs into probabilities.
class | SoftSignLayer. This provides the Soft Sign activation function f(x) = x/(1+abs(x)), which is similar to the tanh activation and has a min/max of -1 and 1.
class | TanhLayer. This layer provides the standard tanh activation f(x) = tanh(x).
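For reference, the formulas quoted in the class descriptions above can be checked with plain Java (no JSAT types involved). The ReLU formula max(0, x) is the standard rectified-linear definition and is not quoted from the table.

```java
public class ActivationFormulas
{
    // f(x) = 1/(1+exp(-x)), per the SigmoidLayer description.
    static double sigmoid(double x)  { return 1.0 / (1.0 + Math.exp(-x)); }

    // f(x) = x/(1+abs(x)), per the SoftSignLayer description; bounded in (-1, 1).
    static double softSign(double x) { return x / (1.0 + Math.abs(x)); }

    // f(x) = tanh(x), per the TanhLayer description.
    static double tanh(double x)     { return Math.tanh(x); }

    // max(0, x): the standard ReLU definition (an assumption, not quoted above).
    static double relu(double x)     { return Math.max(0.0, x); }

    public static void main(String[] args)
    {
        for (double x : new double[] { -2.0, 0.0, 2.0 })
            System.out.printf("x=%5.1f sigmoid=%.4f softsign=%.4f tanh=%.4f relu=%.1f%n",
                    x, sigmoid(x), softSign(x), tanh(x), relu(x));
    }
}
```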
Methods in jsat.classifiers.neuralnetwork.activations that return ActivationLayer

Modifier and Type | Method and Description
---|---
ActivationLayer | ActivationLayer.clone()
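A short sketch of the covariant clone() listed above; it assumes TanhLayer has a no-argument constructor and that the interface's clone() declares no checked exception.

```java
import jsat.classifiers.neuralnetwork.activations.ActivationLayer;
import jsat.classifiers.neuralnetwork.activations.TanhLayer;

public class CloneSketch
{
    public static void main(String[] args)
    {
        // Assumption: TanhLayer has a no-argument constructor.
        ActivationLayer original = new TanhLayer();

        // clone() returns ActivationLayer (covariant return type), so no
        // cast is needed; the copy is independent of the original.
        ActivationLayer copy = original.clone();

        System.out.println(copy.getClass().getSimpleName());
    }
}
```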