Packages that use jsat.classifiers.neuralnetwork.activations:

Package | Description |
---|---|
jsat.classifiers.neuralnetwork | |
jsat.classifiers.neuralnetwork.activations | |
Classes in jsat.classifiers.neuralnetwork.activations used by jsat.classifiers.neuralnetwork:

Class and Description |
---|
ActivationLayer: This interface defines a type of activation layer for use in a Neural Network. |
Classes in jsat.classifiers.neuralnetwork.activations used by jsat.classifiers.neuralnetwork.activations:

Class and Description |
---|
ActivationLayer: This interface defines a type of activation layer for use in a Neural Network. |
LinearLayer |
ReLU: This activation layer is for Rectified Linear Units. |
SigmoidLayer: This layer provides the standard sigmoid activation f(x) = 1/(1+exp(-x)). |
SoftmaxLayer: This activation layer is meant to be used as the top-most layer for classification problems; it uses the softmax function to convert the inputs into probabilities, and is typically paired with a cross entropy loss. |
SoftSignLayer: This provides the soft sign activation function f(x) = x/(1+abs(x)), which is similar to the tanh activation and has a min/max of -1 and 1. |
TanhLayer: This layer provides the standard tanh activation f(x) = tanh(x). |
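For reference, the formulas listed above can be summarized in a short standalone sketch. The code below is a minimal illustration of these activation functions in plain Java; it does not use the JSAT ActivationLayer API (whose method signatures are not shown on this page), and the class and method names here are hypothetical.

```java
import java.util.Arrays;

/**
 * Minimal standalone sketch (not the JSAT API) of the activation
 * functions listed above. All names here are illustrative only.
 */
public class ActivationSketch {

    // ReLU: max(0, x), applied element-wise
    static double relu(double x) {
        return Math.max(0, x);
    }

    // Sigmoid: f(x) = 1 / (1 + exp(-x)), range (0, 1)
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Soft sign: f(x) = x / (1 + |x|), range (-1, 1), similar shape to tanh
    static double softSign(double x) {
        return x / (1.0 + Math.abs(x));
    }

    // Tanh: f(x) = tanh(x), range (-1, 1)
    static double tanh(double x) {
        return Math.tanh(x);
    }

    // Softmax: converts a vector of raw scores into probabilities that
    // sum to 1; the max score is subtracted first for numerical stability.
    static double[] softmax(double[] scores) {
        double max = Arrays.stream(scores).max().orElse(0.0);
        double[] out = new double[scores.length];
        double sum = 0.0;
        for (int i = 0; i < scores.length; i++) {
            out[i] = Math.exp(scores[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++)
            out[i] /= sum;
        return out;
    }

    public static void main(String[] args) {
        double[] scores = {2.0, 1.0, 0.1};
        System.out.println(Arrays.toString(softmax(scores)));
        // prints roughly [0.659, 0.242, 0.099]
    }
}
```

As the table notes, a softmax layer would sit at the top of a classification network, since it is the step that turns raw scores into a probability distribution over classes.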