public class SoftSignLayer extends Object implements ActivationLayer
This layer provides the soft sign activation function, f(x) = x / (1 + |x|), which is similar in shape to the tanh activation and has a min/max of -1 and 1. However, it is significantly faster to compute.
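As a rough, self-contained sketch (not the library's code; the class and method names below are invented for illustration), the soft sign function and its derivative can be computed with plain java.lang.Math:

```java
// Illustrative sketch only: f(x) = x / (1 + |x|) saturates toward -1 and 1
// like tanh, but needs no exp() calls, which is why it is cheaper to compute.
public final class SoftSignSketch
{
    /** Soft sign activation for a single value: f(x) = x / (1 + |x|). */
    static double softSign(double x)
    {
        return x / (1.0 + Math.abs(x));
    }

    /** Derivative of the soft sign: f'(x) = 1 / (1 + |x|)^2, used in backpropagation. */
    static double softSignPrime(double x)
    {
        double d = 1.0 + Math.abs(x);
        return 1.0 / (d * d);
    }

    public static void main(String[] args)
    {
        // Compare against tanh: similar range, no exponentials needed.
        for (double x : new double[]{-5, -1, 0, 1, 5})
            System.out.printf("x=%5.1f  softsign=%+.4f  tanh=%+.4f%n",
                              x, softSign(x), Math.tanh(x));
    }
}
```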
| Constructor and Description |
| --- |
| SoftSignLayer() |

| Modifier and Type | Method and Description |
| --- | --- |
| void | activate(Matrix input, Matrix output, boolean rowMajor) Computes the activation function of this layer on the given input. |
| void | activate(Vec input, Vec output) Computes the activation function of this layer on the given input. |
| void | backprop(Matrix input, Matrix output, Matrix delta_partial, Matrix errout, boolean rowMajor) This method computes the backpropagated error to a given layer. |
| void | backprop(Vec input, Vec output, Vec delta_partial, Vec errout) This method computes the backpropagated error to a given layer. |
| SoftSignLayer | clone() |

public void activate(Vec input, Vec output)
Description copied from interface: ActivationLayer
Computes the activation function of this layer on the given input.
Specified by:
activate in interface ActivationLayer
Parameters:
input - the raw input to compute the activation for
output - the location to store the activation in
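A hypothetical usage sketch of the Vec overload follows. It assumes JSAT-style package locations (jsat.linear.DenseVector, jsat.classifiers.neuralnetwork.activations.SoftSignLayer) and the no-argument constructor listed above; adjust the imports to the actual packages in your build.

```java
import jsat.classifiers.neuralnetwork.activations.SoftSignLayer;
import jsat.linear.DenseVector;
import jsat.linear.Vec;

public class ActivateVecExample
{
    public static void main(String[] args)
    {
        SoftSignLayer layer = new SoftSignLayer();

        // Raw input and a pre-allocated vector to receive the activation
        Vec input  = new DenseVector(new double[]{-2.0, 0.0, 3.0});
        Vec output = new DenseVector(input.length());

        layer.activate(input, output);  // output[i] = input[i] / (1 + |input[i]|)
        System.out.println(output);     // approximately [-0.667, 0.000, 0.750]
    }
}
```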

public void activate(Matrix input, Matrix output, boolean rowMajor)

Description copied from interface: ActivationLayer
Computes the activation function of this layer on the given input.
Specified by:
activate in interface ActivationLayer
Parameters:
input - the raw input to compute the activation for
output - the location to store the activation in
rowMajor - true if the information per input is stored in rows, false if the inputs were stored by column. This parameter does not indicate if the matrices themselves are backed by a row or column major implementation

public void backprop(Vec input, Vec output, Vec delta_partial, Vec errout)

Description copied from interface: ActivationLayer
This method computes the backpropagated error to a given layer. delta_partial and errout may point to the same vector object.
Specified by:
backprop in interface ActivationLayer
Parameters:
input - the input to this layer that was fed in to be activated
output - the activation that was produced for this layer
delta_partial - the error assigned to this layer from the above layer, sans the Hadamard product with the derivative of the layer activation. Often denoted as (W^(l+1))^T δ^(l+1)
errout - the delta value or error produced for this layer
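The following is a minimal illustration (plain arrays, not the library's implementation) of the quantity this method produces: the Hadamard product of delta_partial with the soft sign derivative, where f'(x) = 1 / (1 + |x|)^2, or equivalently (1 - |f(x)|)^2 when expressed through the cached activation.

```java
// Illustrative sketch of a soft sign backprop step:
// errout[i] = delta_partial[i] * f'(x_i), with f'(x) = 1 / (1 + |x|)^2.
// Since output[i] = f(x_i), the derivative can also be written (1 - |output[i]|)^2.
public final class SoftSignBackpropSketch
{
    static void backprop(double[] input, double[] output,
                         double[] deltaPartial, double[] errout)
    {
        for (int i = 0; i < input.length; i++)
        {
            double d = 1.0 - Math.abs(output[i]);   // 1 / (1 + |input[i]|), via the cached activation
            errout[i] = deltaPartial[i] * d * d;    // Hadamard product with the derivative
        }
    }
}
```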

public void backprop(Matrix input, Matrix output, Matrix delta_partial, Matrix errout, boolean rowMajor)

Description copied from interface: ActivationLayer
This method computes the backpropagated error to a given layer. delta_partial and errout may point to the same matrix object.
Specified by:
backprop in interface ActivationLayer
Parameters:
input - the input to this layer that was fed in to be activated
output - the activation that was produced for this layer
delta_partial - the error assigned to this layer from the above layer, sans the Hadamard product with the derivative of the layer activation. Often denoted as (W^(l+1))^T δ^(l+1)
errout - the delta value or error produced for this layer
rowMajor - true if the information per input is stored in rows, false if the inputs were stored by column. This parameter does not indicate if the matrices themselves are backed by a row or column major implementation

public SoftSignLayer clone()

Specified by:
clone in interface ActivationLayer
Overrides:
clone in class Object