public class NAdaGrad extends Object implements GradientUpdater

If the input data can be represented as a ScaledVector, where the base
vector is the datum, then NAdaGrad will work as intended. If that is not
the case, NAdaGrad will degenerate into something similar to normal
AdaGrad.
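The updater follows the generic GradientUpdater contract x = x - η f(grad). As a hedged illustration of the plain AdaGrad-style behavior this class is described as degenerating to, here is a minimal self-contained sketch; the class name, field names, and epsilon constant are illustrative assumptions, not JSAT's implementation:

```java
// Minimal sketch (not JSAT's implementation): a plain AdaGrad-style
// update of the form x = x - eta * f(grad), where
// f(grad)_i = grad_i / sqrt(sumSqGrad_i + eps).
class AdaGradSketch {
    private static final double EPS = 1e-8; // illustrative smoothing constant
    private double[] sumSqGrad;             // running sum of squared gradients

    /** Mirrors setup(int d): allocate state for a d-dimensional weight vector. */
    public void setup(int d) {
        sumSqGrad = new double[d];
    }

    /** Mirrors update(Vec w, Vec grad, double eta): mutates w in place. */
    public void update(double[] w, double[] grad, double eta) {
        for (int i = 0; i < w.length; i++) {
            sumSqGrad[i] += grad[i] * grad[i];
            w[i] -= eta * grad[i] / Math.sqrt(sumSqGrad[i] + EPS);
        }
    }

    public static void main(String[] args) {
        AdaGradSketch u = new AdaGradSketch();
        u.setup(2);
        double[] w = {0.0, 0.0};
        u.update(w, new double[]{1.0, -2.0}, 0.1);
        // on the first step each coordinate moves by about eta,
        // in the direction opposite the gradient's sign
        System.out.println(w[0] + " " + w[1]);
    }
}
```

Note that, per the interface contract, `w` is mutated in place; no intermediate vector for f(grad) ever needs to be materialized.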
Constructor | Description
---|---
NAdaGrad() | Creates a new NAdaGrad updater
NAdaGrad(NAdaGrad toCopy) | Copy constructor
Modifier and Type | Method | Description
---|---|---
NAdaGrad | clone() |
void | setup(int d) | Sets up this updater to update a weight vector of dimension d by a gradient of the same dimension
void | update(Vec w, Vec grad, double eta) | Updates the weight vector x such that x = x - η f(grad), where f(grad) is some function on the gradient that effectively returns a new vector.
double | update(Vec w, Vec grad, double eta, double bias, double biasGrad) | Updates the weight vector x such that x = x - η f(grad), where f(grad) is some function on the gradient that effectively returns a new vector.
public NAdaGrad()

Creates a new NAdaGrad updater.

public NAdaGrad(NAdaGrad toCopy)

Copy constructor.

Parameters:
toCopy - the object to copy

public void update(Vec w, Vec grad, double eta)
Description copied from interface: GradientUpdater

Updates the weight vector x such that x = x - η f(grad), where f(grad)
is some function on the gradient that effectively returns a new vector.
It is not necessary for the internal implementation to ever explicitly
form any of these objects, so long as x is mutated to have the correct
result.

Specified by:
update in interface GradientUpdater

Parameters:
w - the vector to mutate such that it has been updated by the gradient
grad - the gradient to update the weight vector x from
eta - the learning rate to apply

public double update(Vec w, Vec grad, double eta, double bias, double biasGrad)
Description copied from interface: GradientUpdater

Updates the weight vector x such that x = x - η f(grad), where f(grad)
is some function on the gradient that effectively returns a new vector.
It is not necessary for the internal implementation to ever explicitly
form any of these objects, so long as x is mutated to have the correct
result.

Specified by:
update in interface GradientUpdater

Parameters:
w - the vector to mutate such that it has been updated by the gradient
grad - the gradient to update the weight vector x from
eta - the learning rate to apply
bias - the bias term of the vector
biasGrad - the gradient for the bias term

Returns:
the value to subtract from the bias term, i.e. bias = bias - returnValue
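The five-argument overload returns the analogous step for the scalar bias term, and the caller applies it as bias = bias - returnValue. A hedged, self-contained illustration of that convention, using a deliberately trivial f, plain arrays, and hypothetical names rather than JSAT's Vec API:

```java
// Hypothetical illustration (not JSAT code) of the return-value
// convention: update(...) returns the amount to subtract from bias.
class BiasUpdateDemo {
    // A trivial f (a plain gradient step), so returnValue = eta * biasGrad.
    // The current bias is accepted to mirror the documented signature,
    // even though this trivial f does not need it.
    static double update(double[] w, double[] grad, double eta,
                         double bias, double biasGrad) {
        for (int i = 0; i < w.length; i++) {
            w[i] -= eta * grad[i];       // mutate the weights in place
        }
        return eta * biasGrad;           // caller does: bias -= returnValue
    }

    public static void main(String[] args) {
        double[] w = {1.0};
        double bias = 0.5;
        double ret = update(w, new double[]{2.0}, 0.1, bias, 3.0);
        bias = bias - ret;               // the documented convention
        System.out.println(bias);        // approximately 0.5 - 0.1 * 3.0
    }
}
```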
public NAdaGrad clone()

Specified by:
clone in interface GradientUpdater

Overrides:
clone in class Object
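Because a gradient updater accumulates per-dimension state across calls, the copy constructor and clone() must deep-copy that state so the original and the copy evolve independently. A hedged sketch of the idea; the field name is an assumption, not JSAT's internals:

```java
// Illustrative sketch (not JSAT code): a stateful updater whose copy
// constructor deep-copies its accumulator so copies are independent.
class CopySketch {
    double[] sumSqGrad;                       // hypothetical per-dimension state

    CopySketch(int d) {
        sumSqGrad = new double[d];
    }

    CopySketch(CopySketch toCopy) {
        // clone() on the array yields an independent copy of the state;
        // sharing the reference would let the two updaters corrupt each other
        this.sumSqGrad = toCopy.sumSqGrad.clone();
    }
}
```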
public void setup(int d)

Description copied from interface: GradientUpdater

Sets up this updater to update a weight vector of dimension d by a
gradient of the same dimension.

Specified by:
setup in interface GradientUpdater

Parameters:
d - the dimension of the weight vector that will be updated

Copyright © 2017. All rights reserved.