public class AdaGrad extends Object implements LearningRate
John Duchi, Elad Hazan, and Yoram Singer, "Adaptive Subgradient Methods for Online Learning and Stochastic Optimization", 2010, University of California, Berkeley.

| Constructor and Description |
|---|
| AdaGrad() Creates an instance with a base rate of 0.5. |
| AdaGrad(double baseRate) Create a new instance with the given base rate. |
| Modifier and Type | Method and Description |
|---|---|
| double | getLearningRate(int group, int number, double gradient) Returns the learning rate for the parameter in the given group with the given number. |
| void | reset() Resets the learning rate sequence to its initial state. |
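AdaGrad keeps a per-parameter sum of squared gradients and divides the base rate by its square root, so frequently-updated parameters get smaller steps over time. The following is a minimal sketch of that rule against the method signatures summarized above; the `LearningRate` interface reproduced here, the `AdaGradSketch` class name, and the internal field names are illustrative assumptions, not this library's actual implementation:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical minimal interface, mirroring the method signatures above.
interface LearningRate {
    double getLearningRate(int group, int number, double gradient);
    void reset();
}

// Sketch of the AdaGrad rule: rate_t = baseRate / sqrt(sum of squared
// gradients seen so far), accumulated per (group, number) parameter.
class AdaGradSketch implements LearningRate {
    private final double baseRate;
    private final Map<Long, Double> squaredGradSums = new HashMap<>();

    AdaGradSketch(double baseRate) { this.baseRate = baseRate; }
    AdaGradSketch() { this(0.5); } // matches the documented default of 0.5

    @Override
    public double getLearningRate(int group, int number, double gradient) {
        // Pack (group, number) into a single map key.
        long key = ((long) group << 32) | (number & 0xFFFFFFFFL);
        double sum = squaredGradSums.getOrDefault(key, 0.0) + gradient * gradient;
        squaredGradSums.put(key, sum);
        return baseRate / Math.sqrt(sum);
    }

    @Override
    public void reset() {
        // Return the learning rate sequence to its initial state.
        squaredGradSums.clear();
    }
}

public class AdaGradDemo {
    public static void main(String[] args) {
        LearningRate lr = new AdaGradSketch(0.5);
        // First gradient of 2.0: 0.5 / sqrt(4.0) = 0.25
        System.out.println(lr.getLearningRate(0, 0, 2.0));
        // Second gradient of 2.0 for the same parameter: 0.5 / sqrt(8.0)
        System.out.println(lr.getLearningRate(0, 0, 2.0));
        lr.reset();
        // After reset() the accumulator starts over, so 0.25 again.
        System.out.println(lr.getLearningRate(0, 0, 2.0));
    }
}
```

Note how the returned rate shrinks for a parameter each time it receives a gradient, and how `reset()` restores the initial sequence.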
public AdaGrad(double baseRate)
Create a new instance with the given base rate.
Parameters:
baseRate -

public AdaGrad()
Creates an instance with a base rate of 0.5.

public double getLearningRate(int group,
                              int number,
                              double gradient)
Returns the learning rate for the parameter in the given group with the given number.
Specified by:
getLearningRate in interface LearningRate

public void reset()
Resets the learning rate sequence to its initial state.
Specified by:
reset in interface LearningRate

Copyright © 2017. All rights reserved.