sugartensor.sg_optimize module¶
class sugartensor.sg_optimize.AdaMaxOptimizer(learning_rate=0.001, beta1=0.9, beta2=0.999, use_locking=False, name='Adamax')[source]¶
Bases: tensorflow.python.training.optimizer.Optimizer
Optimizer that implements the AdaMax algorithm. See [Kingma et al., 2014](http://arxiv.org/abs/1412.6980) ([pdf](http://arxiv.org/pdf/1412.6980.pdf)).
Excerpted from https://github.com/openai/iaf/blob/master/tf_utils/adamax.py
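To illustrate what this optimizer computes, here is a minimal plain-Python sketch of a single AdaMax update as described in Kingma et al., 2014 (first-moment estimate with momentum `beta1`, an infinity-norm second moment in place of Adam's squared-gradient average, and bias correction on the first moment). The function name `adamax_step` and the `eps` safeguard are illustrative, not part of the sugartensor API:

```python
def adamax_step(theta, grad, m, u, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AdaMax update for a scalar parameter (sketch, not the library code).

    theta: parameter value, grad: gradient at theta,
    m: first-moment estimate, u: infinity-norm second moment, t: step (1-based).
    """
    m = beta1 * m + (1 - beta1) * grad      # exponentially decayed first moment
    u = max(beta2 * u, abs(grad))           # infinity-norm ("max") second moment
    # bias-correct the first moment; u needs no correction (it is a max, not a mean)
    theta = theta - (lr / (1 - beta1 ** t)) * m / (u + eps)
    return theta, m, u

# minimize f(x) = x**2 (gradient 2x) starting from x = 5
x, m, u = 5.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, u = adamax_step(x, 2 * x, m, u, t, lr=0.01)
```

After a few hundred steps `x` settles near the minimum at 0; because `u` is a running max rather than a mean, the effective step size is bounded by roughly `lr`, which is the property that distinguishes AdaMax from Adam.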
class sugartensor.sg_optimize.MaxPropOptimizer(learning_rate=0.001, beta2=0.999, use_locking=False, name='MaxProp')[source]¶
Bases: tensorflow.python.training.optimizer.Optimizer
Optimizer that implements the MaxProp algorithm by buriburisuri@gmail.com.
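The MaxProp algorithm is not formally documented here; judging by its signature (no `beta1`), it appears to be AdaMax without the momentum term, i.e. an RMSProp-style step that normalizes the raw gradient by an infinity-norm second moment. The following plain-Python sketch encodes that assumption; the update rule, the name `maxprop_step`, and the `eps` safeguard are inferred, not taken from the library source:

```python
def maxprop_step(theta, grad, u, lr=0.001, beta2=0.999, eps=1e-8):
    """One assumed MaxProp update for a scalar parameter (sketch).

    Like RMSProp, but the denominator is a decayed running max of |grad|
    (an infinity norm) instead of an exponential mean of squared gradients.
    """
    u = max(beta2 * u, abs(grad))           # infinity-norm second moment
    theta = theta - lr * grad / (u + eps)   # no first-moment (momentum) term
    return theta, u

# minimize f(x) = x**2 (gradient 2x) starting from x = 5
x, u = 5.0, 0.0
for _ in range(2000):
    x, u = maxprop_step(x, 2 * x, u, lr=0.01)
```

Since `|grad| <= u` by construction, each step is bounded by roughly `lr`, so the iterate walks toward the minimum at a capped rate and then oscillates within about one step size of it.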