
Searched refs: Adagrad (Results 1 – 25 of 41), sorted by relevance

/external/tensorflow/tensorflow/core/api_def/base_api/
api_def_RetrieveTPUEmbeddingMDLAdagradLightParameters.pbtxt
7 Parameter parameters updated by the MDL Adagrad Light optimization algorithm.
13 Parameter accumulators updated by the MDL Adagrad Light optimization algorithm.
19 Parameter weights updated by the MDL Adagrad Light optimization algorithm.
25 Parameter benefits updated by the MDL Adagrad Light optimization algorithm.
28 summary: "Retrieve MDL Adagrad Light embedding parameters."
api_def_LoadTPUEmbeddingMDLAdagradLightParameters.pbtxt
7 Value of parameters used in the MDL Adagrad Light optimization algorithm.
13 Value of accumulators used in the MDL Adagrad Light optimization algorithm.
19 Value of weights used in the MDL Adagrad Light optimization algorithm.
25 Value of benefits used in the MDL Adagrad Light optimization algorithm.
28 summary: "Load MDL Adagrad Light embedding parameters."
api_def_RetrieveTPUEmbeddingAdagradParametersGradAccumDebug.pbtxt
7 Parameter parameters updated by the Adagrad optimization algorithm.
13 Parameter accumulators updated by the Adagrad optimization algorithm.
19 Parameter gradient_accumulators updated by the Adagrad optimization algorithm.
22 summary: "Retrieve Adagrad embedding parameters with debug support."
api_def_RetrieveTPUEmbeddingProximalAdagradParametersGradAccumDebug.pbtxt
7 Parameter parameters updated by the proximal Adagrad optimization algorithm.
13 Parameter accumulators updated by the proximal Adagrad optimization algorithm.
19 Parameter gradient_accumulators updated by the proximal Adagrad optimization algorithm.
22 summary: "Retrieve proximal Adagrad embedding parameters with debug support."
api_def_LoadTPUEmbeddingProximalAdagradParametersGradAccumDebug.pbtxt
7 Value of parameters used in the proximal Adagrad optimization algorithm.
13 Value of accumulators used in the proximal Adagrad optimization algorithm.
19 Value of gradient_accumulators used in the proximal Adagrad optimization algorithm.
22 summary: "Load proximal Adagrad embedding parameters with debug support."
api_def_LoadTPUEmbeddingAdagradParametersGradAccumDebug.pbtxt
7 Value of parameters used in the Adagrad optimization algorithm.
13 Value of accumulators used in the Adagrad optimization algorithm.
19 Value of gradient_accumulators used in the Adagrad optimization algorithm.
22 summary: "Load Adagrad embedding parameters with debug support."
api_def_RetrieveTPUEmbeddingAdagradParameters.pbtxt
7 Parameter parameters updated by the Adagrad optimization algorithm.
13 Parameter accumulators updated by the Adagrad optimization algorithm.
16 summary: "Retrieve Adagrad embedding parameters."
api_def_RetrieveTPUEmbeddingProximalAdagradParameters.pbtxt
7 Parameter parameters updated by the proximal Adagrad optimization algorithm.
13 Parameter accumulators updated by the proximal Adagrad optimization algorithm.
16 summary: "Retrieve proximal Adagrad embedding parameters."
api_def_LoadTPUEmbeddingAdagradParameters.pbtxt
7 Value of parameters used in the Adagrad optimization algorithm.
13 Value of accumulators used in the Adagrad optimization algorithm.
16 summary: "Load Adagrad embedding parameters."
api_def_LoadTPUEmbeddingProximalAdagradParameters.pbtxt
7 Value of parameters used in the proximal Adagrad optimization algorithm.
13 Value of accumulators used in the proximal Adagrad optimization algorithm.
16 summary: "Load proximal Adagrad embedding parameters."
api_def_ResourceApplyProximalAdagrad.pbtxt
46 summary: "Update \'*var\' and \'*accum\' according to FOBOS with Adagrad learning rate."
api_def_ApplyProximalAdagrad.pbtxt
52 summary: "Update \'*var\' and \'*accum\' according to FOBOS with Adagrad learning rate."
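Taken together, these api_def entries describe the state the Adagrad embedding ops load and retrieve (the trainable parameters plus their squared-gradient accumulators) and the update that ApplyProximalAdagrad / ResourceApplyProximalAdagrad perform. The following is only a rough NumPy sketch of that math, reconstructed from the summaries above and the standard Adagrad/FOBOS formulation; names are illustrative and the exact kernel may scale the l1/l2 terms slightly differently.

import numpy as np

def adagrad_step(var, accum, grad, lr):
    # Plain Adagrad: grow the per-parameter accumulator, then scale the step.
    accum += grad * grad
    var -= lr * grad / np.sqrt(accum)
    return var, accum

def proximal_adagrad_step(var, accum, grad, lr, l1, l2):
    # FOBOS with an Adagrad learning rate (sketch): take the Adagrad step,
    # then apply the proximal operator for the l1/l2 regularizers, scaled by
    # the per-coordinate step size. This follows the standard FOBOS form,
    # not necessarily the exact TensorFlow kernel.
    accum += grad * grad
    adjusted_lr = lr / np.sqrt(accum)
    prox_v = var - adjusted_lr * grad
    var = (np.sign(prox_v)
           * np.maximum(np.abs(prox_v) - adjusted_lr * l1, 0.0)
           / (1.0 + adjusted_lr * l2))
    return var, accum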
/external/tensorflow/tensorflow/python/keras/optimizer_v2/
adagrad_test.py
84 ada_opt = adagrad.Adagrad(learning_rate)
135 ada_opt = adagrad.Adagrad(learning_rate, decay=decay)
181 ada_opt = adagrad.Adagrad(lr_schedule)
222 sgd_op = adagrad.Adagrad(1.0).minimize(loss, var_list=[var0])
247 ada_opt = adagrad.Adagrad(learning_rate)
286 ada_opt = adagrad.Adagrad(learning_rate)
331 repeated_update = adagrad.Adagrad(3.0).apply_gradients(
333 aggregated_update = adagrad.Adagrad(3.0).apply_gradients(
356 update_op_repeated = adagrad.Adagrad(2.0).minimize(
358 update_op_aggregated = adagrad.Adagrad(2.0).minimize(
[all …]
adagrad.py
35 class Adagrad(optimizer_v2.OptimizerV2):
97 super(Adagrad, self).__init__(name, **kwargs)
117 super(Adagrad, self).set_weights(weights)
173 config = super(Adagrad, self).get_config()
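adagrad.py above is the optimizer_v2 implementation that the tf.keras.optimizers.Adagrad / tf.optimizers.Adagrad golden files below point at. A minimal usage sketch of that class, assuming the TF 2.x Keras API and the constructor defaults recorded in those goldens (0.001, 0.1, 1e-07, 'Adagrad'); the variable and loss here are illustrative, modeled on the test snippets above.

import tensorflow as tf

# Defaults mirror the golden argspecs: learning_rate=0.001,
# initial_accumulator_value=0.1, epsilon=1e-07, name='Adagrad'.
opt = tf.keras.optimizers.Adagrad(learning_rate=0.001,
                                  initial_accumulator_value=0.1,
                                  epsilon=1e-07)

var0 = tf.Variable([1.0, 2.0])
loss = lambda: tf.reduce_sum(var0 * var0)  # toy loss, as in adagrad_test.py
opt.minimize(loss, var_list=[var0])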
/external/tensorflow/tensorflow/tools/api/golden/v2/
tensorflow.optimizers.-adagrad.pbtxt
1 path: "tensorflow.optimizers.Adagrad"
3 is_instance: "<class \'tensorflow.python.keras.optimizer_v2.adagrad.Adagrad\'>"
17 … \'name\'], varargs=None, keywords=kwargs, defaults=[\'0.001\', \'0.1\', \'1e-07\', \'Adagrad\'], "
tensorflow.keras.optimizers.-adagrad.pbtxt
1 path: "tensorflow.keras.optimizers.Adagrad"
3 is_instance: "<class \'tensorflow.python.keras.optimizer_v2.adagrad.Adagrad\'>"
17 … \'name\'], varargs=None, keywords=kwargs, defaults=[\'0.001\', \'0.1\', \'1e-07\', \'Adagrad\'], "
tensorflow.optimizers.pbtxt
8 name: "Adagrad"
tensorflow.keras.optimizers.pbtxt
8 name: "Adagrad"
tensorflow.estimator.-d-n-n-regressor.pbtxt
24 …\'], varargs=None, keywords=None, defaults=[\'None\', \'1\', \'None\', \'Adagrad\', \'<function re…
/external/tensorflow/tensorflow/tools/api/golden/v1/
tensorflow.keras.optimizers.-adagrad.pbtxt
1 path: "tensorflow.keras.optimizers.Adagrad"
3 is_instance: "<class \'tensorflow.python.keras.optimizer_v2.adagrad.Adagrad\'>"
17 … \'name\'], varargs=None, keywords=kwargs, defaults=[\'0.001\', \'0.1\', \'1e-07\', \'Adagrad\'], "
tensorflow.keras.optimizers.pbtxt
8 name: "Adagrad"
tensorflow.train.-adagrad-optimizer.pbtxt
21 …se_locking\', \'name\'], varargs=None, keywords=None, defaults=[\'0.1\', \'False\', \'Adagrad\'], "
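The v1 golden tensorflow.train.-adagrad-optimizer.pbtxt records the legacy tf.train.AdagradOptimizer constructor, whose defaults (0.1 for initial_accumulator_value, False for use_locking, 'Adagrad' for name) appear in the truncated argspec above. A brief sketch of the graph-mode usage that optimizer expects, written against tf.compat.v1 as an assumption; the variable and loss are illustrative only.

import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

var0 = tf.Variable([1.0, 2.0])
loss = tf.reduce_sum(var0 * var0)

# learning_rate is the only required argument; the rest mirror the golden argspec.
opt = tf.train.AdagradOptimizer(learning_rate=0.1,
                                initial_accumulator_value=0.1,
                                use_locking=False,
                                name='Adagrad')
train_op = opt.minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train_op)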
/external/tensorflow/tensorflow/contrib/keras/api/keras/optimizers/
__init__.py
23 from tensorflow.python.keras.optimizers import Adagrad
/external/tensorflow/tensorflow/python/keras/
optimizers_test.py
115 self._test_optimizer(keras.optimizers.Adagrad())
116 self._test_optimizer(keras.optimizers.Adagrad(decay=1e-3))
optimizers.py
291 class Adagrad(Optimizer):
312 super(Adagrad, self).__init__(**kwargs)
353 base_config = super(Adagrad, self).get_config()
777 adagrad = Adagrad
805 'adagrad': adagrad_v2.Adagrad,
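optimizers.py keeps the legacy standalone-Keras Adagrad class, but the 'adagrad': adagrad_v2.Adagrad entry at its line 805 routes string lookups to the optimizer_v2 implementation. A small sketch of that resolution path, assuming the public tf.keras API; the printed output is only what one would expect under that assumption.

import tensorflow as tf

# String identifiers resolve through the mapping shown above, so both of
# these should yield the optimizer_v2 Adagrad class.
opt_from_string = tf.keras.optimizers.get('adagrad')
opt_direct = tf.keras.optimizers.Adagrad()

print(type(opt_from_string).__name__, type(opt_direct).__name__)  # Adagrad Adagrad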
