Searched refs:ReLU (Results 1 – 14 of 14) sorted by relevance
/external/tensorflow/tensorflow/python/keras/layers/
D | advanced_activations_test.py |
      65  testing_utils.layer_test(keras.layers.ReLU,
      72  'LeakyRelu' in keras.layers.ReLU(negative_slope=0.2)(x).name)
      74  self.assertTrue('Relu' in keras.layers.ReLU()(x).name)
      76  self.assertTrue('Relu6' in keras.layers.ReLU(max_value=6)(x).name)
      81  testing_utils.layer_test(keras.layers.ReLU,
      89  keras.layers.ReLU,
      95  layer = keras.layers.Dense(1, activation=keras.layers.ReLU())
|
D | advanced_activations.py |
      273  class ReLU(Layer):
      298  super(ReLU, self).__init__(**kwargs)
      327  base_config = super(ReLU, self).get_config()
|
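
The two entries above cover the configurable keras.layers.ReLU layer and its tests: with default arguments the layer lowers to a plain Relu op, max_value=6 lowers to Relu6, and a nonzero negative_slope lowers to LeakyRelu, which is exactly what the op-name assertions in advanced_activations_test.py check. A minimal sketch of that behaviour, assuming the TF 1.x graph mode this tree targets (under TF 2.x eager execution, tensor names are not meaningful):

    import tensorflow as tf

    x = tf.constant([-3.0, -1.0, 0.0, 2.0, 8.0])

    # Default rectifier: elementwise max(x, 0); the op name contains 'Relu'.
    y_relu = tf.keras.layers.ReLU()(x)

    # Capped rectifier: min(max(x, 0), 6); the op name contains 'Relu6'.
    y_relu6 = tf.keras.layers.ReLU(max_value=6)(x)

    # Leaky rectifier: negatives scaled by 0.2; the op name contains 'LeakyRelu'.
    y_leaky = tf.keras.layers.ReLU(negative_slope=0.2)(x)

    print(y_relu.name, y_relu6.name, y_leaky.name)
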
D | __init__.py | 32 from tensorflow.python.keras.layers.advanced_activations import ReLU
|
/external/tensorflow/tensorflow/tools/api/golden/v2/ |
D | tensorflow.keras.layers.-re-l-u.pbtxt |
      1  path: "tensorflow.keras.layers.ReLU"
      3  is_instance: "<class \'tensorflow.python.keras.layers.advanced_activations.ReLU\'>"
|
D | tensorflow.keras.layers.pbtxt | 308 name: "ReLU"
|
/external/tensorflow/tensorflow/tools/api/golden/v1/ |
D | tensorflow.keras.layers.-re-l-u.pbtxt |
      1  path: "tensorflow.keras.layers.ReLU"
      3  is_instance: "<class \'tensorflow.python.keras.layers.advanced_activations.ReLU\'>"
|
D | tensorflow.keras.layers.pbtxt | 312 name: "ReLU"
|
/external/libtextclassifier/lang_id/common/flatbuffers/ |
D | embedding-network.fbs |
      90  // this is not the first layer, we apply a nonlinear function (ReLU); for the
      91  // first layer, we skip ReLU. Next, we multiply by |weights| and add |bias|,
|
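
The schema comment from embedding-network.fbs describes how each hidden layer of the lang_id embedding network is evaluated: every layer after the first applies ReLU to its input before the matrix multiply and bias add, while the first layer skips the ReLU. A rough NumPy sketch of that description (the function and variable names are illustrative, not taken from the libtextclassifier sources):

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def run_layers(features, layers):
        """layers is a list of (weights, bias) pairs, first layer first."""
        activation = features
        for i, (weights, bias) in enumerate(layers):
            if i > 0:
                # Every layer except the first applies the ReLU nonlinearity first.
                activation = relu(activation)
            # Then multiply by |weights| and add |bias|, as the schema comment says.
            activation = activation @ weights + bias
        return activation
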
/external/tensorflow/tensorflow/python/keras/engine/ |
D | base_layer_test.py |
      598  10, activation=keras.layers.ReLU(name='MyAct'), name='MyName2')
      607  10, activation=keras.layers.ReLU(name='MyAct'), name='MyName3')
|
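
The base_layer_test.py hits pass a named ReLU layer instance as the activation of a Dense layer, rather than the string 'relu'. A minimal sketch of the same pattern (the sizes and names here are illustrative):

    import tensorflow as tf

    inputs = tf.keras.Input(shape=(4,))
    # A ReLU layer instance (with its own name) works as the Dense activation,
    # because any callable is accepted there.
    outputs = tf.keras.layers.Dense(
        10, activation=tf.keras.layers.ReLU(name='MyAct'), name='MyName')(inputs)
    model = tf.keras.Model(inputs, outputs)
    model.summary()
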
/external/tensorflow/tensorflow/contrib/specs/ |
D | README.md | 168 This will stack four fully connected ReLU layers, sharing the same
|
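
The contrib/specs README line refers to a spec that stacks four fully connected ReLU layers (the hit is truncated, so whatever is being shared is not visible here). Without reproducing the specs DSL itself, an equivalent stack written directly in tf.keras would look roughly like this (the layer width is an illustrative placeholder):

    import tensorflow as tf

    # Four fully connected layers, each with a ReLU activation.
    model = tf.keras.Sequential(
        [tf.keras.layers.Dense(64, activation='relu') for _ in range(4)])
    model.build(input_shape=(None, 32))
    model.summary()
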
/external/tensorflow/tensorflow/contrib/autograph/examples/notebooks/ |
D | rnn_keras_estimator.ipynb | 227 " followed by a fully connected layer with ReLU activation.\n",
|
D | dev_summit_2018_demo.ipynb | 1148 " followed by a fully connected layer with ReLU activation.\n",
|
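
Both autograph notebooks describe a model that ends with a fully connected layer using a ReLU activation; judging only by the file names, the preceding part is recurrent. A generic sketch of that shape (the cell type and sizes are placeholders, not taken from the notebooks):

    import tensorflow as tf

    model = tf.keras.Sequential([
        # Placeholder recurrent front end; only the Dense + ReLU tail matches
        # the description quoted from the notebooks.
        tf.keras.layers.LSTM(256, input_shape=(None, 16)),
        tf.keras.layers.Dense(128, activation='relu'),
    ])
    model.summary()
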
/external/tensorflow/ |
D | RELEASE.md | 1185 Activation: rectified linear unit (ReLU)
|
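
The release-notes hit only names the activation; for reference, the rectified linear unit is elementwise max(0, x), exposed directly as tf.nn.relu:

    import tensorflow as tf

    x = tf.constant([-2.0, -0.5, 0.0, 1.5])
    # Negative inputs become 0, non-negative inputs pass through unchanged.
    y = tf.nn.relu(x)  # -> [0.0, 0.0, 0.0, 1.5]
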
/external/tensorflow/tensorflow/examples/tutorials/deepdream/ |
D | deepdream.ipynb | 404 … "# Picking some internal layer. Note that we use outputs before applying the ReLU nonlinearity\n",
|
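
The deepdream note is about picking a layer's pre-activation output, i.e. the tensor feeding the ReLU rather than the ReLU's result. In a TF 1.x graph that means fetching the tensor that sits just before the Relu op; the tiny hand-built graph below stands in for the Inception graph the notebook actually imports, so all names here are hypothetical:

    import tensorflow as tf  # assumes the TF 1.x API used elsewhere in this tree

    graph = tf.Graph()
    with graph.as_default():
        x = tf.placeholder(tf.float32, [None, 8, 8, 3], name='input')
        conv = tf.layers.conv2d(x, filters=4, kernel_size=3, name='conv1')
        act = tf.nn.relu(conv, name='conv1_relu')  # the post-ReLU tensor

    # Use the conv op's output (before the ReLU), as the notebook comment suggests.
    pre_relu = graph.get_tensor_by_name(conv.name)
    print(pre_relu.op.type)  # 'BiasAdd' -- the last op before the ReLU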