
Searched refs:ReLU (Results 1 – 14 of 14) sorted by relevance

/external/tensorflow/tensorflow/python/keras/layers/
advanced_activations_test.py
   65  testing_utils.layer_test(keras.layers.ReLU,
   72  'LeakyRelu' in keras.layers.ReLU(negative_slope=0.2)(x).name)
   74  self.assertTrue('Relu' in keras.layers.ReLU()(x).name)
   76  self.assertTrue('Relu6' in keras.layers.ReLU(max_value=6)(x).name)
   81  testing_utils.layer_test(keras.layers.ReLU,
   89  keras.layers.ReLU,
   95  layer = keras.layers.Dense(1, activation=keras.layers.ReLU())
advanced_activations.py
  273  class ReLU(Layer):
  298  super(ReLU, self).__init__(**kwargs)
  327  base_config = super(ReLU, self).get_config()
__init__.py
   32  from tensorflow.python.keras.layers.advanced_activations import ReLU
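The keras/layers hits above exercise the `keras.layers.ReLU` layer's main knobs: `negative_slope` (leaky behaviour, producing 'LeakyRelu' ops), `max_value` (ReLU6-style clipping), and passing the layer object as another layer's `activation`. A minimal sketch of that usage, assuming the public `tf.keras` path and eager execution (input values chosen here purely for illustration):

```python
import numpy as np
import tensorflow as tf

x = np.array([[-2.0, -0.5, 3.0, 8.0]], dtype=np.float32)

# Plain ReLU: max(x, 0); the test above checks the resulting op name contains 'Relu'.
relu = tf.keras.layers.ReLU()
print(relu(x).numpy())      # [[0.  0.  3.  8.]]

# negative_slope scales negative inputs instead of zeroing them ('LeakyRelu' op).
leaky = tf.keras.layers.ReLU(negative_slope=0.2)
print(leaky(x).numpy())     # [[-0.4 -0.1  3.   8. ]]

# max_value clips the output, here at 6 ('Relu6' op).
relu6 = tf.keras.layers.ReLU(max_value=6.0)
print(relu6(x).numpy())     # [[0. 0. 3. 6.]]

# The layer object can also serve as the activation of another layer,
# as in the Dense(1, activation=keras.layers.ReLU()) test above.
dense = tf.keras.layers.Dense(1, activation=tf.keras.layers.ReLU())
```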
/external/tensorflow/tensorflow/tools/api/golden/v2/
tensorflow.keras.layers.-re-l-u.pbtxt
    1  path: "tensorflow.keras.layers.ReLU"
    3  is_instance: "<class \'tensorflow.python.keras.layers.advanced_activations.ReLU\'>"
tensorflow.keras.layers.pbtxt
  308  name: "ReLU"
/external/tensorflow/tensorflow/tools/api/golden/v1/
tensorflow.keras.layers.-re-l-u.pbtxt
    1  path: "tensorflow.keras.layers.ReLU"
    3  is_instance: "<class \'tensorflow.python.keras.layers.advanced_activations.ReLU\'>"
tensorflow.keras.layers.pbtxt
  312  name: "ReLU"
/external/libtextclassifier/lang_id/common/flatbuffers/
embedding-network.fbs
   90  // this is not the first layer, we apply a nonlinear function (ReLU); for the
   91  // first layer, we skip ReLU. Next, we multiply by |weights| and add |bias|,
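The flatbuffer comment above spells out the per-layer order of operations in libtextclassifier's embedding network: every layer except the first applies ReLU to its input, and each layer then multiplies by |weights| and adds |bias|. A rough NumPy sketch of that order of operations (function and argument names are hypothetical, not the libtextclassifier API):

```python
import numpy as np

def embedding_network_forward(x, layers):
    """layers: list of (weights, bias) pairs, per the embedding-network.fbs comment."""
    for i, (weights, bias) in enumerate(layers):
        if i > 0:
            # Non-first layers pass their input through ReLU first;
            # the first layer skips the nonlinearity.
            x = np.maximum(x, 0.0)
        # Then multiply by |weights| and add |bias|.
        x = x @ weights + bias
    return x
```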
/external/tensorflow/tensorflow/python/keras/engine/
base_layer_test.py
  598  10, activation=keras.layers.ReLU(name='MyAct'), name='MyName2')
  607  10, activation=keras.layers.ReLU(name='MyAct'), name='MyName3')
/external/tensorflow/tensorflow/contrib/specs/
README.md
  168  This will stack four fully connected ReLU layers, sharing the same
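The specs README line describes an expression that stacks four fully connected ReLU layers (the clause after "sharing the same" is cut off in this result). As a plain Keras illustration of such a stack, not the contrib.specs mini-language itself:

```python
import tensorflow as tf

# Four fully connected layers with ReLU activations, stacked sequentially.
# Layer width (64) is an arbitrary choice for illustration.
model = tf.keras.Sequential(
    [tf.keras.layers.Dense(64, activation="relu") for _ in range(4)]
)
```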
/external/tensorflow/tensorflow/contrib/autograph/examples/notebooks/
rnn_keras_estimator.ipynb
  227  " followed by a fully connected layer with ReLU activation.\n",
dev_summit_2018_demo.ipynb
 1148  " followed by a fully connected layer with ReLU activation.\n",
/external/tensorflow/
RELEASE.md
 1185  Activation: rectified linear unit (ReLU)
/external/tensorflow/tensorflow/examples/tutorials/deepdream/
deepdream.ipynb
  404  … "# Picking some internal layer. Note that we use outputs before applying the ReLU nonlinearity\n",
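The deepdream notebook line explains the choice of optimization target: the internal layer's outputs are taken before the ReLU nonlinearity, since post-ReLU activations are clipped at zero. A minimal sketch of exposing such a pre-activation tensor in Keras, using a hypothetical toy model rather than the notebook's Inception graph:

```python
import tensorflow as tf

# Hypothetical toy model: the convolution and its ReLU are separate layers,
# so the pre-activation tensor is addressable by layer name.
inputs = tf.keras.Input(shape=(224, 224, 3))
pre_act = tf.keras.layers.Conv2D(32, 3, name="conv_pre_relu")(inputs)
outputs = tf.keras.layers.ReLU()(pre_act)
model = tf.keras.Model(inputs, outputs)

# Feature-extraction model exposing the outputs *before* the ReLU,
# analogous to the internal layer the notebook picks as its target.
feature_model = tf.keras.Model(inputs, model.get_layer("conv_pre_relu").output)
```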