
Searched refs:Attention (Results 1 – 25 of 26) sorted by relevance

/external/tensorflow/tensorflow/python/keras/layers/
dense_attention_test.py:183 attention_layer = dense_attention.Attention()
199 attention_layer = dense_attention.Attention()
219 attention_layer = dense_attention.Attention()
235 attention_layer = dense_attention.Attention(use_scale=True)
254 attention_layer = dense_attention.Attention()
273 attention_layer = dense_attention.Attention()
286 attention_layer = dense_attention.Attention()
314 attention_layer = dense_attention.Attention()
346 attention_layer = dense_attention.Attention()
386 attention_layer = dense_attention.Attention()
[all …]
dense_attention.py:225 class Attention(BaseDenseAttention):
326 super(Attention, self).__init__(**kwargs)
340 super(Attention, self).build(input_shape)
358 base_config = super(Attention, self).get_config()
__init__.py:133 from tensorflow.python.keras.layers.dense_attention import Attention
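The hits above are TensorFlow's Keras dense-attention layer and its tests. For context: `tf.keras.layers.Attention` implements Luong-style dot-product attention, with an optional learned multiplier when `use_scale=True`. A minimal pure-Python sketch of the underlying computation — the function name and shapes are illustrative, not the Keras implementation:

```python
import math

def dot_product_attention(query, keys, values):
    """Dot-product attention for a single query vector.

    scores[i] = query . keys[i]; weights = softmax(scores);
    output = sum_i weights[i] * values[i].
    """
    # Similarity score of the query against each key.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    # Numerically stable softmax over the scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of the value vectors.
    dim = len(values[0])
    return [sum(w * v[d] for w, v in zip(weights, values)) for d in range(dim)]
```

With two identical keys the softmax weights are uniform, so the output is the mean of the two values; a key aligned with the query dominates otherwise.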
/external/libcups/templates/fr/
class-confirm.tmpl:3 <P><B>Attention:</B> Êtes-vous sur(e) de vouloir supprimer la classe ("Warning: are you sure you want to delete the class")
/external/tensorflow/tensorflow/python/keras/mixed_precision/
layer_correctness_test.py:103 ('Attention', dense_attention.Attention, [(2, 2, 3), (2, 3, 3),
143 ('AttentionLayerCausal', lambda: dense_attention.Attention(causal=True), [
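The second test case above constructs the layer with `causal=True`, which masks the scores so each query position cannot attend to later key positions. A plain-Python sketch of such a lower-triangular mask and its application — illustrative helper names, not the Keras internals:

```python
def causal_mask(n):
    """Boolean mask: position i may attend only to positions j <= i."""
    return [[j <= i for j in range(n)] for i in range(n)]

def apply_mask(scores, mask):
    """Set disallowed scores to -inf so softmax assigns them zero weight."""
    neg_inf = float("-inf")
    return [[s if allowed else neg_inf for s, allowed in zip(row, m_row)]
            for row, m_row in zip(scores, mask)]
```

After masking, the first row of a 3x3 score matrix keeps only its first entry; the last row is untouched.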
/external/tensorflow/tensorflow/tools/api/golden/v1/
tensorflow.keras.layers.-attention.pbtxt:1 path: "tensorflow.keras.layers.Attention"
3 is_instance: "<class \'tensorflow.python.keras.layers.dense_attention.Attention\'>"
tensorflow.keras.layers.pbtxt:28 name: "Attention"
/external/tensorflow/tensorflow/tools/api/golden/v2/
tensorflow.keras.layers.-attention.pbtxt:1 path: "tensorflow.keras.layers.Attention"
3 is_instance: "<class \'tensorflow.python.keras.layers.dense_attention.Attention\'>"
tensorflow.keras.layers.pbtxt:28 name: "Attention"
/external/mesa3d/docs/
license.rst:60 Attention, Contributors
/external/tensorflow/tensorflow/python/keras/benchmarks/layer_benchmarks/
layer_benchmarks_test.py:166 ("Attention_small_shape", tf.keras.layers.Attention,
/external/llvm-project/clang/include/clang/AST/
CommentCommands.td:130 def Attention : BlockCommand<"attention">;
/external/clang/include/clang/AST/
CommentCommands.td:128 def Attention : BlockCommand<"attention">;
/external/aac/
NOTICE:86 Attention: Audio and Multimedia Departments - FDK AAC LL
/external/ltp/testcases/network/stress/
README:60 Attention for HTTP/FTP stress tests
/external/python/cpython2/Doc/library/
turtle.rst:586 Attention: *speed* = 0 means that *no* animation takes
1712 "world coordinates". **Attention**: in this mode angles appear distorted if
/external/python/cpython3/Doc/library/
turtle.rst:626 Attention: *speed* = 0 means that *no* animation takes
1918 "world coordinates". **Attention**: in this mode angles appear distorted if
/external/llvm-project/llvm/docs/
Coroutines.rst:1719 Areas Requiring Attention
/external/selinux/policycoreutils/po/
fr.po:1139 msgstr "Attention ! Impossible de restaurer le contexte pour %s\n" ("Warning! Unable to restore the context for %s")
1241 "Attention ! Impossible de récupérer les informations liées au terminal\n" ("Warning! Unable to retrieve the terminal-related information")
/external/mtools/
mtools.info:468 *Caution / Attention distributors*: If mtools is compiled on a Linux
mtools.texi:515 @strong{Caution / Attention distributors}: If mtools is compiled on a
/external/openssh/
configure.ac:1087 # Attention: always take care to bind libsocket and libnsl before libc,
/external/cpuinfo/test/dmesg/
huawei-mate-9.log:1350 [ 8.368682s][pid:0,cpu6,in irq]sd 0:0:0:0: tag#0 Sense Key : Unit Attention [current]
1356 [ 8.369232s][pid:0,cpu6,in irq]sd 0:0:0:1: tag#0 Sense Key : Unit Attention [current]
1396 [ 8.371551s][pid:0,cpu6,in irq]sd 0:0:0:2: tag#0 Sense Key : Unit Attention [current]
1409 [ 8.375000s][pid:0,cpu6,in irq]sd 0:0:0:3: tag#0 Sense Key : Unit Attention [current]
/external/tensorflow/
RELEASE.md:612 …* In `Attention` and `AdditiveAttention` layers, the `call()` method now accepts a `return_attenti…
3716 * Monotonic Attention wrappers added to `tf.contrib.seq2seq`.
/external/one-true-awk/testdir/
funstack.in:4828 …tion of the field equations in the presence of matter in a simple case. Attention is drawn to the …
19666 …abstract = "Attention is first given to some of the general problems of multidimensional table…
23746 …continuous-flow and preempt-resume scheduling of input-output activity. Attention is given to the …
