Searched refs:Attention (Results 1 – 25 of 26) sorted by relevance
/external/tensorflow/tensorflow/python/keras/layers/
  dense_attention_test.py | 183 attention_layer = dense_attention.Attention() 199 attention_layer = dense_attention.Attention() 219 attention_layer = dense_attention.Attention() 235 attention_layer = dense_attention.Attention(use_scale=True) 254 attention_layer = dense_attention.Attention() 273 attention_layer = dense_attention.Attention() 286 attention_layer = dense_attention.Attention() 314 attention_layer = dense_attention.Attention() 346 attention_layer = dense_attention.Attention() 386 attention_layer = dense_attention.Attention() [all …]
  dense_attention.py | 225 class Attention(BaseDenseAttention): class 326 super(Attention, self).__init__(**kwargs) 340 super(Attention, self).build(input_shape) 358 base_config = super(Attention, self).get_config()
  __init__.py | 133 from tensorflow.python.keras.layers.dense_attention import Attention
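The dense_attention.py hit above defines the layer exported as tf.keras.layers.Attention (dot-product attention). A minimal usage sketch, assuming illustrative shapes and the use_scale option seen in the test file (this is not code taken from the indexed files):

    import tensorflow as tf

    # Illustrative shapes: (batch, timesteps, feature_dim).
    query = tf.random.normal([2, 4, 8])    # query sequence
    value = tf.random.normal([2, 6, 8])    # value sequence; with no key input it also serves as the key

    attention = tf.keras.layers.Attention(use_scale=True)
    context = attention([query, value])    # weighted sum over value positions, one row per query position
    print(context.shape)                   # (2, 4, 8)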
/external/libcups/templates/fr/
  class-confirm.tmpl | 3 <P><B>Attention:</B> Êtes-vous sur(e) de vouloir supprimer la classe (in English: "Warning: Are you sure you want to delete the class")
/external/tensorflow/tensorflow/python/keras/mixed_precision/
  layer_correctness_test.py | 103 ('Attention', dense_attention.Attention, [(2, 2, 3), (2, 3, 3), 143 ('AttentionLayerCausal', lambda: dense_attention.Attention(causal=True), [
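The 'AttentionLayerCausal' case above passes causal=True, which in the Keras version indexed here masks each query position so it cannot attend to later value positions (the usual setup for autoregressive self-attention). A hedged sketch with made-up shapes:

    import tensorflow as tf

    x = tf.random.normal([2, 5, 8])                # (batch, timesteps, feature_dim)
    causal_attention = tf.keras.layers.Attention(causal=True)
    y = causal_attention([x, x])                   # self-attention: x is both query and value
    print(y.shape)                                 # (2, 5, 8); position t only attends to positions <= t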
/external/tensorflow/tensorflow/tools/api/golden/v1/
  tensorflow.keras.layers.-attention.pbtxt | 1 path: "tensorflow.keras.layers.Attention" 3 is_instance: "<class \'tensorflow.python.keras.layers.dense_attention.Attention\'>"
  tensorflow.keras.layers.pbtxt | 28 name: "Attention"
/external/tensorflow/tensorflow/tools/api/golden/v2/
  tensorflow.keras.layers.-attention.pbtxt | 1 path: "tensorflow.keras.layers.Attention" 3 is_instance: "<class \'tensorflow.python.keras.layers.dense_attention.Attention\'>"
  tensorflow.keras.layers.pbtxt | 28 name: "Attention"
/external/mesa3d/docs/
  license.rst | 60 Attention, Contributors
/external/tensorflow/tensorflow/python/keras/benchmarks/layer_benchmarks/
  layer_benchmarks_test.py | 166 ("Attention_small_shape", tf.keras.layers.Attention,
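The benchmark entry above times tf.keras.layers.Attention on small inputs. A rough, hand-rolled timing sketch, not the benchmark harness used in that file; the shapes and iteration count are arbitrary assumptions:

    import time
    import tensorflow as tf

    layer = tf.keras.layers.Attention()
    query = tf.random.normal([1, 4, 8])
    value = tf.random.normal([1, 6, 8])

    layer([query, value])                          # warm-up call so layer build cost is excluded
    start = time.perf_counter()
    for _ in range(100):
        layer([query, value])
    print((time.perf_counter() - start) / 100)     # rough seconds per forward call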
/external/llvm-project/clang/include/clang/AST/
  CommentCommands.td | 130 def Attention : BlockCommand<"attention">;
/external/clang/include/clang/AST/
  CommentCommands.td | 128 def Attention : BlockCommand<"attention">;
/external/aac/
  NOTICE | 86 Attention: Audio and Multimedia Departments - FDK AAC LL
/external/ltp/testcases/network/stress/
  README | 60 Attention for HTTP/FTP stress tests
/external/python/cpython2/Doc/library/
  turtle.rst | 586 Attention: *speed* = 0 means that *no* animation takes 1712 "world coordinates". **Attention**: in this mode angles appear distorted if
/external/python/cpython3/Doc/library/
  turtle.rst | 626 Attention: *speed* = 0 means that *no* animation takes 1918 "world coordinates". **Attention**: in this mode angles appear distorted if
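Both turtle.rst hits flag the same two caveats: speed(0) turns the drawing animation off entirely, and in "world coordinates" mode angles look distorted when the x and y units differ. A small illustrative sketch (requires a Tk display; the coordinate values are arbitrary):

    import turtle

    screen = turtle.Screen()
    t = turtle.Turtle()
    t.speed(0)                                     # 0 = "fastest": no animation at all

    # World coordinates with a 2:1 x/y unit ratio: a 45-degree heading no
    # longer looks like 45 degrees on screen.
    screen.setworldcoordinates(-2, -1, 2, 1)
    t.setheading(45)
    t.forward(1)

    turtle.done()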
/external/llvm-project/llvm/docs/
  Coroutines.rst | 1719 Areas Requiring Attention
/external/selinux/policycoreutils/po/
  fr.po | 1139 msgstr "Attention ! Impossible de restaurer le contexte pour %s\n" 1241 "Attention ! Impossible de récupérer les informations liées au terminal\n" (in English: "Warning! Unable to restore the context for %s"; "Warning! Unable to retrieve the terminal-related information")
/external/mtools/
  mtools.info | 468 *Caution / Attention distributors*: If mtools is compiled on a Linux
  mtools.texi | 515 @strong{Caution / Attention distributors}: If mtools is compiled on a
/external/openssh/
  configure.ac | 1087 # Attention: always take care to bind libsocket and libnsl before libc,
/external/cpuinfo/test/dmesg/
  huawei-mate-9.log | 1350 [ 8.368682s][pid:0,cpu6,in irq]sd 0:0:0:0: tag#0 Sense Key : Unit Attention [current] 1356 [ 8.369232s][pid:0,cpu6,in irq]sd 0:0:0:1: tag#0 Sense Key : Unit Attention [current] 1396 [ 8.371551s][pid:0,cpu6,in irq]sd 0:0:0:2: tag#0 Sense Key : Unit Attention [current] 1409 [ 8.375000s][pid:0,cpu6,in irq]sd 0:0:0:3: tag#0 Sense Key : Unit Attention [current]
/external/tensorflow/
  RELEASE.md | 612 …* In `Attention` and `AdditiveAttention` layers, the `call()` method now accepts a `return_attenti… 3716 * Monotonic Attention wrappers added to `tf.contrib.seq2seq`.
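The release note at line 612 refers to the return_attention_scores argument added to the layers' call(); the snippet is truncated, so the full name here is taken from the public TensorFlow API rather than from the indexed file. A hedged sketch with illustrative shapes:

    import tensorflow as tf

    query = tf.random.normal([2, 4, 8])
    value = tf.random.normal([2, 6, 8])

    layer = tf.keras.layers.Attention()
    context, scores = layer([query, value], return_attention_scores=True)
    print(context.shape)                           # (2, 4, 8)
    print(scores.shape)                            # (2, 4, 6): one weight per (query, value) pair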
/external/one-true-awk/testdir/
  funstack.in | 4828 …tion of the field equations in the presence of matter in a simple case. Attention is drawn to the … 19666 …abstract = "Attention is first given to some of the general problems of multidimensional table… 23746 …continuous-flow and preempt-resume scheduling of input-output activity. Attention is given to the …