Searched refs:attention_axes (Results 1 – 4 of 4) sorted by relevance
/external/tensorflow/tensorflow/python/keras/layers/
  multi_head_attention.py
    224:         attention_axes=None,
    246:     if attention_axes is not None and not isinstance(attention_axes,
    248:       self._attention_axes = (attention_axes,)
    250:       self._attention_axes = attention_axes
  multi_head_attention_test.py
    198:   def test_high_dim_attention(self, q_dims, v_dims, mask_dims, attention_axes):
    201:         num_heads=2, key_dim=2, attention_axes=attention_axes)
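The matches in multi_head_attention.py (lines 246–250) show the layer normalizing its `attention_axes` argument: a bare axis is wrapped into a 1-tuple, while an already-iterable value is stored as-is. The second argument of the `isinstance` check is truncated in the search result; the sketch below assumes it tests for a sized container, and `normalize_attention_axes` is a hypothetical free-function restatement of that constructor logic, not the Keras API itself:

```python
# Hedged sketch of the attention_axes normalization seen in the
# search hits above. The elided isinstance() argument is assumed
# to check for a sized container (e.g. collections.abc.Sized).
import collections.abc


def normalize_attention_axes(attention_axes):
    # A single int axis such as 2 becomes the tuple (2,);
    # tuples/lists pass through unchanged; None stays None.
    if attention_axes is not None and not isinstance(
            attention_axes, collections.abc.Sized):
        return (attention_axes,)
    return attention_axes
```

This lets callers write `attention_axes=2` or `attention_axes=(2, 3)` interchangeably, which is why the test at line 198 can be parameterized over both shapes of input.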
/external/tensorflow/tensorflow/tools/api/golden/v2/
  tensorflow.keras.layers.-multi-head-attention.pbtxt
    132: …dim\', \'value_dim\', \'dropout\', \'use_bias\', \'output_shape\', \'attention_axes\', \'kernel_in…
/external/tensorflow/tensorflow/tools/api/golden/v1/
  tensorflow.keras.layers.-multi-head-attention.pbtxt
    132: …dim\', \'value_dim\', \'dropout\', \'use_bias\', \'output_shape\', \'attention_axes\', \'kernel_in…