Searched refs:sum_j (Results 1 – 5 of 5) sorted by relevance
19 $$softmax[i, j] = exp(logits[i, j]) / sum_j(exp(logits[i, j]))$$
24 \\(output_i = \sum_j data_j\\) where sum is over `j` such
24 \\(output_i = \frac{\sum_j data_j}{N}\\) where `mean` is
3956 int sum_j = 0; in SoftmaxInt8LUT() local
3962 for (; sum_j <= last_dim - 16; sum_j += 16) { in SoftmaxInt8LUT()
3963 uint8x16_t input_value = vld1q_u8(input_data_uint + sum_j); in SoftmaxInt8LUT()
3986 for (; sum_j < last_dim; ++sum_j) { in SoftmaxInt8LUT()
3987 const uint8_t index = (input_data_uint[sum_j] ^ offset) + table_offset; in SoftmaxInt8LUT()
13017 \\(output_i = \frac{\sum_j data_j}{N}\\) where `mean` is
13147 \\(output_i = \sum_j data_j\\) where sum is over `j` such
13746 $$softmax[i, j] = exp(logits[i, j]) / sum_j(exp(logits[i, j]))$$
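The softmax formula matched above can be sanity-checked with a short plain-Python sketch (illustrative only; the function name and test values are mine, not from the matched files):

```python
import math

def softmax(logits):
    """Row-wise softmax: softmax[i, j] = exp(logits[i, j]) / sum_j(exp(logits[i, j]))."""
    out = []
    for row in logits:
        m = max(row)                       # subtract the row max for numerical stability
        exps = [math.exp(x - m) for x in row]
        denom = sum(exps)                  # sum_j(exp(logits[i, j]))
        out.append([e / denom for e in exps])
    return out

probs = softmax([[1.0, 2.0, 3.0]])[0]
# each output row sums to 1 by construction
```

The max-subtraction does not change the result (it cancels between numerator and denominator) but avoids overflow in `exp`, which is also why quantized kernels like `SoftmaxInt8LUT` work with offset-adjusted inputs.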