
Searched refs:latency (Results 1 – 6 of 6) sorted by relevance

/test/mlts/benchmark/results/
generate_result.py
  207    for latency in latencies:
  208      mins.append(latency.time_freq_start_sec)
  209      to_add = len(latency.time_freq_sec) * latency.time_freq_step_sec
  210      maxs.append(latency.time_freq_start_sec + to_add)
  368  def generate_avg_ms(baseline, latency):
  370    if latency is None:
  371      latency = baseline
  373    result_avg_ms = (latency.total_time_sec / latency.iterations)*1000.0
  374    if latency is baseline:
  400  def generate_latency_graph_entry(tag, latency, tmin, tmax):
  [all …]
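
The generate_avg_ms hit above reduces a latency record to a single average in milliseconds. A minimal sketch of that computation, assuming a result object that exposes only the total_time_sec and iterations fields visible in the snippet (LatencyResult and avg_ms are illustrative names, not the benchmark's own):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class LatencyResult:
        # Only the two fields used by the snippet; the real object carries more.
        total_time_sec: float
        iterations: int

    def avg_ms(latency: Optional[LatencyResult], baseline: LatencyResult) -> float:
        # Fall back to the baseline run when no latency result is given (lines 370-371).
        if latency is None:
            latency = baseline
        # Total wall time over iteration count, converted to milliseconds (line 373).
        return (latency.total_time_sec / latency.iterations) * 1000.0
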
/test/vts-testcase/performance/audio_loopback_test/
AudioLoopbackTest.py
   64    for latency, confidence in zip(latencies, confidences):
   67      latencies.remove(latency)
   69      total_latency += latency
   84    for latency in latencies:
   85      sd += (latency - avg_latency) * (latency - avg_latency)
README.md
   10  1. Average round-trip latency.
   11  2. Standard deviation of round-trip latency.
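
The loopback test hits and the README items above describe the same two statistics: mean round-trip latency and its standard deviation. A small sketch of those calculations (latency_stats is an illustrative name; the low-confidence filtering at lines 64-69 is not reproduced here):

    import math

    def latency_stats(latencies):
        # Average round-trip latency (README item 1).
        avg_latency = sum(latencies) / len(latencies)
        # Standard deviation via the same squared-difference loop as lines 84-85.
        sd = 0.0
        for latency in latencies:
            sd += (latency - avg_latency) * (latency - avg_latency)
        sd = math.sqrt(sd / len(latencies))
        return avg_latency, sd
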
/test/vts/utils/python/profiling/
profiling_utils.py
  275    full_api, latency = line.rsplit(":", 1)
  278      profiling_data.values[api_name].append(long(latency))
  280      profiling_data.values[api_name] = [long(latency)]
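
These hits parse profiling output of the form "<full_api>:<latency>" and group the values per API. A sketch of that parsing, assuming api_name is derived from full_api (the derivation is not shown in the hit) and using Python 3 int() where the original Python 2 code uses long():

    def collect_latencies(lines):
        # Maps api_name -> list of observed latencies, as profiling_data.values does.
        values = {}
        for line in lines:
            # Split on the last ':' so API names containing ':' stay intact (line 275).
            full_api, latency = line.rsplit(":", 1)
            api_name = full_api.strip()  # assumption: this step is not shown in the hit
            if api_name in values:
                values[api_name].append(int(latency))
            else:
                values[api_name] = [int(latency)]
        return values
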
/test/vts/utils/native/trace_processor/
VtsTraceProcessor.cpp  (in ProcessTraceForLatencyProfiling)
  293    int64_t latency = end_timestamp - start_timestamp;
  295    if (latency < 0) {
  300    cout << full_api_name << ":" << latency << endl;
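
This is the producing side of the format parsed above: per-call latency is the difference of end and start timestamps, negative values are dropped, and each result is printed as "<full_api_name>:<latency>". A rough Python rendering of those three lines (emit_latency is an illustrative name):

    def emit_latency(full_api_name, start_timestamp, end_timestamp):
        # Latency is the end-minus-start timestamp difference (line 293).
        latency = end_timestamp - start_timestamp
        if latency < 0:
            # Malformed trace entry; the C++ code skips these (line 295).
            return None
        # Same "<name>:<latency>" format that profiling_utils.py parses (line 300).
        print(f"{full_api_name}:{latency}")
        return latency
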
/test/mlts/benchmark/
README.txt
   17  Test Suite). MLTS allows to evaluate NNAPI acceleration latency and accuracy