Searched defs:integer_count_error (Results 1 – 1 of 1) sorted by relevance
697   float integer_count_error = best_combined_error[quant_level][integer_count - 1];  in one_partition_find_best_combination_for_bitcount()  local
800   float integer_count_error = best_combined_error[quant_level][integer_count - 2];  in two_partitions_find_best_combination_for_bitcount()  local
925   float integer_count_error = best_combined_error[quant_level][integer_count - 3];  in three_partitions_find_best_combination_for_bitcount()  local
1061  float integer_count_error = best_combined_error[quant_level][integer_count - 4];  in four_partitions_find_best_combination_for_bitcount()  local