Adjust threshold in palette color cache clustering
Raise the bias threshold from 1 to 4 (scaled by bit depth) so that
palette centroids are more likely to snap to cached colors.
On speed 0, key frame only, screen content set:

            ovr_psnr      ssim    vmaf_neg
   8-bit     -0.37%     -0.57%     -0.27%
  10-bit     -1.18%     -0.97%     -0.65%
  12-bit     -0.53%     -0.99%     -0.71%
BUG=aomedia:2847
STATS_CHANGED
Change-Id: Ibc6018ff6a39823547faff8ab6a1625beed9ffbc
(cherry picked from commit 35d1af3053a03b9d963802ac77f458e8f520fc73)
diff --git a/av1/encoder/palette.c b/av1/encoder/palette.c
index 9eedd6f..d608a23 100644
--- a/av1/encoder/palette.c
+++ b/av1/encoder/palette.c
@@ -187,6 +187,7 @@
 // Bias toward using colors in the cache.
 // TODO(huisu): Try other schemes to improve compression.
+#define PALETTE_CACHE_BIAS_THRESH 4
 static AOM_INLINE void optimize_palette_colors(uint16_t *color_cache,
                                                int n_cache, int n_colors,
                                                int stride, int *centroids,
@@ -202,7 +203,7 @@
         idx = j;
       }
     }
-    int min_threshold = (1 << (bit_depth - 8));
+    const int min_threshold = (PALETTE_CACHE_BIAS_THRESH) << (bit_depth - 8);
     if (min_diff <= min_threshold) centroids[i] = color_cache[idx];
   }
 }