Distributional Distances in Color Image Retrieval with GMVQ-Generated Histograms


Abstract

We investigate and compare the performance of several distributional distances in generic color image retrieval, with an emphasis on the symmetry and boundedness of the distances. Two histogram generation methods based on Gauss mixture vector quantization (GMVQ) are compared using the Kullback-Leibler divergence (KLD). The joint histogram method shows better retrieval performance than Bayesian retrieval with the label histograms of interleaved data. A variety of distance measures are tested and compared for the joint histogram features produced by GMVQ, including an important set of Ali-Silvey distances, the Bhattacharyya distance, and a few other divergence measures based on Shannon entropy. Experimental results show that the Bhattacharyya distance and the L divergence outperform histogram intersection (HI), whereas the KLD performs worse than the HI. In all cases, the symmetric version of a distance performs better than the asymmetric one, and the bounded version of a distance usually gives better retrieval performance than its non-bounded counterpart.
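To make the compared measures concrete, the following is a minimal illustrative sketch (not the paper's implementation) of three of the distances named above, computed on discrete histograms such as those produced by GMVQ. It assumes NumPy and pre-binned histograms; the `eps` smoothing term is an assumption added here to keep the KLD finite on empty bins.

```python
import numpy as np

def _normalize(h):
    """Convert a raw bin-count histogram to a probability distribution."""
    h = np.asarray(h, dtype=float)
    return h / h.sum()

def kld(p, q, eps=1e-12):
    """Asymmetric Kullback-Leibler divergence D(p || q).

    `eps` smoothing (an assumption, not from the paper) avoids log(0)
    and division by zero on empty histogram bins.
    """
    p = _normalize(np.asarray(p, dtype=float) + eps)
    q = _normalize(np.asarray(q, dtype=float) + eps)
    return float(np.sum(p * np.log(p / q)))

def symmetric_kld(p, q):
    """Symmetrized KLD: D(p || q) + D(q || p)."""
    return kld(p, q) + kld(q, p)

def bhattacharyya(p, q, eps=1e-12):
    """Bhattacharyya distance: -log of the Bhattacharyya coefficient.

    The coefficient lies in [0, 1], so the distance is non-negative
    and is zero only for identical distributions.
    """
    p, q = _normalize(p), _normalize(q)
    bc = np.sum(np.sqrt(p * q))
    return float(-np.log(max(bc, eps)))

def histogram_intersection(p, q):
    """HI similarity in [0, 1]; (1 - HI) serves as a bounded, symmetric distance."""
    p, q = _normalize(p), _normalize(q)
    return float(np.sum(np.minimum(p, q)))
```

For identical histograms, both divergences are (numerically) zero and the HI similarity is 1; larger divergence values and smaller HI values indicate less similar images.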