Color descriptors are among the most important features used in image analysis and retrieval. Due to its compact representation and low complexity, direct histogram comparison is a commonly used technique for measuring color similarity. However, it has several serious drawbacks, including a high degree of dependency on color codebook design, sensitivity to quantization boundaries, and inefficiency in representing images with few dominant colors. In this paper, we present a new algorithm for color matching that models the behavior of the human visual system in capturing the color appearance of an image. We first develop a new method for color codebook design in the Lab space. The method is well suited for creating small, fixed color codebooks for image analysis, matching, and retrieval. We then introduce a statistical technique to extract perceptually relevant colors. We also propose a new color distance measure based on the optimal mapping between two sets of color components representing two images. Experiments comparing the new algorithm to existing techniques show that these novel elements lead to a better match to human perception when judging image similarity in terms of color composition.
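The abstract does not spell out the exact form of the proposed distance, but the idea of an optimal mapping between two sets of dominant-color components can be sketched as follows. This is a simplified illustration, not the paper's measure: it assumes each image is summarized by the same number of `(Lab_color, area_fraction)` pairs, uses plain Euclidean distance in Lab space, and finds the minimum-cost one-to-one pairing by brute force (practical only for a handful of dominant colors). The function names `lab_dist` and `color_distance` are hypothetical.

```python
from itertools import permutations
import math

def lab_dist(c1, c2):
    # Euclidean distance between two Lab triples (a simplifying assumption;
    # perceptual color-difference formulas would be more faithful).
    return math.dist(c1, c2)

def color_distance(components_a, components_b):
    """Distance between two images, each given as a list of
    (lab_color, area_fraction) dominant-color components.

    Searches all one-to-one mappings between the two component sets
    and returns the minimum total weighted color difference.
    """
    assert len(components_a) == len(components_b)  # equal counts assumed
    best = float("inf")
    for perm in permutations(range(len(components_b))):
        total = 0.0
        for i, j in enumerate(perm):
            (ca, pa) = components_a[i]
            (cb, pb) = components_b[j]
            # Weight each color difference by the average area fraction
            # of the two matched components.
            total += 0.5 * (pa + pb) * lab_dist(ca, cb)
        best = min(best, total)
    return best
```

For larger component sets, the brute-force search over permutations would be replaced by an efficient assignment solver (e.g., the Hungarian algorithm), since the mapping problem is a standard minimum-cost bipartite matching.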