This paper proposes a new measure of "distance" between faces, based on estimating the set of transformations that can occur between face images of the same person. The global transformation, assumed too complex for direct modeling, is approximated by a patchwork of local transformations, subject to a constraint imposing consistency between neighboring local transformations. The resulting system of local transformations and neighboring constraints is embedded within the probabilistic framework of a two-dimensional hidden Markov model. More specifically, we model two types of intraclass variability: variations in facial expression and variations in illumination. The performance of the resulting method is assessed on a large data set comprising four face databases. In particular, it is shown to outperform a leading approach to face recognition, namely, the Bayesian intra/extrapersonal classifier.
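To make the central idea concrete, the following is a minimal toy sketch, not the paper's actual model: each patch of one image is matched to a locally shifted patch of the other, and a Viterbi pass over each row of patches penalizes disagreement between neighboring shifts (a 1-D simplification of the paper's 2-D hidden Markov model). All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def face_distance(img_a, img_b, patch=4, max_shift=2, smooth=0.5):
    """Toy 'distance' between two equal-size grayscale images.

    Each patch of img_a is matched to a horizontally shifted patch of
    img_b; a Viterbi pass over each row of patches enforces consistency
    between neighboring local shifts. This is only a 1-D illustration
    of the patchwork-of-local-transformations idea, not the paper's
    2-D HMM.
    """
    h, w = img_a.shape
    shifts = list(range(-max_shift, max_shift + 1))
    total = 0.0
    for y in range(0, h - patch + 1, patch):
        xs = list(range(0, w - patch + 1, patch))
        # Local matching cost for every patch position and candidate shift.
        cost = np.empty((len(xs), len(shifts)))
        for i, x in enumerate(xs):
            a = img_a[y:y + patch, x:x + patch]
            for j, s in enumerate(shifts):
                xb = min(max(x + s, 0), w - patch)
                b = img_b[y:y + patch, xb:xb + patch]
                cost[i, j] = np.mean((a - b) ** 2)
        # Viterbi: best shift sequence under a neighbor-consistency penalty.
        best = cost[0].copy()
        for i in range(1, len(xs)):
            # trans[j, k]: cost of reaching shift j from previous shift k.
            trans = best[None, :] + smooth * np.abs(
                np.subtract.outer(shifts, shifts))
            best = cost[i] + trans.min(axis=1)
        total += best.min()
    return total
```

Under this toy measure, an image is at distance zero from itself (zero patch cost at zero shift, no consistency penalty), while unrelated images incur a positive cost; the real model replaces squared patch differences and shift penalties with the probabilistic emission and transition terms of the 2-D HMM.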