Distribution calibration plays an important role in cross-domain learning. However, existing distribution distance metrics are not geodesic and therefore cannot measure the intrinsic distance between two distributions. In this paper, we calibrate two distributions using the geodesic distance in a Riemannian symmetric space. Our method learns a latent subspace in the reproducing kernel Hilbert space in which the geodesic distance between the distributions of the source and target domains is minimized. This geodesic distance is equivalent to the geodesic distance between two symmetric positive definite (SPD) matrices defined in the Riemannian symmetric space; these two SPD matrices parameterize the marginal distributions of the source and target domains in the latent subspace. We carefully design an evolutionary algorithm to find a locally optimal solution that minimizes this geodesic distance. Empirical studies on face recognition, text categorization, and web image annotation demonstrate the effectiveness of the proposed scheme.
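To make the geodesic notion above concrete, the sketch below computes the standard affine-invariant geodesic distance between two SPD matrices, d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F. This is a minimal illustration of the Riemannian metric commonly used on the SPD manifold; the function name and example matrices are assumptions, and the abstract does not specify the exact metric the paper adopts.

```python
import numpy as np
from scipy.linalg import sqrtm, logm

def spd_geodesic_distance(A, B):
    """Affine-invariant geodesic distance between SPD matrices:
    d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F.
    (Illustrative helper; not the paper's exact formulation.)"""
    # Matrix square root of A; take the real part to discard
    # numerically negligible imaginary components.
    A_inv_sqrt = np.linalg.inv(np.real(sqrtm(A)))
    # Congruence transform brings B into A's "whitened" coordinates.
    M = A_inv_sqrt @ B @ A_inv_sqrt
    # Frobenius norm of the matrix logarithm gives the geodesic length.
    return np.linalg.norm(np.real(logm(M)), "fro")

# Hypothetical example: SPD matrices standing in for the two domains'
# marginal-distribution parameterizations in the latent subspace.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 3.0]])
print(spd_geodesic_distance(A, B))
```

The distance is symmetric, vanishes only when A = B, and is invariant under joint congruence transforms A → GᵀAG, B → GᵀBG, which is why it captures an "intrinsic" separation between the two SPD parameterizations.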