When performing subspace modeling of data using principal component analysis (PCA), it may be desirable to constrain certain directions to be more meaningful in the context of the problem being investigated. This need arises because the data are often approximately isotropic along the lesser principal components, making the choice of directions for these components essentially arbitrary. Furthermore, constraining may be imperative to ensure viable solutions in problems where the dimensionality of the data space is of the same order as the number of data points available. This paper adopts a Bayesian approach and augments the likelihood implied by probabilistic principal component analysis (PPCA) (Tipping and Bishop, 1999) with a prior designed to achieve the constraining effect. The subspace parameters are computed efficiently using the EM algorithm. The constrained modeling approach is illustrated on two pertinent problems, one from speech analysis and one from computer vision.
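As background for the method summarized above, the following is a minimal sketch of the unconstrained PPCA model of Tipping and Bishop (1999) fitted by EM; the constraining prior proposed in this paper is not included, and the function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def ppca_em(X, q, n_iter=100, seed=0):
    """Fit probabilistic PCA (Tipping & Bishop, 1999) by EM.

    Model: x = W z + mu + eps, with z ~ N(0, I_q) and eps ~ N(0, sigma^2 I_d).
    X: (N, d) data matrix; q: latent (subspace) dimension.
    Returns W (d, q), mu (d,), sigma2 (scalar noise variance).
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu                                   # centred data
    W = rng.standard_normal((d, q))               # random subspace init
    sigma2 = 1.0
    for _ in range(n_iter):
        # E-step: posterior moments of the latent variables z_n
        M = W.T @ W + sigma2 * np.eye(q)          # (q, q)
        Minv = np.linalg.inv(M)
        Ez = Xc @ W @ Minv                        # rows are E[z_n]^T
        Ezz = N * sigma2 * Minv + Ez.T @ Ez       # sum_n E[z_n z_n^T]
        # M-step: maximize expected complete-data log-likelihood
        W_new = (Xc.T @ Ez) @ np.linalg.inv(Ezz)  # (d, q)
        sigma2 = (np.sum(Xc**2)
                  - 2.0 * np.sum(Ez * (Xc @ W_new))
                  + np.trace(Ezz @ W_new.T @ W_new)) / (N * d)
        W = W_new
    return W, mu, sigma2
```

In the Bayesian extension described in the abstract, the M-step update for W would additionally incorporate the constraining prior, turning the maximum-likelihood update into a maximum-a-posteriori one.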