The Karhunen-Loève (KL) transform is an important method in data compression and pattern recognition, and it has been used successfully in face recognition. Intuitively, the more training samples are used, the better the performance; however, the computational cost grows cubically with the number of samples. This paper considers the numbers of training samples and eigenvalues required by the KL method for face representation and recognition, and attempts to locate the balance points through experiments and analysis.
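As a rough sketch of the method under discussion (not the paper's exact procedure), the KL transform for face images can be computed from the eigendecomposition of a sample Gram matrix; `kl_transform` and all names below are illustrative, and NumPy is assumed. The eigendecomposition of the n-by-n matrix is the O(n³) step that makes the cost grow cubically with the number of training samples:

```python
import numpy as np

def kl_transform(X, k):
    """Compute the top-k KL (eigenface) basis from n training samples.

    X: (n, d) matrix, one flattened face image per row.
    Works on the (n, n) Gram matrix rather than the (d, d) covariance;
    its O(n^3) eigendecomposition is why runtime grows cubically
    with the number of training samples.
    """
    mean = X.mean(axis=0)
    A = X - mean                       # centered samples, (n, d)
    G = A @ A.T                        # (n, n) Gram matrix
    vals, vecs = np.linalg.eigh(G)     # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k] # keep the k largest
    # Map Gram eigenvectors back to image space and normalize.
    basis = A.T @ vecs[:, order]       # (d, k)
    basis /= np.linalg.norm(basis, axis=0)
    return mean, basis

# Toy usage: project samples onto the KL basis (feature extraction).
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 64))      # 20 synthetic "images", 64 pixels each
mean, basis = kl_transform(X, k=5)
features = (X - mean) @ basis          # (20, 5) compressed representation
```

Recognition then compares these low-dimensional feature vectors (e.g. by nearest neighbor), which is where the trade-off between more training samples and more eigenvalues arises.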