The generalized singular value decomposition (GSVD) has been used in the literature with linear discriminant analysis (LDA) to solve the small-sample-size problem in pattern recognition. However, the LDA/GSVD algorithm suffers from an excessive computational load when the sample dimension is high. In this paper, we present a computationally efficient modification of the LDA/GSVD algorithm, referred to as the EGSVD-LDA algorithm, which represents the singular vectors as linear combinations of the sample vectors, thereby circumventing the direct SVD computation of high-dimensional singular vectors. Furthermore, to overcome the over-fitting problem of GSVD-based algorithms, we propose a new method that orthogonalizes the discriminative subspace derived from the GSVD framework via a Gram-Schmidt process in an inner product space. Both methods are efficient when the data are high dimensional. Simulation results show that the EGSVD-LDA algorithm, and especially its orthogonalized version, overcomes the computational-complexity problem and achieves high recognition accuracy at a low computational cost.
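As a rough illustration of the two ideas above (a sketch, not the authors' implementation), the code below first computes a thin SVD of a tall data matrix through its small Gram matrix, so that each left singular vector is obtained as a linear combination of the sample vectors, and then orthonormalizes a subspace basis by a modified Gram-Schmidt process under a general inner product ⟨u, v⟩ = uᵀMv. The function names, tolerance, and the choice of M are hypothetical.

```python
import numpy as np

def svd_via_gram(A, tol=1e-10):
    """Thin SVD of a tall matrix A (d x n, d >> n) without forming any
    d x d matrix: eigendecompose the small n x n Gram matrix
    A.T @ A = V S^2 V.T, then recover U = A V S^{-1}, so every left
    singular vector is a linear combination of the sample vectors."""
    evals, V = np.linalg.eigh(A.T @ A)      # only an n x n problem
    order = np.argsort(evals)[::-1]         # sort descending
    evals, V = evals[order], V[:, order]
    keep = evals > tol                      # drop numerically null modes
    s = np.sqrt(evals[keep])
    U = (A @ V[:, keep]) / s                # combinations of the samples
    return U, s, V[:, keep]

def gram_schmidt(W, M=None):
    """Orthonormalize the columns of W by modified Gram-Schmidt under
    the inner product <u, v> = u.T @ M @ v (Euclidean if M is None)."""
    Q = np.array(W, dtype=float)
    dot = (lambda u, v: u @ v) if M is None else (lambda u, v: u @ M @ v)
    for j in range(Q.shape[1]):
        for i in range(j):                  # remove components along
            Q[:, j] -= dot(Q[:, i], Q[:, j]) * Q[:, i]  # earlier columns
        Q[:, j] /= np.sqrt(dot(Q[:, j], Q[:, j]))       # normalize
    return Q
```

For a d x n data matrix with d ≫ n, `svd_via_gram` touches only n x n eigenproblems, which is the kind of saving the abstract attributes to representing singular vectors through the samples.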