Generalised singular value decomposition (GSVD) has been used in the literature for linear discriminant analysis (LDA) to solve the small sample size problem in pattern recognition. However, this method, commonly known as the LDA/GSVD algorithm, suffers from an excessive computational load when the sample dimension is high. Here, the GSVD framework used in the LDA/GSVD algorithm is modified by replacing the SVD of a high-dimensional matrix with the eigen-decomposition of a small inner-product matrix, thus circumventing the direct calculation of a high-dimensional singular vector matrix. It is established by a theorem that, if the samples are linearly independent in the feature space, the samples in each class degenerate into a distinct single point of a discriminative space derived from the GSVD-based algorithms, and that the distances between these points depend only on the numbers of samples in the corresponding classes. To overcome the over-fitting problem, a method to orthogonalise the basis of the discriminative subspace is proposed. The proposed linear algorithm is kernelised for the discriminant analysis of samples that are not linearly independent, since the non-linear kernel mapping can establish linear independence. The results of the above theorem are used to develop a measure of the numerical error; this measure can also be used to choose the kernel parameters that minimise the numerical error in the non-linear algorithm.
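The core computational idea of the modification, replacing the SVD of a high-dimensional matrix with the eigen-decomposition of a small inner-product (Gram) matrix, can be illustrated as follows. This is a minimal NumPy sketch, not the paper's full algorithm: for a tall matrix H of size d × n with d ≫ n, the eigenvectors of the n × n matrix HᵀH yield the right singular vectors, and the left singular vectors are then recovered by a single matrix product, so the d × d side of the SVD is never formed. The matrix names and sizes here are illustrative assumptions.

```python
import numpy as np

# Illustrative setup: a tall data matrix H (d x n, d >> n),
# as arises when sample dimension far exceeds sample count.
rng = np.random.default_rng(0)
d, n = 1000, 5
H = rng.standard_normal((d, n))

# Eigen-decompose the small n x n inner-product (Gram) matrix
# instead of running an SVD on the d x n matrix directly.
G = H.T @ H
w, V = np.linalg.eigh(G)          # eigenvalues in ascending order
idx = np.argsort(w)[::-1]         # reorder to descending
w, V = w[idx], V[:, idx]
s = np.sqrt(w)                    # singular values of H

# Recover the leading left singular vectors: U = H V diag(1/s).
# Only an n x n eigenproblem and one d x n product were needed.
U = H @ V / s

# Sanity check against a direct (reduced) SVD of H.
U_ref, s_ref, _ = np.linalg.svd(H, full_matrices=False)
assert np.allclose(s, s_ref)
# Columns agree up to sign, so |<u_i, u_ref_i>| should be 1.
assert np.allclose(np.abs(np.sum(U * U_ref, axis=0)), 1.0)
```

The cost of the eigen-decomposition is O(n³) rather than the O(d n²) or worse of a direct SVD pipeline on H, which is the source of the speed-up when d is large; this sketch assumes H has full column rank so that all singular values are non-zero.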