Standard linear discriminant analysis (LDA) is known to be computationally expensive due to the need to perform eigen-analysis. Based on the recent success of least-squares LDA (LSLDA), we propose a novel rank-one update method for LSLDA, which not only alleviates the computation and memory requirements but is also able to solve the adaptive learning task under concept drift. In other words, our proposed LSLDA can efficiently capture the information from recently received data with gradual or abrupt changes in distribution. Moreover, our LSLDA can be extended to recognize data with newly added class labels during the learning process, and thus exhibits excellent scalability. Experimental results on both synthetic and real datasets confirm the effectiveness of our proposed method.
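To illustrate the general mechanism behind rank-one updating of a least-squares solution (the abstract does not give the paper's exact algorithm, so this is only a sketch): when a new sample arrives, the inverse scatter matrix can be refreshed via the Sherman-Morrison identity instead of being recomputed, as in recursive least squares. All function and variable names below are illustrative assumptions, and the optional forgetting factor `lam` is one common way to discount old samples under gradual drift.

```python
import numpy as np

def rank_one_update(P, W, x, y, lam=1.0):
    """Sherman-Morrison (RLS-style) update of P = (X^T X)^{-1} and the
    least-squares weights W when one new sample (x, y) arrives.
    lam < 1 is a forgetting factor that down-weights old samples,
    one common way to track gradual concept drift."""
    x = x.reshape(-1, 1)            # column vector, shape (d, 1)
    Px = P @ x
    k = Px / (lam + x.T @ Px)       # gain vector, shape (d, 1)
    P_new = (P - k @ Px.T) / lam    # updated inverse, no matrix inversion
    err = y - (x.T @ W).ravel()     # prediction error on the new sample
    W_new = W + k @ err.reshape(1, -1)
    return P_new, W_new

# Toy check: incremental updates match the batch (ridge-regularized)
# least-squares solution when lam = 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Y = rng.normal(size=(50, 2))        # e.g. class-indicator targets

eps = 1e-6                          # tiny ridge so the initial inverse exists
P = np.eye(3) / eps
W = np.zeros((3, 2))
for x, y in zip(X, Y):
    P, W = rank_one_update(P, W, x, y)

W_batch = np.linalg.solve(X.T @ X + eps * np.eye(3), X.T @ Y)
print(np.allclose(W, W_batch, atol=1e-4))  # → True
```

Each update costs O(d²) per sample rather than the O(d³) of refactorizing, which is the kind of saving that makes adaptive least-squares formulations attractive compared with repeated eigen-analysis.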