Principal component analysis (PCA) has proven very fruitful for extracting the most useful information from a given sequence of observations, and many PCA methods can be found in the literature. In this work, we concentrate on deriving optimal learning rates for several well-known adaptive PCA algorithms. A detailed derivation procedure is described, resulting in closed-form formulae for the optimal learning rates. These optimal rates are obtained as solutions of quadratic or cubic equations that can be solved analytically, so no numerical procedures are needed. The key advantage of the optimal learning rate is that it provides a principled mechanism for automatically adjusting the learning stepsize.
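To fix ideas, the following sketch shows a standard adaptive PCA update (Oja's rule for the first principal component) with a fixed learning rate `eta`. This is a generic illustration, not the paper's method: the closed-form optimal rates derived in this work would replace the constant stepsize with one recomputed at each iteration. The function name `oja_pca` and all parameter values are illustrative assumptions.

```python
import numpy as np

def oja_pca(X, eta=0.01, n_epochs=50, seed=0):
    """Estimate the first principal component with Oja's rule.

    Uses a fixed learning rate `eta`; an optimal-learning-rate scheme
    would instead adapt the stepsize automatically at each update.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(n_epochs):
        for x in X:
            y = w @ x                   # projection onto current estimate
            w += eta * y * (x - y * w)  # Oja's Hebbian update with decay
            w /= np.linalg.norm(w)      # keep the weight vector unit-length
    return w

# Usage: compare against the leading eigenvector of the sample covariance.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 5)) @ np.diag([3.0, 1.0, 0.5, 0.3, 0.1])
w = oja_pca(X)
_, V = np.linalg.eigh(np.cov(X, rowvar=False))
print(abs(w @ V[:, -1]))  # alignment with the top eigenvector, close to 1
```

With a fixed stepsize, convergence speed is sensitive to the choice of `eta`; the motivation for closed-form optimal learning rates is precisely to remove this manual tuning.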