Finite word length computational effects of the principal component analysis networks

Authors: T. Szabo (Dept. of Meas. & Inf. Syst., Tech. Univ. Budapest, Hungary); G. Horvath

This paper deals with some of the effects of finite-precision data representation and arithmetic in principal component analysis (PCA) neural networks. PCA networks are single-layer linear neural networks that use some version of Oja's learning rule. The paper concentrates on the effects of premature convergence or early termination of the learning process, and determines an approximate analytical expression for the lower limit of the learning-rate parameter. If the learning rate is selected below this limit, which depends on the statistical properties of the input data and on the quantum size used in the finite-precision arithmetic, convergence will slow down significantly or the learning process will stop before reaching the proper weight vector.
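The effect the abstract describes can be illustrated with a minimal sketch: Oja's single-neuron rule with the weight vector re-quantized after every update. The rule itself is standard; the quantum size, learning rates, and input statistics below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def quantize(w, q):
    """Round each weight to the nearest multiple of the quantum size q,
    modeling finite-precision storage of the weight vector."""
    return np.round(w / q) * q

def train_oja(X, w0, mu, q):
    """Run Oja's single-neuron PCA rule, w += mu * y * (x - y * w),
    quantizing the weight vector after every update."""
    w = quantize(w0, q)
    for x in X:
        y = w @ x                              # neuron output
        w = quantize(w + mu * y * (x - y * w), q)
    return w

rng = np.random.default_rng(0)
# Zero-mean 2-D input whose principal component lies along [1, 0].
X = rng.multivariate_normal([0.0, 0.0], np.diag([4.0, 0.25]), size=5000)
w0 = np.array([0.6, 0.8])
q = 2.0 ** -6                                  # assumed quantum size

w_ok      = train_oja(X, w0, mu=1e-2, q=q)     # learning rate above the limit
w_stalled = train_oja(X, w0, mu=1e-4, q=q)     # learning rate below the limit
```

With `mu = 1e-2` the weight vector rotates toward the principal direction `[1, 0]`; with `mu = 1e-4` the per-component updates stay below `q / 2`, so quantization rounds them away and the weights barely move from their starting point, which is the premature-termination behavior the paper analyzes.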

Published in:

IEEE Transactions on Instrumentation and Measurement (Volume: 47, Issue: 5)