Successive-least-squares error algorithm on minimum description length neural networks for time series prediction

2 Author(s)
Yu Ning Lai; Shiu Yin Yuen (City Univ. of Hong Kong, China)

A successive least-squares approach is proposed to find an optimal model of a flat neural network in a short period of time. It is based on a minimum description length (MDL) neural network that uses the MDL principle as the stopping criterion. Unlike conventional algorithms for flat neural networks, which apply the least-squares technique only to the weights between the hidden layer and the output layer, it extends the technique to the weights between the input layer and the hidden layer. We apply this algorithm to the chaotic Mackey-Glass time series and the chaotic laser time series. The results show that it provides satisfactory prediction within a small amount of time.
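To illustrate the baseline the abstract contrasts against, the sketch below shows the conventional step for a flat (single-hidden-layer) network: the input-to-hidden weights are fixed (here, drawn at random), and only the hidden-to-output weights are solved by least squares. This is a minimal illustration, not the authors' successive algorithm; the toy noisy-sine series stands in for the Mackey-Glass data, and all names (`W_in`, `W_out`, `lags`, `hidden`) are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy series: a noisy sine stands in for the chaotic Mackey-Glass data here.
t = np.arange(500)
series = np.sin(0.3 * t) + 0.1 * rng.standard_normal(500)

# Build lagged inputs: predict x[n] from the previous `lags` samples.
lags = 4
X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
y = series[lags:]

# Flat network: fixed random input-to-hidden weights, tanh hidden layer.
hidden = 20
W_in = rng.standard_normal((lags, hidden))
b = rng.standard_normal(hidden)
H = np.tanh(X @ W_in + b)  # hidden-layer activations, shape (samples, hidden)

# Least-squares solve for the hidden-to-output weights only.
W_out, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = H @ W_out
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"training RMSE: {rmse:.4f}")
```

The proposed algorithm goes further by also refitting the input-layer weights `W_in` with least squares, rather than leaving them fixed, and stops growing the model when the MDL criterion is minimized.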

Published in:

Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), Volume 4

Date of Conference:

23-26 Aug. 2004