
Minimum description length criterion for modeling of chaotic attractors with multilayer perceptron networks

Authors: Zhao Yi (Department of Electronic and Information Engineering, The Hong Kong Polytechnic University); Michael Small

Overfitting has long been recognized as a problem endemic to models with a large number of parameters. The usual way to avoid this problem in neural networks is to stop short of fitting the data too precisely, but this technique cannot determine the appropriate model size directly. In this paper, we describe an alternative, information-theoretic criterion for determining the number of neurons in the optimal model. When applied to the time series prediction problem, we find that models which minimize the description length (DL) of the data both generalize well and accurately capture the underlying dynamics. We illustrate our method with several computational and experimental examples.
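As a rough illustration of this kind of model-size selection (not the authors' exact formulation), the sketch below chooses the number of hidden neurons for a one-step-ahead predictor of a chaotic series by minimizing a simplified two-part description length. The logistic-map data, the embedding dimension, and the use of scikit-learn's MLPRegressor are all assumptions made for the example.

```python
# Minimal sketch: select hidden-layer size by minimizing an approximate
# two-part description length, DL(k) ~ (N/2) ln(RSS/N) + (k/2) ln(N),
# where k counts the network's weights and biases. This is a standard
# BIC-style approximation, not the paper's exact criterion.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Generate a chaotic time series (logistic map, r = 4) -- illustrative data.
x = np.empty(1200)
x[0] = 0.3
for t in range(1199):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

# Delay embedding: predict x[t] from the previous d samples.
d = 3
X = np.column_stack([x[i:len(x) - d + i] for i in range(d)])
y = x[d:]
N = len(y)

best = None
for h in range(1, 11):                      # candidate numbers of hidden neurons
    net = MLPRegressor(hidden_layer_sizes=(h,), activation="tanh",
                       max_iter=5000, random_state=0)
    net.fit(X, y)
    rss = float(np.sum((net.predict(X) - y) ** 2))
    k = d * h + h + h + 1                   # weights + biases of a d-h-1 network
    dl = 0.5 * N * np.log(rss / N) + 0.5 * k * np.log(N)
    if best is None or dl < best[0]:
        best = (dl, h)

print(f"hidden-layer size minimizing the approximate description length: {best[1]}")
```

The penalty term grows with the number of parameters, so adding neurons is accepted only when the improvement in fit outweighs the cost of describing the extra weights, which is the trade-off the DL criterion formalizes.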

Published in:

IEEE Transactions on Circuits and Systems I: Regular Papers (Volume: 53, Issue: 3)