Approximations of continuous functionals by neural networks with application to dynamic systems

Authors: Tianping Chen (Inst. of Math., Fudan Univ., Shanghai, China); Hong Chen

The paper gives several strong results on neural network representation in an explicit form. Under very mild conditions, a functional defined on a compact set in C[a, b] or L^p[a, b], which are infinite-dimensional spaces, can be approximated arbitrarily well by a neural network with one hidden layer. The results significantly extend earlier work, in which theorems were given for approximating continuous functions defined on a finite-dimensional real space by neural networks with one hidden layer. All the results are shown to be applicable to the approximation of the output of dynamic systems at any particular time.
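As a rough illustration of the kind of result described (a sketch based on the abstract, not the paper's exact statement or notation), such an approximation typically samples the input function u at finitely many points x_1, ..., x_m in [a, b] and passes the samples through a single hidden layer: for a continuous functional f on a compact set V and any ε > 0, one seeks an integer N, weights c_i, ξ_ij, biases θ_i, and an activation function σ such that

\[
\left|\, f(u) - \sum_{i=1}^{N} c_i \,\sigma\!\left( \sum_{j=1}^{m} \xi_{ij}\, u(x_j) + \theta_i \right) \right| < \varepsilon
\qquad \text{for all } u \in V.
\]

Here N, m, c_i, ξ_ij, θ_i, x_j, and σ are illustrative symbols rather than the paper's own notation. In the dynamic-systems application, f would be the map from an input signal on [a, b] to the system's output at one fixed time.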

Published in: IEEE Transactions on Neural Networks (Volume: 4, Issue: 6)