
Good weights and hyperbolic kernels for neural networks, projection pursuit, and pattern classification: Fourier strategies for extracting information from high-dimensional data

Author: Jones, L.K., Dept. of Math. Sci., Massachusetts Univ., Lowell, MA, USA

Fourier approximation and estimation of discriminant, regression, and density functions are considered. A preference order is established for the frequency weights in multiple Fourier expansions and the connection weights in single hidden-layer neural networks. These preferred weight vectors, called good weights (good lattice weights for estimation of periodic functions), generalize to arbitrary periods the hyperbolic lattice points of Korobov (1959) and Hlawka (1962) associated with classes of smooth functions of period one in each variable. Although previous results on approximation and quadrature are affinely invariant to the scale of the underlying periods, some of our results deal with optimization over finite sets and depend strongly on the choice of scale. It is shown how to count and generate good lattice weights. Finite-sample bounds on mean integrated squared error are calculated for ridge estimates of periodic pattern class densities. The bounds are combined with a table of cardinalities of good lattice weight sets to furnish classifier design with prescribed class density estimation errors. Applications are presented for neural networks and projection pursuit. A hyperbolic kernel gradient transform is developed which automatically determines the training weights (projection directions), and its sampling properties are discussed. Algorithms are presented for generating good weights for projection pursuit.
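The abstract states that good lattice weights, which generalize the hyperbolic lattice points of Korobov and Hlawka, can be counted and generated. As a rough illustration only (not the paper's algorithm), here is a minimal sketch that enumerates the classical period-one hyperbolic lattice points, assuming the standard definition: integer frequency vectors k in Z^d whose hyperbolic weight prod_j max(1, |k_j|) is at most N. The function name and brute-force search are this sketch's own choices.

```python
from itertools import product

def hyperbolic_lattice_points(d, N):
    """Enumerate integer vectors k in Z^d with prod(max(1, |k_j|)) <= N.

    This is the classical (period-one) hyperbolic lattice of
    Korobov/Hlawka; the paper's good lattice weights generalize
    such sets to arbitrary periods.
    """
    points = []
    # Any admissible coordinate satisfies |k_j| <= N, so a direct
    # product search over {-N, ..., N}^d suffices for small d and N.
    for k in product(range(-N, N + 1), repeat=d):
        weight = 1
        for kj in k:
            weight *= max(1, abs(kj))
        if weight <= N:
            points.append(k)
    return points
```

For example, in two dimensions with N = 2 the admissible set consists of the 9 vectors with both coordinates in {-1, 0, 1} plus the 12 vectors with one coordinate equal to ±2, for 21 points in all; the rapid thinning of this set as d grows is what makes hyperbolic frequency sets attractive for high-dimensional Fourier estimation.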

Published in:

IEEE Transactions on Information Theory (Volume: 40, Issue: 2)