Advanced search algorithms for information-theoretic learning with kernel-based estimators

2 Author(s)
R. A. Morejon and J. C. Principe, Computational NeuroEngineering Laboratory, University of Florida, Gainesville, FL, USA

Recent publications have proposed various information-theoretic learning (ITL) criteria, based on Rényi's quadratic entropy with nonparametric kernel-based density estimation, as alternative performance metrics for both supervised and unsupervised adaptive-system training. These metrics, built on entropy and mutual information, account for higher-order statistics, unlike the mean-square error (MSE) criterion. Their drawback is increased computational complexity, which underscores the importance of efficient training algorithms. In this paper, we examine familiar advanced parameter-search algorithms and propose modifications that allow systems to be trained with these ITL criteria. The well-known algorithms tailored here for ITL include several improved gradient-descent methods, conjugate-gradient approaches, and the Levenberg-Marquardt (LM) algorithm. Sample problems and metrics illustrate the computational efficiency attained by employing the proposed algorithms.
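
The core quantity behind these criteria can be made concrete. Rényi's quadratic entropy is H2(X) = -log ∫ p^2(x) dx, and with a Parzen (kernel) density estimate built from Gaussian kernels the integral reduces in closed form to a double sum over sample pairs, the so-called information potential. The sketch below is a minimal illustration of that standard estimator, not code from the paper; the Gaussian kernel choice, the kernel width sigma, and all function names are assumptions made here for illustration.

```python
import numpy as np

def information_potential(samples, sigma=1.0):
    """Parzen estimate of the information potential V(X) = E[p(X)].

    With Gaussian kernels of width sigma, the estimate collapses to
    V(X) ~ (1/N^2) * sum_i sum_j G_{sigma*sqrt(2)}(x_i - x_j),
    because convolving two Gaussian kernels adds their variances.
    """
    x = np.asarray(samples, dtype=float).reshape(-1, 1)
    n = x.shape[0]
    var = 2.0 * sigma ** 2                      # combined kernel variance
    diffs = x - x.T                             # all pairwise differences
    gauss = np.exp(-diffs ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    return gauss.sum() / n ** 2                 # O(N^2) pairwise interaction

def renyi_quadratic_entropy(samples, sigma=1.0):
    """H2(X) = -log V(X); minimizing H2 of an error signal maximizes V."""
    return -np.log(information_potential(samples, sigma))

# Example: estimate the entropy of a batch of error samples.
rng = np.random.default_rng(seed=0)
errors = rng.normal(scale=0.5, size=200)        # stand-in for e = d - y
print(renyi_quadratic_entropy(errors, sigma=0.5))
```

The double sum makes every criterion evaluation O(N^2) in the number of samples, and entropy-based training must recompute it at each weight update. That per-evaluation cost is precisely what makes fast-converging parameter searches such as conjugate gradient and LM attractive for ITL.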

Published in:

IEEE Transactions on Neural Networks (Volume: 15, Issue: 4)