Information Theoretic Learning: Renyi's Entropy and Kernel Perspectives (Principe, J.; 2010) [Book Review]

2 Author(s)

This book, derived from Jose Principe and his group's ten years of research in information theory and statistical learning, gives a comprehensive introduction, analysis, and demonstration of nearly all the major components needed to understand and develop the emerging theme of information-theoretic learning. The author's basic strategy is to use information-theoretic descriptors (namely entropy and divergence, in contrast to the statistical measures of mean and covariance) as nonparametric cost functions for the design of adaptive systems, thereby creating a new paradigm of information theoretic learning. As in statistical learning, both unsupervised and supervised training modes are fully explored.
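To make the "entropy as a cost function" idea concrete, here is a minimal sketch of the kind of estimator the book builds on: Renyi's quadratic entropy estimated nonparametrically with a Gaussian (Parzen) kernel, whose double sum over samples is the book's "information potential". The kernel width `sigma` and the sample data are illustrative choices, not values from the review.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Estimate Renyi's quadratic entropy H2(X) = -log E[p(X)] from
    samples, using a Parzen density estimate with a Gaussian kernel.
    `sigma` is the kernel width (a free smoothing parameter)."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    n, d = x.shape
    # Pairwise squared distances between all samples.
    diff = x[:, None, :] - x[None, :, :]
    sq = np.sum(diff ** 2, axis=-1)
    # Convolving two Gaussians of width sigma yields width sigma*sqrt(2),
    # so the pairwise kernel has variance 2*sigma^2.
    s2 = 2.0 * sigma ** 2
    norm = (2.0 * np.pi * s2) ** (d / 2.0)
    # "Information potential": mean pairwise kernel value.
    ip = np.mean(np.exp(-sq / (2.0 * s2)) / norm)
    return -np.log(ip)

rng = np.random.default_rng(0)
h_narrow = renyi_quadratic_entropy(rng.normal(0.0, 0.5, 500), sigma=0.5)
h_wide = renyi_quadratic_entropy(rng.normal(0.0, 2.0, 500), sigma=0.5)
print(h_narrow < h_wide)  # a more spread-out sample has higher entropy
```

Because the estimator is a smooth function of the samples, its gradient with respect to an adaptive system's output can be back-propagated, which is what lets entropy replace mean-squared error as a training criterion.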

Published in:

IEEE Computational Intelligence Magazine (Volume: 6, Issue: 3)

Date of Publication:

Aug. 2011
