Nonlinear feature transforms using maximum mutual information

Author: K. Torkkola (Motorola Inc., MD, USA)

Finding the right features is an essential part of a pattern recognition system. This can be accomplished either by selection or by a transform from a larger number of "raw" features. In this work we learn nonlinear dimension-reducing discriminative transforms that are implemented as neural networks, either as radial basis function networks or as multilayer perceptrons. As the criterion, we use the joint mutual information (MI) between the class labels of the training data and the transformed features. Our measure of MI makes use of Rényi entropy as formulated by Principe et al. (1998, 2000). The resulting low-dimensional features enable a classifier to operate with less computation and memory without compromising accuracy.
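The criterion sketched below is a plausible reading of the abstract, not the paper's exact formulation: a Parzen-window estimate of the quadratic (Rényi-entropy-based) mutual information between a continuous projection and discrete class labels, maximized here by simple numerical gradient ascent on a linear map rather than the RBF/MLP networks the paper uses. All function names, the kernel width, and the toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss(d2, sig2):
    # Isotropic Gaussian kernel evaluated on squared distances.
    return np.exp(-d2 / (2.0 * sig2))

def quadratic_mi(Y, labels, sig2=1.0):
    # Parzen-window quadratic MI between features Y (N x d) and
    # discrete labels: V_in + V_all - 2*V_btw, a squared divergence
    # between the joint density and the product of marginals.
    N = len(Y)
    d2 = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    K = gauss(d2, 2 * sig2)  # kernel convolved with itself -> width 2*sig2
    classes, counts = np.unique(labels, return_counts=True)
    p = counts / N
    v_in = sum(K[np.ix_(labels == c, labels == c)].sum()
               for c in classes) / N**2
    v_all = (p ** 2).sum() * K.sum() / N**2
    v_btw = sum(pc * K[labels == c].sum()
                for pc, c in zip(p, classes)) / N**2
    return v_in + v_all - 2.0 * v_btw

# Toy data: two 5-D Gaussian classes that differ only in dimension 0.
X = np.vstack([rng.normal(0, 1, (40, 5)),
               rng.normal(0, 1, (40, 5)) + [3, 0, 0, 0, 0]])
y = np.array([0] * 40 + [1] * 40)

# A linear 5 -> 2 transform trained by numerical gradient ascent on MI
# (a stand-in for backpropagating the MI criterion through a network).
W = rng.normal(0, 0.1, (5, 2))
obj = lambda W: quadratic_mi(X @ W, y)
eps, lr = 1e-4, 1.0
for _ in range(30):
    base, g = obj(W), np.zeros_like(W)
    for i in range(W.size):
        Wp = W.copy()
        Wp.flat[i] += eps
        g.flat[i] = (obj(Wp) - base) / eps
    W += lr * g

# The criterion rewards class-discriminative projections:
mi_good = quadratic_mi(X[:, :1], y)   # projection onto the informative axis
mi_bad = quadratic_mi(X[:, 1:2], y)   # projection onto a noise axis
print(mi_good, mi_bad)
```

For a collapsed map (all points identical) the estimate is exactly zero, so any separation the transform achieves registers as positive MI; in the paper this criterion is instead ascended with respect to the weights of an RBF network or MLP.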

Published in:

Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), Volume 4

Date of Conference: