A fixed-point algorithm to minimax learning with neural networks

3 Author(s)
A. Guerrero-Curieses (Dpto. de Teoría de la Señal y Comunicaciones, Univ. Carlos de Madrid, Leganés-Madrid, Spain), R. Alaiz-Rodriguez, J. Cid-Sueiro

In some real applications, such as medical diagnosis or remote sensing, the available training data often do not reflect the true a priori probabilities of the underlying data distribution, and a classifier designed from such data may be suboptimal. Classifiers that are robust against changes in the prior probabilities can be built by applying a minimax learning strategy. In this paper, we propose a simple fixed-point algorithm that is able to train a neural minimax classifier, i.e., a classifier minimizing the worst (maximum) possible risk. Moreover, we present a new parametric family of loss functions that provides the most accurate estimates of the posterior class probabilities near the decision boundaries, and we discuss the use of these functions together with a minimax learning strategy. Experiments on several real databases demonstrate the ability of the proposed algorithm to find the minimax solution and to produce a classifier that remains robust when the true a priori probabilities differ from the estimated ones.
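The abstract does not spell out the fixed-point update itself, but the underlying idea (the minimax classifier equalizes the class-conditional risks, so the worst-case priors can be approached by shifting prior mass toward the class with the larger conditional risk and retraining) can be illustrated with a rough sketch. Everything below is an assumption for illustration: a prior-reweighted logistic classifier stands in for the neural network, and the damped update `p1 ← (1-α)·p1 + α·r1/(r0+r1)` is a heuristic fixed-point step, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data with an imbalanced training sample,
# mimicking training priors that need not match deployment priors.
n0, n1 = 400, 100
X = np.vstack([rng.normal(-1.0, 1.0, (n0, 2)),
               rng.normal(+1.0, 1.0, (n1, 2))])
y = np.hstack([np.zeros(n0), np.ones(n1)])

def train_weighted_logreg(X, y, p1, iters=300, lr=0.5):
    """Gradient-descent logistic regression with per-example weights
    chosen so the effective class priors are (1 - p1, p1)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    n = len(y)
    c1 = y.sum()
    c0 = n - c1
    # Reweight: each class contributes mass proportional to its target prior.
    sw = np.where(y == 1, p1 * n / c1, (1 - p1) * n / c0)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * (Xb.T @ (sw * (p - y))) / n
    return w

def class_risks(X, y, w):
    """Empirical 0/1 error rate conditioned on each class."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    pred = (Xb @ w > 0).astype(float)
    r0 = np.mean(pred[y == 0] != 0)
    r1 = np.mean(pred[y == 1] != 1)
    return r0, r1

# Heuristic fixed-point iteration on the prior p1 = P(class 1):
# retrain, then move prior mass toward the riskier class, so that the
# conditional risks are pushed toward the minimax equilibrium r0 = r1.
p1 = 0.5
for _ in range(25):
    w = train_weighted_logreg(X, y, p1)
    r0, r1 = class_risks(X, y, w)
    p1 = 0.7 * p1 + 0.3 * r1 / (r0 + r1 + 1e-12)  # damped update
```

At the fixed point the update leaves `p1` unchanged, which happens exactly when `r1/(r0 + r1) = p1`; the damping factor is only there to keep the iteration from oscillating on small samples.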

Published in:

IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews) (Volume: 34, Issue: 4)