
New nonleast-squares neural network learning algorithms for hypothesis testing


Authors: D. A. Pados (Department of Electrical Engineering, University of Virginia, Charlottesville, VA, USA) and P. Papantoni-Kazakos

Hypothesis testing is a collective name for problems such as classification, detection, and pattern recognition. In this paper, we propose two new classes of supervised learning algorithms for feedforward, binary-output neural network structures whose objective is hypothesis testing. All the algorithms are applications of stochastic approximation and are guaranteed to converge to an optimal design with probability one. The first class of algorithms follows the Neyman-Pearson approach: it maximizes the probability of detection subject to a given false-alarm constraint, and it produces layer-by-layer optimal Neyman-Pearson designs. The second class of algorithms minimizes the probability of error and leads to layer-by-layer Bayes-optimal designs. Relaxing the layer-by-layer optimization assumption, we then propose more powerful learning techniques that, in a sense, unify the preceding algorithms. The proposed algorithms were implemented and tested on a simulated hypothesis testing problem; backpropagation and perceptron learning were included in the comparisons.
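The abstract states the two objectives (Neyman-Pearson: maximize the detection probability P_D subject to a false-alarm constraint P_FA <= alpha; Bayes: minimize the probability of error) but not the update rules themselves. As a rough, hypothetical sketch only, and not the paper's algorithms, the following Python fragment illustrates the general stochastic-approximation idea on a toy Gaussian problem: a Robbins-Monro step holds the false-alarm rate of a single linear binary-output neuron near alpha while a perceptron-style correction raises detection. The data model, step sizes, and both update rules are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
m = np.array([1.0, 1.0])  # assumed signal mean: H0 ~ N(0, I), H1 ~ N(m, I)

def sample(h1):
    """Draw one 2-D feature vector under H1 (h1=True) or H0 (h1=False)."""
    return rng.normal(size=2) + (m if h1 else 0.0)

alpha = 0.05     # target false-alarm rate (the Neyman-Pearson constraint)
w = np.zeros(2)  # weights of a single binary-output neuron
t = 0.0          # decision threshold, adapted by Robbins-Monro

for k in range(1, 50_001):
    a_k = 1.0 / k  # diminishing step size, as stochastic approximation requires
    x0, x1 = sample(False), sample(True)
    # Drive the empirical false-alarm rate of the test "w @ x > t" toward alpha.
    t += a_k * (float(w @ x0 > t) - alpha)
    # Missed detection under H1: perceptron-style weight correction.
    if w @ x1 <= t:
        w += a_k * x1

# Decide H1 whenever w @ x > t; estimate the operating point empirically.
tests0 = np.array([sample(False) for _ in range(10_000)])
tests1 = np.array([sample(True) for _ in range(10_000)])
print("P_FA ~", np.mean(tests0 @ w > t), " P_D ~", np.mean(tests1 @ w > t))

The printed false-alarm rate should settle near alpha while the detection rate reflects the learned weights; the paper's actual algorithms extend this kind of constrained stochastic update to multilayer, layer-by-layer designs.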

Published in: IEEE Transactions on Neural Networks (Volume 6, Issue 3)