RKHS Bayes Discriminant: A Subspace Constrained Nonlinear Feature Projection for Signal Detection

2 Author(s):
Ozertem, U. (Yahoo! Labs, Santa Clara, CA, USA); Erdogmus, D.

Abstract:
Given knowledge of the class probability densities, the a priori probabilities, and the relative risk levels, the Bayes classifier provides the optimal minimum-risk decision rule. In the two-class (detection) scenario, under certain symmetry assumptions, matched filters provide optimal results for the detection problem. Noticing that the Bayes classifier is in fact a nonlinear projection of the feature vector onto a single-dimensional statistic, in this paper we develop a smooth nonlinear projection filter that, like the Bayes classifier, is constrained to the estimated span of the class-conditional distributions. The nonlinear projection filter is designed in a reproducing kernel Hilbert space, leading to an analytical solution for both the filter and the optimal threshold. The proposed approach is tested on typical detection problems, such as neural spike detection and automatic target detection in synthetic aperture radar (SAR) imagery. Results are compared with linear and kernel discriminant analysis, as well as with classification algorithms such as support vector machines, AdaBoost, and LogitBoost.
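The abstract describes reducing detection to a single-dimensional nonlinear statistic built from kernel evaluations. The sketch below is an illustration only, not the paper's subspace-constrained RKHS solution: it assumes Gaussian-kernel Parzen estimates of the two class-conditional densities and forms a posterior log-odds statistic thresholded at zero; the bandwidth `sigma`, the priors, and all function names are assumptions introduced for this example.

```python
import numpy as np

def gaussian_kernel(x, centers, sigma):
    """Isotropic Gaussian kernel evaluations of point x against a set of centers."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kde(x, samples, sigma):
    """Parzen (kernel density) estimate of a class-conditional density at x."""
    d = samples.shape[1]
    norm = (2.0 * np.pi * sigma ** 2) ** (d / 2.0)
    return gaussian_kernel(x, samples, sigma).mean() / norm

def bayes_projection(x, class0, class1, prior0, prior1, sigma):
    """Single-dimensional nonlinear statistic: posterior log-odds of class 1.

    Declaring class 1 when the statistic exceeds 0 mimics the minimum-error
    Bayes rule under the Parzen density estimates (illustrative assumption).
    """
    p0 = kde(x, class0, sigma) * prior0
    p1 = kde(x, class1, sigma) * prior1
    eps = 1e-300  # guard against log(0)
    return np.log(p1 + eps) - np.log(p0 + eps)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy two-class detection problem: noise-only vs. signal-plus-noise samples.
    noise = rng.normal(0.0, 1.0, size=(200, 2))
    signal = rng.normal(1.5, 1.0, size=(200, 2))
    x_test = np.array([1.2, 1.4])
    stat = bayes_projection(x_test, noise, signal, prior0=0.5, prior1=0.5, sigma=0.8)
    print("projection statistic:", stat, "-> detect" if stat > 0 else "-> reject")
```

Unlike this sketch, the paper's filter is obtained analytically in the RKHS, together with its optimal threshold, under the constraint to the estimated span of the class-conditional distributions.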

Published in:

IEEE Transactions on Neural Networks (Volume 20, Issue 7)