Kernel Maximum Autocorrelation Factor and Minimum Noise Fraction Transformations


Author: Allan Aasbjerg Nielsen, DTU Informatics—Informatics and Mathematical Modelling, Richard Petersens Plads, Lyngby, Denmark

Abstract:

This paper introduces kernel versions of maximum autocorrelation factor (MAF) analysis and minimum noise fraction (MNF) analysis. The kernel versions are based upon a dual formulation, also termed Q-mode analysis, in which the data enter into the analysis via inner products in the Gram matrix only. In the kernel version, the inner products of the original data are replaced by inner products between nonlinear mappings into a higher-dimensional feature space. Via kernel substitution, also known as the kernel trick, these inner products between the mappings are in turn replaced by a kernel function, and all quantities needed in the analysis are expressed in terms of this kernel function. This means that we need not know the nonlinear mappings explicitly. Kernel principal component analysis (PCA), kernel MAF, and kernel MNF analyses handle nonlinearities by implicitly transforming data into a high- (even infinite-) dimensional feature space via the kernel function and then performing a linear analysis in that space. Three examples show the very successful application of kernel MAF/MNF analysis to: 1) change detection in DLR 3K camera data recorded 0.7 s apart over a busy motorway; 2) change detection in hyperspectral HyMap scanner data covering a small agricultural area; and 3) maize kernel inspection. In the cases shown, the kernel MAF/MNF transformation performs better than its linear counterpart as well as linear and kernel PCA. The leading kernel MAF/MNF variates seem to possess the ability to adapt to even abruptly varying multi- and hypervariate backgrounds and to focus on extreme observations.
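The kernel substitution described in the abstract can be illustrated with kernel PCA, the baseline method the paper compares against: all quantities are computed from a centered Gram matrix rather than from explicit feature-space mappings. The sketch below is a minimal illustration under assumed choices (a Gaussian RBF kernel and an arbitrary bandwidth), not an implementation of the paper's kernel MAF/MNF method:

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel: k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    # The nonlinear mapping is never formed; only inner products are used.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def kernel_pca(X, n_components=2, sigma=1.0):
    # Gram matrix of implicit feature-space inner products
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    # Center in feature space: K_c = (I - 11^T/n) K (I - 11^T/n)
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # Eigendecomposition of the centered Gram matrix (dual / Q-mode form)
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    w, V = w[idx], V[:, idx]
    # Scale dual coefficients so feature-space eigenvectors have unit norm
    alphas = V / np.sqrt(np.maximum(w, 1e-12))
    # Projections of the training data onto the leading components
    return Kc @ alphas
```

Kernel MAF/MNF follows the same dual pattern but solves a generalized eigenproblem that also involves a Gram matrix of (spatial or noise) differences, which is what lets its leading variates emphasize change rather than mere variance.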

Published in:

IEEE Transactions on Image Processing (Volume: 20, Issue: 3)