A multiscale expectation-maximization semisupervised classifier suitable for badly posed image classification

Authors:

A. Baraldi (ISSIA-CNR, Bari), L. Bruzzone, P. Blonda

Abstract:

This paper deals with the problem of badly posed image classification. Although underestimated in practice, bad-posedness is likely to affect many real-world image classification tasks, where reference samples are difficult to collect (e.g., in remote sensing (RS) image mapping) and/or spatial autocorrelation is relevant. For image classification contexts affected by a lack of reference samples, an original inductive-learning multiscale image classifier, termed multiscale semisupervised expectation maximization (MSEM), is proposed. The rationale behind MSEM is to combine useful complementary properties of two alternative data mapping procedures recently published outside the image processing literature, namely, the multiscale modified Pappas adaptive clustering (MPAC) algorithm and the sample-based semisupervised expectation maximization (SEM) classifier. To demonstrate its potential utility, MSEM is compared against nonstandard classifiers, such as MPAC, SEM and the single-scale contextual SEM (CSEM) classifier, as well as against well-known standard classifiers, in two RS image classification problems featuring few reference samples and modestly useful texture information. These experiments yield weak (subjective) but numerous quantitative map quality indexes that are consistent with both theoretical considerations and qualitative evaluations by expert photointerpreters. According to these quantitative results, MSEM is competitive in terms of overall image mapping performance, at the cost of a computational overhead three to six times that of its most interesting rival, SEM. More generally, our experiments confirm that, even though they rely on strong class-conditional normal distribution assumptions that may not hold in many real-world problems (e.g., in highly textured images), semisupervised classifiers based on the iterative expectation-maximization Gaussian mixture model solution can be very powerful in practice when: 1) reference samples are scarce with respect to the problem/model complexity and 2) texture information is considered negligible (i.e., a piecewise constant image model holds).
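The abstract's closing point, that semisupervised classifiers built on the iterative expectation-maximization solution of a Gaussian mixture model can work well when reference samples are scarce, can be illustrated with a minimal sketch. The code below shows the generic semisupervised EM scheme for class-conditional Gaussians (labeled reference samples keep hard class assignments; unlabeled pixels contribute soft posteriors re-estimated at each iteration). It is an assumption-laden illustration of that general technique, not a reproduction of the paper's SEM or MSEM algorithms; the function names (semisupervised_em_gmm, log_gaussian) and parameters are hypothetical.

```python
# Minimal sketch of semisupervised EM for a class-conditional Gaussian mixture.
# Labeled (reference) samples keep fixed one-hot responsibilities; unlabeled
# samples contribute soft posteriors recomputed at every E-step.
import numpy as np

def log_gaussian(X, mean, cov):
    """Log-density of a multivariate normal evaluated at each row of X."""
    d = X.shape[1]
    diff = X - mean
    chol = np.linalg.cholesky(cov)
    solved = np.linalg.solve(chol, diff.T)            # whitened residuals
    log_det = 2.0 * np.log(np.diag(chol)).sum()
    return -0.5 * (d * np.log(2 * np.pi) + log_det + (solved ** 2).sum(axis=0))

def semisupervised_em_gmm(X_lab, y_lab, X_unl, n_classes, n_iter=50, reg=1e-6):
    d = X_lab.shape[1]
    # Initialization of priors, means and covariances from labeled samples only.
    priors = np.zeros(n_classes)
    means = np.zeros((n_classes, d))
    covs = np.zeros((n_classes, d, d))
    for k in range(n_classes):
        Xk = X_lab[y_lab == k]
        priors[k] = len(Xk) / len(X_lab)
        means[k] = Xk.mean(axis=0)
        covs[k] = np.cov(Xk, rowvar=False) + reg * np.eye(d)

    R_lab = np.eye(n_classes)[y_lab]                  # fixed hard assignments

    for _ in range(n_iter):
        # E-step: class posteriors of the unlabeled samples.
        log_post = np.stack(
            [log_gaussian(X_unl, means[k], covs[k]) + np.log(priors[k])
             for k in range(n_classes)], axis=1)
        log_post -= log_post.max(axis=1, keepdims=True)   # numerical stability
        R_unl = np.exp(log_post)
        R_unl /= R_unl.sum(axis=1, keepdims=True)

        # M-step: re-estimate parameters from labeled + unlabeled samples.
        X_all = np.vstack([X_lab, X_unl])
        R_all = np.vstack([R_lab, R_unl])
        Nk = R_all.sum(axis=0)
        priors = Nk / len(X_all)
        means = (R_all.T @ X_all) / Nk[:, None]
        for k in range(n_classes):
            diff = X_all - means[k]
            covs[k] = (R_all[:, k, None] * diff).T @ diff / Nk[k] + reg * np.eye(d)

    return priors, means, covs
```

After fitting, each pixel feature vector would be assigned to the class maximizing its log-likelihood plus log prior. Note that this sketch assumes the piecewise constant, texture-free image model under which the abstract reports such classifiers to perform well; it includes no spatial or multiscale context of the kind MPAC, CSEM and MSEM add.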

Published in:

IEEE Transactions on Image Processing (Volume: 15, Issue: 8)

Date of Publication:

Aug. 2006
