Distributed multitarget classification in wireless sensor networks

Authors: J. H. Kotecha, V. Ramachandran, A. M. Sayeed (Dept. of Electr. & Comput. Eng., Univ. of Wisconsin, Madison, WI, USA)

We study distributed strategies for classification of multiple targets in a wireless sensor network. The maximum number of targets is known a priori but the actual number of distinct targets present in any given event is assumed unknown. The target signals are modeled as zero-mean Gaussian processes with distinct temporal power spectral densities, and it is assumed that the noise-corrupted node measurements are spatially independent. The proposed classifiers have a simple distributed architecture: local hard decisions from each node are communicated over noisy links to a manager node which optimally fuses them to make the final decision. A natural strategy for local hard decisions is to use the optimal local classifier. A key problem with the optimal local classifier is that the number of hypotheses increases exponentially with the maximum number of targets. We propose two suboptimal (mixture density and Gaussian) local classifiers that are based on a natural but coarser repartitioning of the hypothesis space, resulting in linear complexity with the number of targets. We show that exponentially decreasing probability of error with the number of nodes can be guaranteed with an arbitrarily small but nonvanishing communication power per node. Numerical results based on real data demonstrate the remarkable practical advantage of decision fusion: an acceptably small probability of error can be attained by fusing a moderate number of unreliable local decisions. Furthermore, the performance of the suboptimal mixture density classifier is comparable to that of the optimal local classifier, making it an attractive choice in practice.
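The fusion principle described above can be illustrated with a simple sketch. The snippet below is not the paper's optimal fusion rule (which accounts for noisy links and likelihood-weighted combining); it assumes a hypothetical simplified setting in which each node's hard decision is independently wrong with probability p and the manager node takes a majority vote, which is enough to show the exponential decay of fused error with the number of nodes:

```python
from math import comb

def majority_fusion_error(p, n):
    """Exact probability that a majority vote over n independent local
    decisions, each wrong with probability p, is itself wrong (n odd).
    Sums the binomial tail: at least n//2 + 1 nodes must err."""
    k = n // 2 + 1
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# An unreliable local classifier (30% error) becomes reliable after fusion.
p_local = 0.3
for n in (1, 5, 15, 31):
    print(f"nodes = {n:2d}, fused error = {majority_fusion_error(p_local, n):.5f}")
```

Because the local errors are independent, the fused error decays roughly as exp(-c n) for p < 1/2, mirroring the paper's observation that an acceptably small probability of error can be attained by fusing a moderate number of unreliable local decisions.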

Published in:

IEEE Journal on Selected Areas in Communications (Volume 23, Issue 4)