Effectiveness of feature extraction in neural network architectures for novelty detection

Author(s):

Addison, J.F.D. (Sch. of Comput. Eng. & Technol., Sunderland Univ., UK); Wermter, S.; MacIntyre, J.

Abstract:

This paper examines the performance of seven neural network architectures in classifying and detecting novel events contained within data collected from turbine sensors. Several multilayer perceptrons were built and trained using backpropagation, conjugate gradient and quasi-Newton training algorithms. In addition, linear networks, radial basis function networks, probabilistic networks and Kohonen self-organising feature maps were built and trained, with the objective of discovering the most appropriate architecture. Because the input set encountered in practice is large, feature extraction is examined as a means of reducing the number of input features; the techniques considered are stepwise linear regression and a genetic algorithm. The experiments demonstrate improved classification performance for the multilayer perceptron, Kohonen and probabilistic networks, using either genetic algorithms or stepwise linear regression, compared with the other architectures considered in this work. In addition, stepwise linear regression performed better than the genetic algorithm for feature extraction. For classification problems involving a clear two-class structure, we consider a synthesis of stepwise linear regression with any of the architectures listed above to offer demonstrable improvements in performance for important real-world tasks.
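
As an illustration only (not code from the paper), the sketch below shows the kind of pipeline the abstract describes: forward stepwise selection driven by a linear model to reduce the input features, followed by a multilayer perceptron classifier on the reduced set. The synthetic dataset, feature counts and network sizes are assumptions made for the example, and scikit-learn stands in for whatever tooling the authors actually used.

from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the turbine sensor data: a two-class problem
# with many candidate input features, most of them uninformative.
X, y = make_classification(n_samples=1000, n_features=40, n_informative=8,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Forward stepwise selection driven by a linear model, a rough analogue of
# the stepwise linear regression the paper uses for feature extraction.
selector = SequentialFeatureSelector(LogisticRegression(max_iter=1000),
                                     n_features_to_select=8,
                                     direction="forward").fit(X_train, y_train)

# Multilayer perceptron classifier trained on the reduced feature set.
mlp = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
mlp.fit(selector.transform(X_train), y_train)
print("test accuracy:", mlp.score(selector.transform(X_test), y_test))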

Published in:

Ninth International Conference on Artificial Neural Networks (ICANN 99), 1999 (Conf. Publ. No. 470), Volume 2

Date of Conference:

1999