
Simultaneous Pattern Classification and Multidomain Association Using Self-Structuring Kernel Memory Networks

Authors: Hoya, T. (Dept. of Math., Nihon Univ., Tokyo); Washizawa, Y.

In this paper, a novel exemplar-based constructive approach using kernels is proposed for simultaneous pattern classification and multidomain pattern association tasks. The kernel networks are constructed on a modular basis by a simple one-shot self-structuring algorithm motivated by the traditional Hebbian principle, and they then act as flexible memories capable of generalization for the respective classes. In the self-structuring kernel memory (SSKM), unlike conventional approaches, no arduous iterative tuning of network parameters is required to establish the weight connections during construction, and the networks are therefore considered free of the associated numerical instability. The approach is then extended to multidomain pattern association, in which an input from a particular domain can activate not only some kernel units (KUs) in its own domain but also kernels in other domain(s) via the cross-domain connections between them. The SSKM can thereby be regarded as a simultaneous pattern classifier and associator. In a simulation study of pattern classification, it is shown that an SSKM consisting of distinct kernel networks can yield relatively compact pattern classifiers while preserving reasonably high generalization capability, in comparison with an approach using support vector machines (SVMs).
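To make the one-shot, exemplar-based construction idea concrete, the following is a minimal sketch of a kernel memory classifier in that spirit: a new kernel unit is stored only when no existing same-class unit already responds strongly to the exemplar, and classification is winner-take-all over Gaussian kernel activations. This is an illustrative simplification, not the authors' exact SSKM algorithm; the threshold `theta` and kernel width `sigma` are assumed parameters.

```python
import numpy as np

class KernelMemorySketch:
    """Illustrative exemplar-based kernel classifier with one-shot
    construction (simplified; not the exact SSKM of the paper)."""

    def __init__(self, sigma=1.0, theta=0.9):
        self.sigma = sigma    # Gaussian kernel width (assumed parameter)
        self.theta = theta    # activation threshold for adding a new KU
        self.centers = []     # stored exemplars acting as kernel units
        self.labels = []      # class label attached to each kernel unit

    def _kernel(self, x, c):
        # Gaussian (RBF) activation of a kernel unit centred at c
        return np.exp(-np.sum((x - c) ** 2) / (2.0 * self.sigma ** 2))

    def fit(self, X, y):
        # Single pass over the exemplars: no iterative weight tuning.
        # A kernel unit is added only when no same-class unit already
        # fires above the threshold, keeping the network compact.
        for x, label in zip(X, y):
            x = np.asarray(x, dtype=float)
            same_class = [self._kernel(x, c)
                          for c, l in zip(self.centers, self.labels)
                          if l == label]
            if not same_class or max(same_class) < self.theta:
                self.centers.append(x)
                self.labels.append(label)
        return self

    def predict(self, x):
        # Winner-take-all over all kernel units
        acts = [self._kernel(np.asarray(x, dtype=float), c)
                for c in self.centers]
        return self.labels[int(np.argmax(acts))]
```

With well-separated toy data, nearby exemplars of the same class collapse onto a single kernel unit, which is the sense in which such networks can stay compact relative to storing every training point.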

Published in: IEEE Transactions on Neural Networks (Volume: 18, Issue: 3)