Granular Neural Networks and Their Development Through Context-Based Clustering and Adjustable Dimensionality of Receptive Fields

3 Author(s)
Ho-Sung Park (Ind. Adm. Inst., Univ. of Suwon, Suwon, South Korea); Witold Pedrycz; Sung-Kwun Oh

In this study, we present a new architecture of a granular neural network and provide a comprehensive design methodology as well as an algorithmic setup supporting its development. The proposed neural network relates to the broad category of radial basis function neural networks (RBFNNs) in the sense that its topology involves a collection of receptive fields. In contrast to the standard architectures encountered in RBFNNs, here we form individual receptive fields in subspaces of the original input space rather than in the entire input space. These subspaces could be different for different receptive fields. The architecture of the network is fully reflective of the structure encountered in the training data, which are granulated with the aid of clustering techniques. More specifically, the output space is granulated with the use of K-means clustering, while the information granules in the multidimensional input space are formed by using the so-called context-based fuzzy C-means, which takes into account the structure already formed in the output space. The innovative development facet of the network involves a dynamic reduction of the dimensionality of the input space: the information granules are formed in a subspace of the overall input space, obtained by selecting a suitable subset of input variables so that this subspace retains the structure of the entire space. As this search is of combinatorial character, we use the technique of genetic optimization [genetic algorithms (GAs), to be more specific] to determine the optimal input subspaces. A series of numeric studies exploiting synthetic data and data coming from the Machine Learning Repository, University of California at Irvine, provides detailed insight into the nature of the algorithm and its parameters as well as some comparative analysis.
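The central clustering step described above can be illustrated with a short sketch. This is not the authors' exact algorithm, only a minimal NumPy rendering of the standard context-based fuzzy C-means idea: each data point carries a context membership value f[k] (here assumed to come from granulating the output space, e.g. via K-means), and the fuzzy partition is constrained so that the cluster memberships of point k sum to f[k] instead of 1, tying the input-space information granules to the structure already formed in the output space. The function name and parameter defaults are illustrative.

```python
import numpy as np

def context_fcm(X, f, c, m=2.0, n_iter=50, seed=0):
    """Sketch of context-based fuzzy C-means.

    X : (N, d) input data
    f : (N,) context memberships in [0, 1], determined in the
        output space (e.g. from a K-means granulation of the outputs)
    c : number of clusters (receptive fields)

    Returns the partition matrix U (c, N) and prototypes V (c, d).
    Each column of U sums to f[k], not to 1 as in plain FCM.
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    # Random initial partition, rescaled to satisfy the context constraint.
    U = rng.random((c, N))
    U = U / U.sum(axis=0) * f
    for _ in range(n_iter):
        # Prototype update: fuzzily weighted means, as in standard FCM.
        W = U ** m
        V = (W @ X) / W.sum(axis=1, keepdims=True)
        # Membership update under the constraint sum_i u_ik = f_k.
        D = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12
        inv = D ** (-2.0 / (m - 1.0))
        U = f * inv / inv.sum(axis=0)
    return U, V
```

When f[k] = 1 for every point, the update reduces to ordinary fuzzy C-means; in the paper's setting, each context (an output-space granule) induces its own such clustering of the input data associated with it.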

Published in:

IEEE Transactions on Neural Networks (Volume: 20, Issue: 10)