The Hierarchical Fast Learning Artificial Neural Network (HieFLANN)—An Autonomous Platform for Hierarchical Neural Network Construction

Authors: A. L. P. Tay (Nanyang Technological University, Singapore); J. M. Zurada; Lai-Ping Wong; Jian Xu

The hierarchical fast learning artificial neural network (HieFLANN) is a clustering NN that can be initialized using statistical properties of the data set. This makes it possible to construct the entire network autonomously, with no manual intervention, distinguishing it from many existing networks that, though hierarchically plausible, still require manual initialization. The system of hierarchical networks begins by reducing the high-dimensional feature space into smaller, manageable subspaces. This process uses the K-iterations fast learning artificial neural network (KFLANN) to systematically cluster a square matrix of Mahalanobis distances (MDs) between data set features into homogeneous feature subspaces (HFSs). The KFLANN is chosen for its heuristic network initialization capabilities on a given data set and requires no supervision. Through the recurring use of the KFLANN and a second stage involving canonical correlation analysis (CCA), the HieFLANN is developed. Experimental results on several standard benchmark data sets indicate that autonomous determination of the HFSs provides a viable avenue for feasible partitioning of feature subspaces. When coupled with the network transformation process, the HieFLANN yields accuracies comparable with available methods. This provides a new platform by which data sets with high-dimensional feature spaces can be systematically resolved and trained autonomously, alleviating the effects of the curse of dimensionality.
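The abstract's first stage, clustering a matrix of inter-feature Mahalanobis distances into homogeneous feature subspaces, can be sketched roughly as below. This is an illustrative stand-in, not the authors' KFLANN: the function names, the pseudo-inverse scaling of the covariance, and the fixed threshold `tau` are all assumptions for the sketch, whereas the actual KFLANN derives its clustering parameters heuristically from the data.

```python
import numpy as np

def mahalanobis_feature_distances(X):
    # Treat each feature (column of the n_samples x n_features matrix X)
    # as a point in sample space and compute pairwise Mahalanobis
    # distances between features. A pseudo-inverse guards against a
    # singular covariance estimate.
    F = X.T                                           # rows = features
    S_inv = np.linalg.pinv(np.cov(F, rowvar=False))   # sample-space covariance
    k = F.shape[0]
    D = np.zeros((k, k))
    for i in range(k):
        for j in range(i + 1, k):
            d = F[i] - F[j]
            # clip tiny negative values caused by floating-point error
            D[i, j] = D[j, i] = np.sqrt(max(d @ S_inv @ d, 0.0))
    return D

def group_features(D, tau):
    # Crude stand-in for the KFLANN clustering step: features whose
    # mutual distance falls below tau land in the same homogeneous
    # feature subspace (connected components of the thresholded graph).
    k = D.shape[0]
    labels = [-1] * k
    current = 0
    for start in range(k):
        if labels[start] != -1:
            continue
        stack = [start]
        labels[start] = current
        while stack:
            i = stack.pop()
            for j in range(k):
                if labels[j] == -1 and D[i, j] < tau:
                    labels[j] = current
                    stack.append(j)
        current += 1
    return labels
```

A small usage example: build a data set where two columns are near-duplicates of a common signal, compute the distance matrix, and group features with a hand-picked threshold.

```python
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 1))
X = np.hstack([base + 0.01 * rng.normal(size=(100, 1)),
               base + 0.01 * rng.normal(size=(100, 1)),
               rng.normal(size=(100, 2))])
D = mahalanobis_feature_distances(X)
subspaces = group_features(D, tau=1.0)   # tau is an arbitrary choice here
```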

Published in: IEEE Transactions on Neural Networks (Volume: 18, Issue: 6)