The hierarchical fast learning artificial neural network (HieFLANN) is a clustering NN that can be initialized using the statistical properties of the data set. This makes it possible to construct the entire network autonomously, with no manual intervention, distinguishing it from many existing networks that, though hierarchically plausible, still require manual initialization. The hierarchical network begins by reducing the high-dimensional feature space into smaller, more manageable subspaces. This involves using the K-iterations fast learning artificial neural network (KFLANN) to systematically cluster a square matrix of the Mahalanobis distances (MDs) between data set features into homogeneous feature subspaces (HFSs). The KFLANN is chosen for its heuristic, unsupervised network initialization on a given data set. Through the recurring use of the KFLANN and a second stage involving canonical correlation analysis (CCA), the HieFLANN is developed. Experimental results on several standard benchmark data sets indicate that autonomous determination of the HFSs provides a viable avenue for partitioning feature subspaces. When coupled with the network transformation process, the HieFLANN yields accuracies comparable to those of existing methods. This provides a new platform by which data sets with high-dimensional feature spaces can be systematically resolved and trained autonomously, alleviating the effects of the curse of dimensionality.