
Structural adaptation for sparsely connected MLP using Newton's method



Abstract:

In this work, we propose a paradigm for constructing a sparsely-connected multi-layer perceptron (MLP). Using the Orthogonal Least Squares (OLS) method for training, the proposed method prunes hidden units and output weights based on their usefulness, yielding a sparsely connected MLP. We formulate a second-order algorithm to obtain a closed-form expression for the hidden-unit learning factors, thereby minimizing the number of hand-tuned parameters. The usefulness of the proposed algorithm is further substantiated by its ability to differentiate two combined datasets. On widely available datasets, the proposed algorithm's 10-fold testing error is shown to be lower than that of several other algorithms. The subject matter of the present work comprises inducing sparsity into a fully connected neural network, pruning of hidden units, Newton's method for optimization, and orthogonal least squares.
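
The paper's exact formulation is not reproduced on this page; as a rough illustration of the two ingredients the abstract names (OLS-based ranking of hidden units for pruning, and a closed-form, Newton-type learning factor), a minimal Python sketch might look as follows. The function names, the error-reduction criterion, and the quadratic step size z = g'g / g'Hg are assumptions for illustration only, not the authors' algorithm.

    import numpy as np

    def ols_rank_hidden_units(H, t, eps=1e-12):
        # Greedy OLS forward selection: rank columns of H (hidden-unit
        # activations, shape N x Nh) by their error-reduction ratio with
        # respect to target t (shape N,), orthogonalising the remaining
        # columns against each selected one (Gram-Schmidt).
        N, Nh = H.shape
        Q = H.astype(float).copy()
        remaining = list(range(Nh))
        order, err_reduction = [], []
        tt = float(t @ t)
        for _ in range(Nh):
            ratios = [float(Q[:, j] @ t) ** 2 / ((float(Q[:, j] @ Q[:, j]) + eps) * tt)
                      for j in remaining]
            k = int(np.argmax(ratios))
            j_best = remaining.pop(k)
            order.append(j_best)
            err_reduction.append(ratios[k])
            q = Q[:, j_best]
            qq = float(q @ q) + eps
            for j in remaining:
                Q[:, j] -= (float(q @ Q[:, j]) / qq) * q
        return order, err_reduction

    def newton_learning_factor(g, Hm, eps=1e-8):
        # Closed-form (second-order) step size along the gradient direction:
        # minimiser of a quadratic model of the error, z = (g.g) / (g.H.g).
        return float(g @ g) / (float(g @ Hm @ g) + eps)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        H = rng.standard_normal((200, 10))      # activations of 10 hidden units
        t = H[:, 2] - 0.5 * H[:, 7] + 0.01 * rng.standard_normal(200)
        order, err = ols_rank_hidden_units(H, t)
        keep = order[:4]                        # prune all but the 4 most useful units
        print("ranked units:", order)
        print("kept units:", keep)

In this sketch, pruning amounts to keeping only the top-ranked hidden units (and the corresponding output weights) after the OLS ranking; the learning-factor helper shows the kind of closed-form, Newton-like step size the abstract alludes to, under the stated quadratic-model assumption.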
Date of Conference: 14-19 May 2017
Date Added to IEEE Xplore: 03 July 2017
Electronic ISSN: 2161-4407
Conference Location: Anchorage, AK, USA

