In this paper, we propose a new pruning algorithm to determine the optimal number of hidden units in a single layer of a fully connected neural network (NN). The technique relies on a global sensitivity analysis of the model output. The relevance of each hidden node is determined by analysing the Fourier decomposition of the variance of the model output: each hidden unit is assigned a ratio (the fraction of the output variance it accounts for) that determines its rank. This quantitative information suggests which units are the most favorable to eliminate. Experimental results suggest that the method is an effective tool for controlling the complexity of NNs.
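The ranking step can be sketched as follows. This is a minimal Monte Carlo proxy, not the paper's Fourier-based decomposition: it estimates each unit's variance fraction directly from samples, ignores covariance between units, and uses hypothetical random weights purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-hidden-layer network (hypothetical weights, for illustration only)
n_in, n_hidden = 4, 8
W1 = rng.normal(size=(n_hidden, n_in))
b1 = rng.normal(size=n_hidden)
w2 = rng.normal(size=n_hidden)

def hidden(x):
    """Hidden activations a_j(x) for a batch of inputs x."""
    return np.tanh(x @ W1.T + b1)

def output(x):
    """Scalar network output y = sum_j w2_j * a_j(x)."""
    return hidden(x) @ w2

# Sample inputs and estimate each unit's share of the output variance.
X = rng.normal(size=(10_000, n_in))
contrib = hidden(X) * w2                 # per-unit contributions w2_j * a_j(x)
total_var = output(X).var()
ratios = contrib.var(axis=0) / total_var # fraction of variance per unit (proxy)

# Rank units by relevance; the lowest-ratio units are pruning candidates.
ranking = np.argsort(ratios)[::-1]
print("variance ratios:", np.round(ratios, 3))
print("prune candidates (least relevant first):", list(ranking[::-1]))
```

Because the covariance terms are dropped, the ratios here need not sum exactly to one; the paper's Fourier decomposition yields a principled variance split, but the ranking logic is the same.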