Abstract:
Dendrite morphological neurons are a class of artificial neural networks that use min and max operators instead of algebraic products. These morphological operators allow each dendrite to build a hyper-box in the N-dimensional classification space. In contrast with classical perceptrons, these simple geometrical representations, hyper-boxes, make it possible to propose heuristic training methods that do not rely on an optimisation method. In the literature, it has been claimed that this heuristic training has advantages: there are no convergence problems, perfect classification can always be reached, and training is performed in only one epoch. This paper shows that these assumed advantages come with a cost: because the heuristics are not optimal, they increase classification errors on the test set and generalisation is poor. To solve these problems, we introduce a novel method to train dendrite morphological neurons for classification tasks based on stochastic gradient descent, using the heuristics only to initialise the learning parameters. We add a softmax layer to the neural architecture to compute gradients, and we also propose and evaluate four different methods to initialise the dendrite parameters. Experiments are performed on several real and synthetic datasets. Results show that we can improve test accuracy compared with purely heuristic training methods. This approach reaches competitive performance with respect to other popular machine learning algorithms. Our code, developed in Matlab, is available online.
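The core idea described above, a dendrite whose min/max response is positive exactly when the input falls inside its hyper-box, can be sketched as follows. This is a minimal illustration, not the paper's Matlab implementation; the function names and the two-box example data are hypothetical.

```python
def dendrite_activation(x, w_low, w_high):
    # Morphological dendrite: min/max operations replace weighted sums.
    # The activation is >= 0 if and only if x lies inside the
    # hyper-box [w_low, w_high] in every dimension.
    return min(min(xi - lo, hi - xi)
               for xi, lo, hi in zip(x, w_low, w_high))

def classify(x, dendrites):
    # dendrites: list of (class_label, w_low, w_high) tuples.
    # Winner-take-all: the dendrite with the maximal activation
    # (the box whose boundary is farthest from x) labels the input.
    label, _ = max(((lbl, dendrite_activation(x, lo, hi))
                    for lbl, lo, hi in dendrites),
                   key=lambda t: t[1])
    return label

# Hypothetical 2-D example: one hyper-box (dendrite) per class.
dendrites = [("A", (0.0, 0.0), (1.0, 1.0)),
             ("B", (2.0, 2.0), (3.0, 3.0))]
print(classify((0.5, 0.5), dendrites))  # inside class A's box -> "A"
```

The heuristic initialisations evaluated in the paper amount to choosing the corners `w_low` and `w_high` from the training data; the proposed method then refines those corners with stochastic gradient descent through a softmax layer instead of leaving them fixed.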
Date of Conference: 06-09 December 2016
Date Added to IEEE Xplore: 13 February 2017