Multilayer perceptron networks have been successful in many applications, yet many problems in their theory remain unsolved. Commonly, sigmoidal activation functions have been used, giving good results. The backpropagation algorithm, however, works with any other activation function on one condition: it has to be differentiable. We investigate several possible activation functions and compare the results they give on sample data sets.
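As a minimal sketch of the point above (hypothetical code, not the paper's implementation): backpropagation only needs each activation function together with its derivative, so swapping activations amounts to swapping a pair of functions. Here `ACTIVATIONS` and `backprop_step` are illustrative names for a single-layer case with squared error.

```python
import numpy as np

# Each entry pairs an activation f with its derivative f';
# backpropagation itself is unchanged when the pair is swapped.
ACTIVATIONS = {
    "sigmoid": (lambda x: 1.0 / (1.0 + np.exp(-x)),
                lambda x: (s := 1.0 / (1.0 + np.exp(-x))) * (1.0 - s)),
    "tanh":    (np.tanh,
                lambda x: 1.0 - np.tanh(x) ** 2),
}

def backprop_step(x, y, W, act="sigmoid", lr=0.1):
    """One gradient-descent step for a single-layer network y_hat = f(W x)."""
    f, df = ACTIVATIONS[act]
    z = W @ x
    y_hat = f(z)
    # Chain rule for squared error E = ||y_hat - y||^2 / 2:
    # dE/dW = ((y_hat - y) * f'(z)) x^T  -- this is where f' is required.
    grad = np.outer((y_hat - y) * df(z), x)
    return W - lr * grad
```

The same training loop runs with either activation; only the `(f, f')` pair changes.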