Combining Gradient and Evolutionary Approaches to the Artificial Neural Networks Training According to Principles of Support Vector Machines

Authors: Bundzel, M. and Sincak, P. (Technical University of Kosice, Kosice)

This paper proposes a gradient-based learning method for ANN training in pattern recognition tasks and a genetic approach to ANN pruning. The goal is to achieve a wide-margin classifier whose Vapnik-Chervonenkis (VC) dimension is reduced in order to increase generalization performance. Inspired by Support Vector Machines, the examples closest to the decision boundary contribute most to the training. The training penalty is rule-based and is calculated from the spatial distribution of the training examples relative to the separating hyperplane. The tendency of hidden neurons to saturate is suppressed. A genetic-algorithm-based method is proposed for reducing the size of a trained ANN. The proposed algorithms were tested on artificial and real-world data and compared to standard backpropagation and to a Support Vector Machine with a Gaussian RBF kernel.
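
The abstract describes two components: a margin-aware training penalty that emphasizes examples near the separating hyperplane, and a genetic algorithm for pruning the trained network. The sketch below illustrates only the first idea, as a minimal two-layer perceptron trained by gradient descent with per-example weights that decay with distance from the current decision boundary. The weighting function margin_weight, the network architecture, and all hyperparameters are illustrative assumptions, not the authors' exact rule-based penalty.

    # Minimal sketch (not the paper's exact algorithm): a two-layer perceptron
    # trained by gradient descent where each example's contribution is weighted
    # by its proximity to the current decision boundary, in the spirit of the
    # SVM-inspired penalty described in the abstract.
    import numpy as np

    rng = np.random.default_rng(0)

    def init_mlp(n_in, n_hidden):
        return {
            "W1": rng.normal(0, 0.5, (n_in, n_hidden)),
            "b1": np.zeros(n_hidden),
            "w2": rng.normal(0, 0.5, n_hidden),
            "b2": 0.0,
        }

    def forward(p, X):
        h = np.tanh(X @ p["W1"] + p["b1"])   # hidden activations
        z = h @ p["w2"] + p["b2"]            # output pre-activation; z = 0 is the decision boundary
        return h, z

    def margin_weight(z, tau=1.0):
        # Hypothetical stand-in for the rule-based penalty: examples with small
        # |z| (close to the boundary) get weights near 1, distant ones are
        # down-weighted.
        return np.exp(-np.abs(z) / tau)

    def train(X, y, n_hidden=8, lr=0.05, epochs=500):
        # y takes values in {-1, +1}
        p = init_mlp(X.shape[1], n_hidden)
        for _ in range(epochs):
            h, z = forward(p, X)
            w = margin_weight(z)
            out = np.tanh(z)
            # Weighted squared-error gradient with respect to z, per example.
            err = (out - y) * (1 - out ** 2) * w
            # Backpropagate through both layers.
            grad_w2 = h.T @ err / len(X)
            grad_b2 = err.mean()
            dh = np.outer(err, p["w2"]) * (1 - h ** 2)
            grad_W1 = X.T @ dh / len(X)
            grad_b1 = dh.mean(axis=0)
            p["w2"] -= lr * grad_w2
            p["b2"] -= lr * grad_b2
            p["W1"] -= lr * grad_W1
            p["b1"] -= lr * grad_b1
        return p

    if __name__ == "__main__":
        # Toy XOR-like problem, included only to exercise the code.
        X = rng.normal(size=(200, 2))
        y = np.where(X[:, 0] * X[:, 1] > 0, 1.0, -1.0)
        p = train(X, y)
        _, z = forward(p, X)
        print("training accuracy:", np.mean(np.sign(z) == y))

The genetic pruning stage mentioned in the abstract would then operate on the trained weights (e.g. evolving binary masks over connections or hidden units); it is omitted here because the abstract does not specify its encoding or fitness function.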

Published in:

2006 International Joint Conference on Neural Networks (IJCNN '06)

Date of Conference:

2006
