Failure prediction of banks using threshold accepting trained kernel principal component neural network


2 Author(s)
P. Ravisankar ; Institute for Development and Research in Banking Technology, Castle Hills Road #1, Masab Tank, Hyderabad 500 057, AP, India ; V. Ravi

This paper presents a new neural network architecture, the kernel principal component neural network (KPCNN), trained by a threshold accepting based training algorithm with different kernels such as polynomial, sigmoid and Gaussian, and its application to bankruptcy prediction in banks. KPCNN is a nonlinear version of the PCNN proposed elsewhere. In this architecture, dimensionality reduction is handled by kernel principal component analysis: the kernel matrices are computed first, and then the PCNN is applied to those kernel matrices. Nonlinearity is introduced into the architecture through the choice of kernel, such as polynomial, sigmoid or Gaussian. The efficiency of KPCNN is tested on several datasets, including the Spanish banks, Turkish banks and UK banks datasets. Further, the t-statistic and f-statistic are used for feature selection, and the features so selected are fed as input to KPCNN for classification. It is observed that the features selected by the t-statistic and f-statistic are identical across all datasets. Ten-fold cross-validation is performed throughout the study. The performance of KPCNN on the above datasets is compared with earlier results, both with and without feature selection. From this study we conclude that KPCNN yields results comparable to all the techniques, both with and without feature selection, and that KPCNN is best suited to datasets with high nonlinearity.
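The preprocessing step the abstract describes (compute a kernel matrix, then reduce dimensionality via kernel principal components before feeding a classifier) can be sketched generically as below. This is a minimal illustration of Gaussian-kernel PCA in numpy, not the authors' implementation; the threshold accepting trained PCNN itself is not shown, and all function names, the bandwidth `gamma`, and the data shapes are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel_matrix(X, gamma=1.0):
    """Pairwise Gaussian (RBF) kernel: K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq_norms = np.sum(X ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.clip(sq_dists, 0.0, None))

def kernel_pca(K, n_components):
    """Project onto the leading kernel principal components.

    The kernel matrix is double-centered (centering in feature space),
    eigendecomposed, and the data are projected onto the top eigenvectors.
    """
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n  # double-centering
    eigvals, eigvecs = np.linalg.eigh(Kc)               # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_components]      # pick the largest
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return Kc @ alphas  # rows = reduced inputs that a classifier (e.g. PCNN) would consume

# Hypothetical example: 50 banks described by 8 financial ratios.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))
K = gaussian_kernel_matrix(X, gamma=0.5)
Z = kernel_pca(K, n_components=3)
print(Z.shape)  # (50, 3)
```

Swapping `gaussian_kernel_matrix` for a polynomial kernel `(X @ X.T + c) ** d` or a sigmoid kernel `np.tanh(a * X @ X.T + b)` changes only the first step, which is how the paper varies the nonlinearity.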

Published in:

2009 World Congress on Nature & Biologically Inspired Computing (NaBIC 2009)

Date of Conference:

9-11 Dec. 2009