A log-linearized Gaussian mixture network and its application to EEG pattern classification

4 Author(s)
Tsuji, T. (Dept. of Ind. & Syst. Eng., Hiroshima Univ., Japan); Fukuda, O.; Ichinobe, H.; Kaneko, M.

This paper proposes a new probabilistic neural network (NN) that can estimate the a-posteriori probability for a pattern classification problem. The structure of the proposed network is based on a statistical model composed of a mixture of log-linearized Gaussian components; nevertheless, the forward calculation and the backward learning rule can be defined in the same manner as for the error-backpropagation NN. The proposed network is then applied to the electroencephalogram (EEG) pattern classification problem. In the experiments described, two types of photic stimulation, eye opening/closing and artificial light, are used to collect the data to be classified. It is shown that the EEG signals can be classified successfully and that the classification rates change depending on the amount of training data and the dimension of the feature vectors.
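
To make the idea concrete, the following is a minimal Python/NumPy sketch of a classifier built from a mixture of log-linearized Gaussian components and trained by gradient descent, in the spirit of the abstract. It is an illustrative assumption rather than the authors' exact formulation: the class/component layout, the quadratic input expansion, and all names (quadratic_expand, LogLinearGaussianMixtureNet, train_step) are hypothetical.

import numpy as np

# Hedged sketch (not the authors' exact network): each class has several
# "components" whose log of an unnormalized Gaussian density is written as a
# linear function of a quadratically expanded input vector. The whole model is
# then a softmax over linear units, so it can be trained by ordinary gradient
# descent (error backpropagation), as the abstract describes.

def quadratic_expand(x):
    # Expand x (shape (d,)) to [1, x_i, x_i * x_j for i <= j]; a Gaussian
    # log-density is linear in this expanded vector.
    d = x.shape[0]
    quad = np.outer(x, x)[np.triu_indices(d)]
    return np.concatenate(([1.0], x, quad))

class LogLinearGaussianMixtureNet:
    def __init__(self, dim, n_classes, n_components=2, seed=0):
        rng = np.random.default_rng(seed)
        n_expanded = 1 + dim + dim * (dim + 1) // 2
        # One weight vector per (class, component) pair.
        self.W = 0.01 * rng.standard_normal((n_classes, n_components, n_expanded))

    def forward(self, x):
        # Return posterior probabilities P(class | x), the expanded input, and
        # the per-component activations.
        phi = quadratic_expand(x)
        logits = self.W @ phi                   # shape (n_classes, n_components)
        comp = np.exp(logits - logits.max())    # subtract max for stability
        post = comp.sum(axis=1) / comp.sum()    # sum components within each class
        return post, phi, comp

    def train_step(self, x, label, lr=0.05):
        # One gradient step on the cross-entropy -log P(label | x).
        post, phi, comp = self.forward(x)
        grad = comp / comp.sum()                        # d(log Z)/d logits
        grad[label] -= comp[label] / comp[label].sum()  # d(log class term)/d logits
        self.W -= lr * grad[:, :, None] * phi[None, None, :]
        return -np.log(post[label] + 1e-12)

# Tiny usage example with synthetic 2-D feature vectors (not real EEG data).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([-1.0, -1.0], 0.5, (100, 2)),
               rng.normal([+1.0, +1.0], 0.5, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

net = LogLinearGaussianMixtureNet(dim=2, n_classes=2)
for _ in range(30):
    for i in rng.permutation(len(X)):
        net.train_step(X[i], y[i])

accuracy = np.mean([np.argmax(net.forward(x)[0]) == t for x, t in zip(X, y)])
print(f"training accuracy: {accuracy:.2f}")

Because the Gaussian log-density is linear in the expanded input, the model stays differentiable end to end, which is what allows a mixture model to be trained with backpropagation-style weight updates rather than a separate statistical fitting procedure.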

Published in:

IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews (Volume: 29, Issue: 1)