
Second-order recurrent neural network for word sequence learning

Authors:

H. K. Kwan (Dept. of Electr. & Comput. Eng., Windsor Univ., Ont., Canada); J. Yan

This paper presents a genetic algorithm (GA)-based second-order recurrent neural network (GRNN). Feedback connections in the structure enable the network to remember cues from the recent past of a word sequence. The GA is used to improve the network design by evolving its weights and connections dynamically. Simulation results on learning 50 commands of up to three words and 24 phone numbers of 10 digits show that the GRNN achieves better error performance and recall accuracy than comparable backpropagation-based recurrent and feedforward networks. The effects of population size, crossover probability, and mutation rate on the performance of the GRNN are also presented.
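
The abstract gives no implementation details, so the following is only a minimal illustrative sketch, not the authors' method: a second-order recurrent network whose state update weights products of the previous state and the current input symbol, with the weight tensor evolved by a simple mutation-only genetic algorithm (the paper additionally evolves connections and uses crossover). All names, dimensions, and the toy task below are assumptions.

# Sketch: GA-evolved second-order recurrent network (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def step(W, state, x):
    """Second-order update: s_i(t+1) = sigmoid(sum_jk W[i,j,k] * s_j(t) * x_k(t))."""
    pre = np.einsum('ijk,j,k->i', W, state, x)
    return 1.0 / (1.0 + np.exp(-pre))

def run(W, seq, n_state):
    """Feed a sequence of one-hot input vectors; return the final state."""
    s = np.zeros(n_state); s[0] = 1.0            # fixed start state
    for x in seq:
        s = step(W, s, x)
    return s

def fitness(W, dataset, n_state):
    """Negative squared error between final states and target states (toy task)."""
    err = 0.0
    for seq, target in dataset:
        err += np.sum((run(W, seq, n_state) - target) ** 2)
    return -err

# Toy problem: map two 3-symbol "word sequences" to distinct target states.
n_state, n_sym = 4, 5
def onehot(i): v = np.zeros(n_sym); v[i] = 1.0; return v
dataset = [
    ([onehot(0), onehot(1), onehot(2)], np.array([1., 0., 0., 0.])),
    ([onehot(2), onehot(1), onehot(0)], np.array([0., 1., 0., 0.])),
]

# Simple GA: truncation selection plus Gaussian mutation of the weight tensor.
pop = [rng.normal(0, 1, (n_state, n_state, n_sym)) for _ in range(20)]
for gen in range(200):
    scored = sorted(pop, key=lambda W: fitness(W, dataset, n_state), reverse=True)
    parents = scored[:5]
    pop = parents + [p + rng.normal(0, 0.1, p.shape)
                     for p in parents for _ in range(3)]
best = max(pop, key=lambda W: fitness(W, dataset, n_state))
print("best fitness:", fitness(best, dataset, n_state))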

Published in:

Proceedings of the 2001 International Symposium on Intelligent Multimedia, Video and Speech Processing

Date of Conference:

2001