Representations of Continuous Attractors of Recurrent Neural Networks

3 Author(s)
Jiali Yu (School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu); Zhang Yi; Lei Zhang

A continuous attractor of a recurrent neural network (RNN) is a set of connected stable equilibrium points. Continuous attractors have been used to describe the encoding of continuous stimuli in neural networks, and the dynamics of RNNs possessing continuous attractors exhibit interesting properties. This brief aims to derive explicit representations of continuous attractors of RNNs. Under certain conditions, representations of the continuous attractors of linear RNNs as well as linear-threshold (LT) RNNs are obtained. These representations can be regarded as solutions for the continuous attractors of the networks, and they provide clear and complete descriptions of the continuous attractors.
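The abstract's central notion — a connected set of stable equilibrium points that nearby states converge onto — can be illustrated with a minimal sketch, not taken from the paper itself: a two-dimensional discrete-time linear RNN x(t+1) = Wx(t) whose weight matrix has eigenvalue 1 along a direction v and a contracting eigenvalue orthogonal to it, so that the line span{v} is a continuous (line) attractor. The specific matrix and initial state below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): a line attractor in a
# discrete-time linear RNN x(t+1) = W x(t).
# W has eigenvalue 1 along the unit vector v and eigenvalue 0.5 orthogonal
# to it, so every point on span{v} is an equilibrium, and together they form
# a connected set of stable equilibria -- a continuous attractor.
v = np.array([1.0, 1.0]) / np.sqrt(2.0)   # unit eigenvector with eigenvalue 1
P = np.outer(v, v)                        # orthogonal projector onto span{v}
W = P + 0.5 * (np.eye(2) - P)             # eigenvalues: 1 (along v), 0.5 (orthogonal)

x0 = np.array([2.0, -1.0])                # arbitrary initial state
x = x0.copy()
for _ in range(50):                       # iterate the network dynamics
    x = W @ x

# The component along v is preserved while the orthogonal component decays,
# so the state converges to its projection onto the attractor line.
print(np.allclose(x, P @ x0))             # True
print(np.allclose(W @ (P @ x0), P @ x0))  # True: points on the line are fixed
```

In this toy case the explicit representation of the continuous attractor is simply span{v}, which mirrors the paper's goal of writing the attractor set down in closed form for linear and LT networks.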

Published in:

IEEE Transactions on Neural Networks (Volume: 20, Issue: 2)