Information Theory and Neural Information Processing


Author: Don H. Johnson, Dept. of Electrical and Computer Engineering, Rice University, Houston

Neuroscientists want to quantify how well neurons, individually and collectively, process information and encode the result in their outputs. We demonstrate that while classic information theory demarcates optimal performance boundaries, it does not provide results that would be useful in analyzing an existing system about which little is known (such as the brain). In the classical vein, non-Poisson channels, which describe the communication medium for neural signals, are shown individually to have a capacity strictly smaller than the Poisson ideal. We describe recent capacity results for Poisson neural populations, showing that connections among neurons can increase capacity. We then present an alternative theory more amenable to data analysis and to situations wherein systems actively extract and represent information. Using this theory, we show that the ability of a neural population to jointly represent information depends on the nature of its input signal, not on the encoded information.
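For context on the "Poisson ideal" the abstract refers to: a standard result (due to Kabanov and Davis, not stated in this abstract) gives the capacity of a single Poisson channel whose intensity is peak-limited, $0 \le \lambda(t) \le A$, with no dark current. A sketch of that benchmark:

```latex
% Capacity of the peak-constrained Poisson channel (no dark current),
% in nats per unit time -- the "Poisson ideal" against which
% non-Poisson neural channels are compared:
C_{\mathrm{Poisson}} = \frac{A}{e}
```

The abstract's classical-theory claim is that realistic non-Poisson point-process channels fall strictly below this bound, while coupling neurons in a population can raise the achievable capacity.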

Published in:

IEEE Transactions on Information Theory (Volume: 56, Issue: 2)