Analog VLSI circuits as physical structures for perception in early visual tasks

L. Raffo, S. P. Sabatini, G. M. Bo, G. M. Bisio
Dept. of Electr. & Electron. Eng., Cagliari Univ., Italy

A variety of computational tasks in early vision can be formulated through lattice networks. The cooperative action of these networks depends on the topology of their interconnections, both feedforward and recurrent. This paper shows that a single general architectural solution can cover all recurrent computations of any given order. The Gabor-like impulse response of a second-order network is analyzed in detail, pointing out how near-optimal filtering behavior in the space and frequency domains can be achieved through excitatory/inhibitory interactions without impairing the stability of the system. These architectures can be mapped very efficiently, at the transistor level, onto VLSI structures operating as analog perceptual engines. The problem of hardware implementation of early vision tasks can, indeed, be tackled by combining these perceptual agents through suitable weighted sums. A 17-node analog current-mode VLSI circuit has been implemented in a 2 μm n-well, single-poly, double-metal CMOS technology to demonstrate the feasibility of the approach. Applications of the perceptual engine to various machine vision algorithms are proposed.
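The core idea of the abstract — a second-order recurrent lattice whose steady-state impulse response is Gabor-like, provided the excitatory/inhibitory weights keep the system stable — can be sketched numerically. The snippet below is an illustrative model only: the weight values `a` and `b` are assumptions for the sake of the example, not the paper's actual circuit parameters.

```python
import numpy as np

# Sketch: a 1-D second-order recurrent lattice. Each node receives
# excitatory input from its first neighbors and inhibitory input from
# its second neighbors; the steady state solves v = u + W v.
N = 17                      # nodes, matching the 17-node prototype
a, b = 0.3, -0.1            # illustrative excitatory (1st-neighbor) and
                            # inhibitory (2nd-neighbor) recurrent weights

W = np.zeros((N, N))
for i in range(N):
    for d, w in ((1, a), (2, b)):
        if i - d >= 0:
            W[i, i - d] = w
        if i + d < N:
            W[i, i + d] = w

# Stability requirement: the steady state (I - W)^-1 u exists only if
# the spectral radius of the recurrent weight matrix stays below 1.
assert np.max(np.abs(np.linalg.eigvals(W))) < 1.0

u = np.zeros(N)
u[N // 2] = 1.0                        # impulse at the central node
h = np.linalg.solve(np.eye(N) - W, u)  # steady-state impulse response
```

The second-neighbor inhibition is what turns the low-pass response of a purely excitatory lattice into a band-pass one: the network's transfer function peaks at a nonzero spatial frequency, giving the oscillation-under-an-envelope shape characteristic of a Gabor filter, while the spectral-radius condition bounds how strong the recurrent interactions may be.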

Published in:

IEEE Transactions on Neural Networks (Volume: 9, Issue: 6)