Channel linearity mismatch effects in time-interleaved ADC systems

3 Author(s)
Kurosawa, N.; Kobayashi, H.; Kobayashi, K. (Dept. of Electron. Eng., Gunma Univ., Japan)

A time-interleaved ADC system is an effective way to implement a high-sampling-rate ADC with relatively slow circuits. In such a system, several channel ADCs operate at interleaved sampling times, so that together they behave as a single ADC with a much higher sampling rate. However, mismatches among the channel ADCs degrade the S/N of the system as a whole; the effects of offset, gain and bandwidth mismatches, as well as timing skew of the clocks distributed to the channels, have been well investigated. This paper investigates channel linearity mismatch effects in the time-interleaved ADC system, which are of great practical importance but have not yet been studied. We consider two cases: differential nonlinearity mismatch and integral nonlinearity mismatch. Our numerical simulations reveal their distinct features, especially in the frequency domain. The derived results can be useful for calibration algorithms that compensate for channel mismatch effects.
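
The mechanism described above can be illustrated with a short numerical sketch (not the authors' simulation): a minimal Python example in which each of M = 4 hypothetical channel ADCs applies a slightly different third-order static nonlinearity, an integral-nonlinearity-style mismatch. Because the distortion is channel-dependent, it is effectively modulated at fs/M, so spurious tones appear at offsets of k*fs/M around the input tone and its harmonics. All parameter values here are arbitrary assumptions chosen for illustration.

import numpy as np

M = 4                     # number of interleaved channel ADCs (assumed)
N = 4096                  # total number of samples
fin = 101 / N             # coherent input frequency (bin 101, normalized to fs = 1)

n = np.arange(N)
x = 0.9 * np.sin(2 * np.pi * fin * n)          # near-full-scale sine input

# Per-channel third-order INL curves: y = x + a3*x^3, with a3 differing
# slightly from channel to channel (the linearity mismatch under study).
a3 = np.array([0.002, 0.004, 0.001, 0.003])    # hypothetical mismatch values

y = np.empty_like(x)
for ch in range(M):
    idx = np.arange(ch, N, M)                  # samples taken by channel `ch`
    y[idx] = x[idx] + a3[ch] * x[idx] ** 3

# Windowed spectrum, normalized so the signal peak sits at 0 dB.
spec = 20 * np.log10(np.abs(np.fft.rfft(y * np.hanning(N))) + 1e-12)
spec -= spec.max()

# Largest spur outside the signal's main lobe: these components come from the
# channel-dependent distortion and land around multiples of fs/M.
bins = np.arange(len(spec))
mask = np.abs(bins - 101) > 5
print("Largest mismatch/distortion spur: %.1f dBc" % spec[mask].max())

Plotting `spec` would show the distinct frequency-domain signature that motivates the paper: a single shared nonlinearity produces only ordinary harmonics, while mismatched per-channel nonlinearities additionally produce tones offset by k*fs/M.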

Published in:

ISCAS 2001. The 2001 IEEE International Symposium on Circuits and Systems (Volume 1)

Date of Conference:

6-9 May 2001