A time-interleaved ADC (TIADC) increases the overall sampling rate by combining multiple slower ADCs operating in parallel. However, the performance of a TIADC suffers from mismatches among the channels, such as timing, offset, and gain mismatches. This paper deals with the identification and compensation of timing mismatches in a TIADC using the least mean squares (LMS) algorithm. The method requires only a bandlimited and oversampled input signal. We present a detailed discussion and demonstrate the effectiveness of the proposed method by numerical simulations.
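The abstract does not spell out the adaptation details, but the general idea of LMS-based timing-skew estimation under a bandlimited, oversampled input can be illustrated with a minimal sketch. The following toy simulation (all parameter values and model choices are the author of this note's assumptions, not taken from the paper) uses a two-channel TIADC in which the odd-indexed channel samples late by `tau` sample periods; a first-order Taylor model corrects each skewed sample, and the LMS loop drives the residual against a neighbor-interpolated reference, which is accurate precisely because the input is oversampled:

```python
import numpy as np

def simulate_tiadc_lms(tau=0.1, f0=0.01, n=4000, mu=20.0):
    """Toy two-channel TIADC: even-indexed samples are ideal, odd-indexed
    samples are skewed by `tau` (in units of the sample period).  `tau` is
    estimated with an LMS loop; `f0` << 1 keeps the input oversampled."""
    t = np.arange(n)
    s = np.sin(2 * np.pi * f0 * t)
    # Channel 1 (odd indices) samples late by tau sample periods.
    s[1::2] = np.sin(2 * np.pi * f0 * (t[1::2] + tau))

    tau_hat = 0.0
    history = []
    for k in range(1, n - 1, 2):          # odd indices with both neighbors
        d = (s[k + 1] - s[k - 1]) / 2.0   # derivative estimate at index k
        r = (s[k + 1] + s[k - 1]) / 2.0   # interpolated reference at index k
        e = (s[k] - tau_hat * d) - r      # residual after skew correction
        tau_hat += mu * e * d             # LMS update on the skew estimate
        history.append(tau_hat)
    # Average the tail of the trajectory to smooth residual oscillation.
    return tau_hat, float(np.mean(history[-200:]))

tau_final, tau_avg = simulate_tiadc_lms()
print(tau_avg)   # close to the true skew of 0.1 sample periods
```

The error signal is approximately `(tau - tau_hat) * x'(t)` plus a small interpolation residual, so the stochastic-gradient update `mu * e * d` pushes `tau_hat` toward the true skew; the neighbor-average reference is only valid because the oversampled, bandlimited signal varies slowly between samples, mirroring the input-signal requirement stated in the abstract.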