A fundamental problem in practical implementations of time-interleaved analog-to-digital converters (ADCs) is sample-time mismatch among the demultiplexing channels. This problem is expected to worsen with continued scaling of transistor sizes and the demand for faster ADCs. Based on the filter bank framework, an approach is presented for designing finite-length synthesis filters that interpolate the nonuniformly spaced samples, in the least-squares sense, to obtain uniform samples. Our results suggest that although digital interpolation can be very effective in improving ADC resolution, achieving arbitrarily high resolution may require very complex synthesis filters.
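The least-squares interpolation idea can be illustrated with a simplified, hypothetical two-channel model. If the input happens to be bandlimited well below each channel's own Nyquist rate, a per-channel fractional-delay FIR filter (here a windowed-sinc approximation of the full-band least-squares solution) can re-interpolate the skewed samples onto the uniform grid. This is only a sketch of the underlying interpolation step, not the paper's general filter-bank design; the skew value, tap count, and test signal below are all illustrative assumptions.

```python
import numpy as np

def ls_fractional_delay(frac, taps=41):
    """Windowed-sinc approximation of the (full-band) least-squares
    fractional-delay FIR filter: delay of `frac` samples plus the
    integer latency (taps - 1) / 2."""
    n = np.arange(taps)
    center = (taps - 1) / 2
    h = np.sinc(n - center - frac)   # ideal full-band least-squares solution
    h *= np.hamming(taps)            # window to reduce truncation error
    return h / h.sum()               # normalize to unity DC gain

# Two-channel time-interleaved ADC; channel 1 has a timing skew (assumed value).
delta = 0.2                          # skew, in overall sample periods
N = 400
n = np.arange(N)

def x(t):                            # test signal, bandlimited well below
    return np.sin(2 * np.pi * 0.03 * t) + 0.5 * np.sin(2 * np.pi * 0.07 * t)

y1 = x(2 * n + 1 + delta)            # channel-1 samples, taken at skewed instants
ideal = x(2 * n + 1)                 # samples a perfectly timed channel would take

# The channel period is 2 overall periods, so the skew is delta/2 channel samples.
h = ls_fractional_delay(delta / 2, taps=41)
y1_corrected = np.convolve(y1, h, mode='same')

trim = slice(30, N - 30)             # ignore filter edge effects
err_before = np.max(np.abs(y1 - ideal)[trim])
err_after = np.max(np.abs(y1_corrected - ideal)[trim])
print(err_before, err_after)
```

In the general case, where the input occupies the full band of the interleaved converter, a single fractional-delay filter per channel no longer suffices, which is precisely why the paper formulates reconstruction as a multichannel filter-bank synthesis problem.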