Time-interleaved analog-to-digital converters (ADCs) can be used to increase the sample rate of an ADC system. A drawback of time interleaving, however, is that mismatch errors between the constituent ADCs introduce distortion in the output signal. One way to reduce the impact of the mismatch errors is to add extra ADCs to the interleaved structure and randomly select one of them at each sampling instant. This removes the periodicity of the errors and converts the spurious distortion into a more noise-like distortion spread over the whole spectrum. In this paper, a probabilistic model of the randomly interleaved ADC system is presented, and the noise spectrum caused by gain errors is analyzed.
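The effect described above can be illustrated with a small simulation. The sketch below is not the paper's model; it simply contrasts periodic (round-robin) interleaving with randomized channel selection for a bank of ADCs whose only nonideality is an assumed gain mismatch (here 1 percent standard deviation, a value chosen for illustration). With periodic selection the gain error sequence is periodic, producing discrete spurs; with random selection the same error power is spread across the spectrum as noise.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 4096        # number of output samples
M = 4           # number of interleaved ADCs
fin = 373 / N   # coherent input frequency (cycles per sample)

# Gain of each ADC: nominally 1, with a small mismatch
# (1% standard deviation is an illustrative assumption)
gains = 1 + 0.01 * rng.standard_normal(M)

n = np.arange(N)
x = np.sin(2 * np.pi * fin * n)

# Periodic (round-robin) interleaving: ADC index cycles 0, 1, ..., M-1
y_periodic = gains[n % M] * x

# Randomized interleaving: an ADC is chosen uniformly at each sample instant
y_random = gains[rng.integers(0, M, N)] * x

def spectrum_db(y):
    """Windowed magnitude spectrum, normalized to the signal peak, in dB."""
    S = np.abs(np.fft.rfft(y * np.hanning(N)))
    return 20 * np.log10(S / S.max() + 1e-12)

def worst_spur(y, guard=5):
    """Largest spectral component outside the signal bin neighborhood."""
    s = spectrum_db(y)
    k = int(round(fin * N))
    s[max(0, k - guard):k + guard + 1] = -np.inf
    return s.max()

print(worst_spur(y_periodic))  # discrete spurs from the periodic gain errors
print(worst_spur(y_random))    # same error power spread as a noise floor
```

Because the periodic case concentrates the gain-error power into a few image bins while the random case spreads it over all N bins, the worst spur of the randomized system sits well below that of the periodic one, at the cost of a slightly raised noise floor.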