This paper describes an approach to analog-to-digital converter (ADC) linearity testing that tolerates environmental nonstationarity and uses low-precision test signals. The effects of stimulus errors on ADC test results are identified and removed by exploiting the functional relationship of the input signals. The effects of environmental nonstationarity are suppressed by interleaving the input signals in a center-symmetric pattern. The approach can be applied to testing very-high-performance ADCs, such as those with 16-bit or higher resolution and sampling rates above 1 MSPS, for which no well-established full-code testing solution exists. Simulation and experimental results show that a 16-bit ADC can be tested to one-least-significant-bit accuracy using input signals of only seven-bit linearity in an environment with more than 100 ppm/min of nonstationarity. The proposed method can help control the cost of ADC production tests, extend the coverage of current test solutions, and enable built-in self-test and test-based self-calibration.
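To illustrate the intuition behind center-symmetric interleaving (not the paper's actual algorithm), consider a minimal sketch: if the environment drifts approximately linearly in time, measuring each stimulus level twice at instants placed symmetrically about the midpoint of the test and averaging the pair converts the level-dependent drift error into a constant gain error, which does not affect linearity. The drift model, timing, and function names below are illustrative assumptions, not from the paper.

```python
import numpy as np

def measure(level, t, drift_ppm_per_min=100.0):
    # Illustrative measurement of a stimulus level corrupted by a
    # linear environmental drift (gain drifting at ~100 ppm/min,
    # matching the nonstationarity magnitude quoted in the abstract).
    return level * (1.0 + drift_ppm_per_min * 1e-6 * t)

levels = np.array([0.25, 0.5, 0.75, 1.0])  # hypothetical stimulus levels
T = 10.0                                   # assumed total test time, minutes
n = len(levels)

# Forward-pass times and their center-symmetric mirrors about T/2:
# the k-th level is measured at t_fwd[k] and again at T - t_fwd[k].
t_fwd = np.linspace(0.0, T / 2, n, endpoint=False)
t_rev = T - t_fwd

naive = measure(levels, t_fwd)  # single pass: drift error grows with time
paired = 0.5 * (measure(levels, t_fwd) + measure(levels, t_rev))
# Linear drift contributes d*t and d*(T - t) to the two measurements;
# their average sees the same constant offset d*T/2 at every level, so
# the level-to-level (linearity) error from the drift cancels.
```

In this toy model, `paired / levels` is identical for every level (a pure gain error), whereas `naive / levels` varies across levels, i.e., the single-pass drift would be misread as a linearity error.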