Digital calibration techniques are widely used to linearize pipelined analog-to-digital converters (ADCs). However, their power dissipation can be prohibitively high, particularly when high-order gain calibration is needed. This paper demonstrates the need for high-order gain calibration in pipelined ADCs designed with low-gain opamps in scaled digital CMOS. It then proposes a design methodology that optimizes the data precision (number of bits) within the digital calibration unit for high-order gain calibration, so that the power dissipation and chip area of the calibration unit are minimized without degrading the ADC linearity. A 90-nm field-programmable gate array synthesis of a second-order gain calibration unit shows that the proposed optimization methodology yields 53% and 30% reductions in digital power dissipation and chip area, respectively.
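To make the precision-optimization idea concrete, the sketch below is a minimal Python model of the trade-off the abstract describes: a second-order digital gain correction of the assumed form D_corr = a1·D + a2·D², with its coefficients and internal datapath quantized to a swept fractional word length. The stage-error model, the coefficient values, and the sweep range are illustrative assumptions, not the paper's actual design; the point is only that the narrowest word length whose output matches the full-precision correction can be found numerically.

```python
import numpy as np

def quantize(x, frac_bits):
    """Round x to a fixed-point value with frac_bits fractional bits."""
    scale = 2.0 ** frac_bits
    return np.round(x * scale) / scale

def second_order_correction(d_raw, a1, a2, frac_bits):
    """Assumed second-order gain correction D_corr = a1*D + a2*D^2,
    with coefficients and the squarer output quantized to frac_bits."""
    a1_q = quantize(a1, frac_bits)
    a2_q = quantize(a2, frac_bits)
    d2 = quantize(d_raw * d_raw, frac_bits)   # quantized squarer output
    return quantize(a1_q * d_raw + a2_q * d2, frac_bits)

def quant_snr_db(approx, reference):
    """SNR of the reduced-precision output vs the full-precision one."""
    err = approx - reference
    return 10 * np.log10(np.sum(reference**2) / np.sum(err**2))

# Hypothetical first-stage model: a low-gain opamp leaves both a linear
# gain error and a quadratic nonlinearity in the raw digital output.
x = np.sin(2 * np.pi * 97 / 8192 * np.arange(8192))   # full-scale test tone
d_raw = 0.98 * x + 0.012 * x**2                       # distorted output
a1 = 1.0 / 0.98                  # illustrative linear-gain coefficient
a2 = -0.012 / 0.98**3            # illustrative second-order coefficient

# Sweep the internal word length; the narrowest datapath whose output
# still tracks the full-precision correction sets the minimum-cost design.
ideal = a1 * d_raw + a2 * d_raw**2
for frac_bits in range(6, 17, 2):
    d_corr = second_order_correction(d_raw, a1, a2, frac_bits)
    print(f"{frac_bits:2d} fractional bits -> "
          f"quantization SNR {quant_snr_db(d_corr, ideal):6.1f} dB")
```

Under these assumptions, the printed SNR rises with word length until it clears the ADC's own quantization noise floor; bits beyond that point only add power and area, which is the surplus the proposed methodology removes.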