The increase of statistical variations in advanced nanometer CMOS technologies poses a major challenge for digital circuit design. In this paper, we study the impact of random variations on the delay variability of a gate and derive simple, scalable statistical models to efficiently evaluate delay variations in the presence of within-die variations. The derived models are verified against Monte Carlo SPICE simulations in an industrial 90-nm technology. This paper provides new design insight and highlights the importance of accounting for the effect of input slew on delay variations, particularly at lower supply voltages. We also show that, for a given supply voltage, there is an optimum input slew that minimizes the relative delay variation of the gate, and we present the conditions to achieve this minimum. The derived analytical models account for the impact of supply voltage and output loading and can be used early in the design cycle. These results are particularly important for variation-tolerant design in nanometer technologies, especially for low-power and low-voltage operation.
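The Monte Carlo style of evaluation the abstract refers to can be sketched as follows. This is a minimal illustration, not the paper's model: it assumes a first-order alpha-power-law gate delay with a linear input-slew term (Sakurai-Newton style), and all parameters (the drive coefficient `k`, velocity-saturation index `alpha`, and the Gaussian threshold-voltage distribution) are hypothetical placeholders. The sketch estimates the relative delay variation sigma/mu of a gate as a function of input slew, which is the quantity the paper's analytical models predict without requiring SPICE-level Monte Carlo runs.

```python
import random
import statistics

def gate_delay(vth, vdd=1.0, tin=50e-12, cl=10e-15, alpha=1.3, k=2e-4):
    """Illustrative gate delay: alpha-power-law step response plus a
    linear input-slew correction. All parameters are hypothetical."""
    i_on = k * (vdd - vth) ** alpha            # saturation drive current
    t_step = cl * vdd / (2.0 * i_on)           # step-input (zero-slew) delay
    t_slew = tin * (0.5 - (1.0 - vth / vdd) / (1.0 + alpha))  # slew term
    return t_step + t_slew

def relative_delay_variation(tin, n=2000, seed=0):
    """Monte Carlo estimate of sigma/mu of the gate delay under random
    within-die Vth variation (assumed Gaussian, mean 0.30 V, sigma 30 mV)."""
    rng = random.Random(seed)
    delays = [gate_delay(rng.gauss(0.30, 0.03), tin=tin) for _ in range(n)]
    return statistics.stdev(delays) / statistics.mean(delays)

if __name__ == "__main__":
    # Sweep input slew and report the normalized delay spread at each point.
    for tin_ps in (10, 50, 100, 200):
        rv = relative_delay_variation(tin_ps * 1e-12)
        print(f"tin = {tin_ps:4d} ps  ->  sigma/mu = {rv:.4f}")
```

With a linear slew term as above, sigma/mu varies monotonically with input slew; reproducing the optimum-slew minimum reported in the paper requires the fuller slew dependence captured by the derived analytical models.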