Traditionally, the minimum possible area of a very large scale integration (VLSI) layout is considered best for delay and power minimization because it reduces interconnect capacitance. This paper, however, shows that in nanometer-scale technologies minimum area does not yield minimum power or delay, owing to thermal effects, and in some cases may even cause thermal runaway. A methodology that uses area as a design parameter to reduce leakage power and prevent thermal runaway is presented. A 16-bit adder example in 70-nm technology shows total power savings of 17% with a 15% increase in area and no increase in delay. The power savings from this technique are expected to grow in future technologies.
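The runaway mechanism behind the abstract can be illustrated with a simple electrothermal feedback loop: leakage power rises roughly exponentially with temperature (a common rule of thumb is a doubling every ~10 °C), while die temperature rises with total power through a thermal resistance that scales inversely with layout area. The sketch below is not the paper's model; all constants (ambient temperature, dynamic power, reference leakage, thermal-resistance scaling) are hypothetical, chosen only to show qualitatively how a modest area increase can turn a diverging loop into a converging one.

```python
# Illustrative sketch (not the paper's methodology): fixed-point iteration
# of the leakage-temperature feedback loop  T = T_amb + R_th * (P_dyn + P_leak(T)).
# All numeric constants are assumptions for demonstration only.

T_AMB = 45.0          # ambient temperature, deg C (assumed)
P_DYN = 1.0           # dynamic power, W (assumed temperature-independent)
P_LEAK_REF = 0.1      # leakage power at T_AMB, W (assumed)
LEAK_DOUBLING = 10.0  # leakage roughly doubles every 10 deg C (rule of thumb)

def steady_state_temp(area_mm2, r_th_unit=18.0, max_iter=200):
    """Iterate the electrothermal loop to a steady-state temperature.

    R_th is assumed to scale inversely with area (a larger layout spreads
    heat over more silicon). Returns the converged temperature in deg C,
    or None if the loop diverges (thermal runaway).
    """
    r_th = r_th_unit / area_mm2  # deg C per W (assumed scaling)
    t = T_AMB
    for _ in range(max_iter):
        # leakage grows exponentially with temperature
        p_leak = P_LEAK_REF * 2.0 ** ((t - T_AMB) / LEAK_DOUBLING)
        t_new = T_AMB + r_th * (P_DYN + p_leak)
        if t_new > 300.0:            # diverging: treat as runaway
            return None
        if abs(t_new - t) < 1e-6:    # converged
            return t_new
        t = t_new
    return None

# A minimum-area layout diverges, while a 15% larger layout settles
# at a moderate temperature (with these illustrative constants).
for area in (1.00, 1.15):
    t = steady_state_temp(area)
    print(f"area {area:.2f}: " + ("thermal runaway" if t is None else f"{t:.1f} C"))
```

With the smaller area, the thermal resistance is high enough that each pass through the loop raises leakage faster than the heat can be removed, so the iteration diverges; the 15% larger layout lowers the loop gain below unity and the temperature converges, mirroring the area/power trade-off the abstract reports.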