In older CMOS technologies, above 90 nm, operating a circuit at high temperature implied an increase in total delay, because both interconnects and gates slow down as temperature rises. For transistors with feature sizes of 90 nm and below, this picture has started to change. In particular, the ratio of threshold voltage to supply voltage of high-Vt cells in a library is now very close to 1. A consequence is the appearance of the so-called Inverted Temperature Dependence (ITD) of the propagation delay of such cells: while the delay of low-Vt gates does increase with temperature, high-Vt gates show the opposite behavior and get faster as they get warmer. This new, more complicated dependence of delay on temperature poses new challenges to circuit designers and, in turn, to EDA tools. Besides making timing analysis more difficult, ITD has important and hard-to-predict consequences for power-aware logic synthesis. Expanding on our recent work, this paper describes the impact that ITD may have on the design of modern, nanometer VLSI circuits. We also provide a more refined algorithm for dual-Vt synthesis which guarantees temperature-insensitive operation of the circuits, together with a significant reduction of both leakage and total power consumption. Experiments performed on a set of standard benchmarks show timing compliance at any temperature and an average leakage reduction of about 22% with respect to circuits synthesized with a standard commercial flow that does not take ITD into account and thus, to ensure that no temperature-induced timing faults occur, must resort to overdesign (i.e., overconstraining the timing bound so that temperature fluctuations never cause any path to violate its specified required time).
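The ITD effect described above can be illustrated with the alpha-power-law delay model commonly used in the literature: gate delay scales as Vdd / (mu(T) * (Vdd - Vth(T))^alpha), where carrier mobility mu drops with temperature (slowing the gate) while the threshold voltage Vth also drops with temperature (speeding the gate up). Which effect wins depends on the Vth/Vdd ratio. The sketch below is illustrative only; all constants (vth0, alpha, the temperature coefficients) are assumed textbook-style values, not parameters from the paper.

```python
# Illustrative alpha-power-law model of Inverted Temperature Dependence.
# All numeric constants are assumptions for illustration, not data
# from the paper or from any real cell library.

def gate_delay(T, vdd=1.0, vth0=0.3, alpha=1.3,
               k_vth=1.5e-3, m_mu=1.5, T0=300.0):
    """Relative gate delay at absolute temperature T (kelvin).

    delay ~ vdd / (mu(T) * (vdd - vth(T))**alpha), with
      mu(T)  = (T/T0)**(-m_mu)       mobility: drops as T rises
      vth(T) = vth0 - k_vth*(T - T0) threshold: drops as T rises
    """
    mu = (T / T0) ** (-m_mu)
    vth = vth0 - k_vth * (T - T0)
    return vdd / (mu * (vdd - vth) ** alpha)

# Low-Vt gate (vth0 well below vdd): the mobility loss dominates,
# so the gate slows down as it heats up.
low_cold = gate_delay(300.0, vth0=0.25)
low_hot = gate_delay(400.0, vth0=0.25)
print(low_hot > low_cold)    # True: low-Vt gate is slower when hot

# High-Vt gate (vth0 close to vdd): the growing overdrive (vdd - vth)
# dominates, so the gate speeds up with temperature -- the ITD effect.
high_cold = gate_delay(300.0, vth0=0.75)
high_hot = gate_delay(400.0, vth0=0.75)
print(high_hot < high_cold)  # True: high-Vt gate is faster when hot
```

The crossover between the two regimes is exactly why a Vth/Vdd ratio near 1 makes high-Vt cells exhibit inverted behavior.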
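To see why ITD complicates dual-Vt synthesis, consider the standard slack-driven assignment scheme: gates off the critical path are swapped to high-Vt to cut leakage, as long as timing still holds. Under ITD no single temperature corner is the worst case for every gate, so timing must be checked at multiple corners. The sketch below shows this generic multi-corner check; it is a simplified textbook-style loop, not the refined algorithm from the paper, and all gate names, delays, and the timing bound are made up.

```python
# Generic slack-driven dual-Vt assignment with multi-temperature timing
# checks. This is NOT the paper's algorithm -- it only illustrates why,
# with ITD, timing must be verified at more than one temperature corner.
# All gates lie on one hypothetical path; numbers are invented.

# (name, low-Vt delay @25C, @125C, high-Vt delay @25C, @125C).
# With ITD, high-Vt delays shrink at 125C while low-Vt delays grow.
GATES = [
    ("g1", 10.0, 11.0, 14.0, 13.0),
    ("g2", 10.0, 11.0, 14.0, 13.0),
    ("g3", 10.0, 11.0, 14.0, 13.0),
]
T_BOUND = 36.0  # required path delay at every corner

def path_delay(assign, corner):
    """Path delay for a Vt assignment at a corner (0 = 25C, 1 = 125C)."""
    low_i, high_i = {0: (1, 3), 1: (2, 4)}[corner]
    return sum(g[high_i] if assign[g[0]] else g[low_i] for g in GATES)

def greedy_dual_vt():
    """Greedily move gates to high-Vt (cutting leakage) while timing
    holds at BOTH corners, instead of at a single assumed-worst one."""
    assign = {g[0]: False for g in GATES}  # False = low-Vt
    for g in GATES:
        assign[g[0]] = True
        if any(path_delay(assign, c) > T_BOUND for c in (0, 1)):
            assign[g[0]] = False  # revert: some corner would fail
    return assign

# Only g1 can be moved to high-Vt without violating the 36.0 bound
# at one of the two corners.
print(greedy_dual_vt())
```

A flow that ignores ITD and checks only the hot corner would accept assignments that fail when cold, which is precisely the overdesign trap the abstract describes.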