Global interconnect delay variations may cause clock skew, unpredictable signal line delays, and degraded system performance. Conventional variation mitigation techniques incur large delay and power overheads, and the problem worsens as variability increases in sub-65 nm technologies. This paper presents a methodology for including robustness optimization in power-delay optimal buffer insertion. Closed-form expressions are derived for the delay variation model used in the optimization, and the model's accuracy is verified against simulation results. Using the power, delay, and delay variation models, a design space is constructed for the interconnect. Through power-robustness trade-off analysis of this design space, the optimal buffering solution for the interconnect is computed. Comparison with simulation results verifies the accuracy of the optimal solution obtained with this method. The application of the methodology to enhancing the robustness of clock networks during the buffer-insertion phase is demonstrated, and simulation results are presented.
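The design-space sweep described in the abstract can be illustrated with a toy sketch. Everything below is a hypothetical stand-in, not the paper's actual models: first-order Elmore-style stage delays, switching power approximated by total buffer capacitance, per-stage delay sigmas assumed independent (adding in quadrature), and made-up process numbers and delay/robustness caps. It only shows the shape of the method: enumerate buffer counts and sizes, evaluate delay, power, and delay variation for each point, and pick the lowest-power point that meets both constraints.

```python
import itertools

# Hypothetical parameters for a generic sub-65 nm process (illustrative only).
R_W, C_W = 100.0, 0.2     # wire resistance (ohm/mm) and capacitance (pF/mm)
R_B, C_B = 1000.0, 0.01   # unit-buffer output resistance (ohm), input cap (pF)
LEN = 10.0                # interconnect length (mm)
SIGMA_REL = 0.1           # assumed relative sigma of each stage's delay

def stage_metrics(n, s):
    """Delay (ps), power proxy, and delay sigma for n buffers of size s."""
    seg = LEN / n
    # Per-stage Elmore delay: driver charging wire + next buffer, plus wire RC.
    d_stage = (R_B / s) * (seg * C_W + s * C_B) \
            + seg * R_W * (seg * C_W / 2 + s * C_B)
    delay = n * d_stage
    power = n * s * C_B                   # proxy: total inserted buffer capacitance
    sigma = SIGMA_REL * d_stage * n ** 0.5  # independent stages add in quadrature
    return delay, power, sigma

# Sweep the design space; keep the lowest-power solution meeting both caps.
best = None
for n, s in itertools.product(range(1, 21), range(1, 41)):
    delay, power, sigma = stage_metrics(n, s)
    if delay <= 600.0 and sigma <= 20.0:  # hypothetical delay/robustness caps (ps)
        if best is None or power < best[1]:
            best = ((n, s), power, delay, sigma)

print(best)  # ((n, s), power, delay, sigma) of the selected buffering solution
```

Tightening the sigma cap while holding the delay cap fixed traces out the power-robustness trade-off: more, larger buffers shorten each stage and reduce the accumulated variation, at the cost of extra buffer power.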