Analyzing the effect of crosstalk on delay is critical for high-performance circuits. The main bottleneck in crosstalk-induced delay analysis is the high computational cost of simulating the coupled interconnect together with the nonlinear drivers. We propose an efficient iterative algorithm that avoids time-consuming nonlinear driver simulations and performs node-specific crosstalk delay analysis. The algorithm has been tested on circuits in two deep-submicron technologies with varying driver sizes, interconnect parasitics, and signal transition times, and it predicts the worst-case delay to within 10% of the actual delay.
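To illustrate why aggressor switching matters for victim delay, the sketch below uses a classical first-order model (not the paper's algorithm): the coupling capacitance of a victim net is scaled by a Miller factor that depends on the aggressor's switching direction, and the delay is estimated with a single-stage Elmore (RC) approximation. All parameter values and function names are hypothetical, chosen only for illustration.

```python
# First-order crosstalk delay model: a victim net driven through an
# effective driver resistance sees its coupling capacitance amplified
# (or cancelled) depending on how the neighboring aggressor switches.

def victim_elmore_delay(r_drive, c_ground, c_couple, miller_factor):
    """50% Elmore delay (seconds) of a lumped victim net.

    miller_factor: ~0 when the aggressor switches with the victim,
                    1 when the aggressor is quiet,
                    2 when the aggressor switches the opposite way
                      (the classical worst case for delay).
    """
    c_eff = c_ground + miller_factor * c_couple
    return 0.69 * r_drive * c_eff  # 0.69*R*C for a single RC stage

# Example values (hypothetical): 1 kOhm driver, 50 fF ground cap,
# 30 fF coupling cap to the aggressor.
quiet = victim_elmore_delay(1e3, 50e-15, 30e-15, 1.0)
worst = victim_elmore_delay(1e3, 50e-15, 30e-15, 2.0)
```

Opposite-direction aggressor switching inflates the effective load and hence the delay, which is why worst-case crosstalk delay analysis must account for aggressor alignment rather than treating coupling capacitance as a fixed ground load.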