Power supply integrity verification has become a key concern in high-performance designs. In deep-submicron technologies, power supply noise can significantly increase circuit delay and lead to performance failures. Traditional static timing analysis, which applies worst-case voltage margins to compute circuit delay, yields a very conservative analysis because the worst-case drop is localized to a small area of the die. In this paper, we propose a new approach for analyzing the impact of power supply variations on circuit delay. The circuit delay maximization problem is formulated as a constrained non-linear optimization problem that takes both IR and L di/dt drops into account. The proposed approach does not require a priori knowledge of critical paths in the circuit and can be effectively incorporated into an existing static timing analysis framework. The method has been implemented and tested on the ISCAS85 benchmark circuits and compared with traditional methods for computing worst-case circuit delay under supply variations.
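To illustrate why supply drops matter for timing, the sketch below uses the standard alpha-power-law delay model to show how a combined IR and L di/dt drop increases gate delay. This is not the paper's optimization formulation; the parameter values (nominal supply, threshold voltage, velocity-saturation index) are hypothetical and chosen only for illustration.

```python
def gate_delay(vdd, vth=0.4, alpha=1.3, k=1.0):
    """Gate delay per the alpha-power law: d = k * Vdd / (Vdd - Vth)^alpha.

    vth and alpha are hypothetical technology parameters; smaller
    (Vdd - Vth) headroom gives a larger delay.
    """
    return k * vdd / (vdd - vth) ** alpha

vdd_nom = 1.2   # hypothetical nominal supply voltage (V)
drop = 0.12     # hypothetical 10% combined IR + L di/dt drop (V)

d_nom = gate_delay(vdd_nom)
d_drop = gate_delay(vdd_nom - drop)
print(f"delay increase under drop: {100 * (d_drop / d_nom - 1):.1f}%")
```

A uniform worst-case margin would apply this pessimistic slowdown to every gate, whereas in reality the worst drop is confined to a small region of the die, which is the conservatism the proposed analysis targets.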