Driver Deadtime Control and Its Impact on System Stability of Synchronous Buck Voltage Regulator

4 Author(s)
Weihong Qiu (Intersil Corp., Milpitas); S. Mercer; Zhixiang Liang; G. Miller

Driver dead-time control is a popular scheme for preventing shoot-through in a synchronous buck voltage regulator. As switching frequencies continue to rise in modern converter designs, the dead-time interval has become long enough relative to the switching period to affect system performance. Beyond its effect on efficiency, driver dead-time also influences loop gain and system stability, especially under critical load conditions. This paper investigates the influence of driver dead-time on the synchronous buck converter in detail. With voltage-mode control, the system loop gain changes under different load conditions because of the dead-time effect. With sample-and-hold current-mode control, dead-time may cause sub-harmonic current ripple in the voltage regulator, whereas its impact on peak current-mode control can be ignored. Design equations are provided to avoid this issue, and analytical results are compared with experimental data.
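To illustrate why the dead-time fraction matters at high switching frequency, the sketch below uses the standard approximation that during each dead-time interval the inductor current conducts through the low-side MOSFET's body diode, whose forward drop is much larger than the channel's on-state drop. The specific numbers (1 MHz switching, 30 ns dead-time per edge, 0.7 V body-diode drop, 10 A load) are illustrative assumptions, not values from the paper.

```python
# Sketch: dead-time as a fraction of the switching period, and the
# approximate body-diode conduction loss it introduces in a
# synchronous buck converter. All parameter values are assumed.

f_sw = 1e6        # switching frequency [Hz] (assumed)
t_dead = 30e-9    # dead-time per edge [s] (assumed); two edges per period
v_f = 0.7         # body-diode forward drop [V] (assumed)
i_load = 10.0     # load (inductor) current [A] (assumed)

T_sw = 1.0 / f_sw                 # switching period: 1 us
dead_fraction = 2 * t_dead / T_sw  # both edges -> 60 ns of 1000 ns = 6%

# Standard first-order estimate: during dead-time the current flows
# through the body diode instead of the MOSFET channel, so the extra
# conduction loss is roughly V_f * I * (total dead-time per period) * f_sw.
p_dead = v_f * i_load * (2 * t_dead) * f_sw  # = 0.42 W

print(f"dead-time fraction of period: {dead_fraction:.1%}")
print(f"approx. body-diode loss:      {p_dead:.2f} W")
```

At 1 MHz the two dead-time intervals already occupy 6% of the switching period, which is why the abstract notes that dead-time can no longer be neglected in efficiency or loop-gain analysis at today's switching frequencies.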

Published in:

IEEE Transactions on Power Electronics (Volume: 23, Issue: 1)