Bang-bang-controlled clock recovery is widely used in serial-link applications, as it offers ideal matching between the timing and data samplers and is amenable to fast digital implementation. However, bang-bang phase detection discards the magnitude information of the timing error, resulting in inconsistent loop dynamics that vary with the input-noise characteristics. This paper analyzes the jitter-dependent behavior of the tracked oversampling clock-recovery loop by introducing the concept of effective phase detector gain. Three methods to stabilize the loop bandwidth against jitter variation are described and compared: adapting the loop gain, increasing the oversampling ratio, and adjusting the sampling points. The analyses are then verified against jitter transfer characteristics obtained from time-step behavioral simulation.
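The jitter dependence described above can be illustrated with a minimal behavioral sketch (not taken from the paper; all names and parameter values here are illustrative assumptions). A first-order loop driven by a sign-only (bang-bang) phase detector is simulated, and the average PD output for a fixed static timing error is measured under two input-jitter levels: because the detector emits only the sign of the error, its effective gain, i.e. the slope of the averaged output versus error, shrinks as the input-jitter RMS grows.

```python
import random

def bb_cdr(phase_in, kp, jitter_sigma, seed=0):
    """First-order bang-bang CDR loop (illustrative sketch).

    The phase detector outputs only sign(error); the magnitude of
    the timing error is discarded, as in an Alexander-type PD.
    """
    rng = random.Random(seed)
    est = 0.0          # recovered clock phase
    track = []
    for x in phase_in:
        noisy = x + rng.gauss(0.0, jitter_sigma)  # input phase + jitter
        err = noisy - est
        est += kp * (1.0 if err > 0 else -1.0)    # bang-bang update
        track.append(est)
    return track

def effective_pd_output(static_err, jitter_sigma, n=20000, seed=1):
    """Average sign-PD output for a fixed timing error under jitter.

    For Gaussian jitter this averages toward erf(e / (sigma*sqrt(2))),
    so the small-error slope (effective PD gain) falls as sigma rises.
    """
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        e = static_err + rng.gauss(0.0, jitter_sigma)
        acc += 1.0 if e > 0 else -1.0
    return acc / n

# The loop still tracks a phase step, but only by dithering in
# fixed-size increments; the averaged PD output for the same error
# is smaller when the input jitter is larger.
track = bb_cdr([1.0] * 2000, kp=0.01, jitter_sigma=0.1)
g_low = effective_pd_output(0.02, jitter_sigma=0.1)   # low jitter
g_high = effective_pd_output(0.02, jitter_sigma=0.5)  # high jitter
```

With the same static error, `g_low` exceeds `g_high`: higher input jitter linearizes the sign nonlinearity over a wider range and lowers the effective phase detector gain, which is why the loop bandwidth varies with jitter unless one of the three stabilization methods above is applied.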