Jitter is timing noise that causes bit errors in high-speed data transmission lines. If the data rate of a system is increased, the magnitude of jitter measured in seconds is roughly unchanged, but measured as a fraction of a bit period it increases proportionally with the data rate and causes errors. Emerging technologies require the ratio of the number of errors to the total number of transmitted bits (the bit error rate, BER) to be less than one in a trillion (10⁻¹²). As datacom, bus, and backplane data rates have increased, many different techniques for characterizing jitter have been introduced, each relying on different types of laboratory equipment. To fix difficult jitter problems at high data rates, engineers need to understand the diverse jitter analysis techniques used in both synchronous and asynchronous networking. This article focuses on the data rates of emerging technologies above 3 Gb/s. Below 3 Gb/s, real-time oscilloscopes can capture a contiguous data stream that can be analyzed simultaneously in both the time and frequency domains; at higher data rates, jitter analysis is more challenging. The discussion takes the perspective of a digital engineer, drawing on experience with the synchronous optical network/synchronous digital hierarchy (SONET/SDH), where many of these challenges have already been addressed.
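To make the scaling argument concrete, the following sketch (not from the article; the jitter value and data rates are illustrative assumptions) converts a fixed absolute jitter in seconds into unit intervals (UI), i.e., fractions of a bit period, at several data rates. The same 10 ps of timing noise consumes ten times more of the bit period at 10 Gb/s than at 1 Gb/s.

```python
def jitter_in_ui(jitter_seconds: float, data_rate_bps: float) -> float:
    """Express an absolute jitter magnitude as a fraction of one bit period (UI)."""
    bit_period = 1.0 / data_rate_bps  # duration of one bit in seconds
    return jitter_seconds / bit_period

# Assume 10 ps of timing noise, roughly unchanged across data rates.
jitter_s = 10e-12
for rate in (1e9, 3e9, 10e9):  # 1, 3, and 10 Gb/s
    print(f"{rate / 1e9:g} Gb/s: {jitter_in_ui(jitter_s, rate):.3f} UI")
```

The printed values show jitter growing from 0.01 UI to 0.1 UI as the rate rises from 1 Gb/s to 10 Gb/s, which is why a jitter budget that was comfortable at low rates can dominate the error budget at the rates above 3 Gb/s that the article targets.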