As computer clock speeds continue to increase at a rate dictated by Moore's Law, system buses must scale in proportion to processor speed. As data rates climb beyond ~5 Gb/s, the historical methods used to model transmission lines begin to break down and become inadequate for the proper prediction of signal integrity. Specifically, the traditional approximations made in transmission-line models, while perfectly adequate at slower speeds, neither account for the extra losses caused by conductor surface roughness nor model the frequency dependence of the complex dielectric constant, producing incorrect loss and phase-delay responses, as well as noncausal waveforms in the time domain. This paper discusses the problems associated with modeling transmission lines at high frequencies and provides a practical modeling methodology that accurately predicts responses at very high data rates.
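The two loss mechanisms named above are commonly handled with well-known corrections: the Hammerstad factor, which scales smooth-conductor loss by the ratio of RMS surface roughness to skin depth, and a wideband Debye (Djordjevic-Sarkar) permittivity model, which gives a causal, frequency-dependent complex dielectric constant. The sketch below illustrates both under assumed material parameters (copper conductivity, illustrative roughness and dielectric values); it is not the methodology of this paper, only a minimal numerical illustration of the standard formulas.

```python
import cmath
import math

MU0 = 4e-7 * math.pi   # permeability of free space, H/m
CU_SIGMA = 5.8e7       # assumed copper conductivity, S/m

def skin_depth(f_hz, sigma=CU_SIGMA):
    """Skin depth (m): depth at which current density falls to 1/e."""
    return math.sqrt(2.0 / (2.0 * math.pi * f_hz * MU0 * sigma))

def hammerstad_factor(f_hz, rms_roughness_m, sigma=CU_SIGMA):
    """Hammerstad roughness correction, multiplying smooth-conductor loss.
    Approaches 2x as the roughness grows large relative to the skin depth."""
    ratio = rms_roughness_m / skin_depth(f_hz, sigma)
    return 1.0 + (2.0 / math.pi) * math.atan(1.4 * ratio * ratio)

def wideband_debye_eps(f_hz, eps_inf=3.0, delta_eps=1.2, f1=1e4, f2=1e12):
    """Djordjevic-Sarkar wideband Debye model: a causal complex permittivity
    whose real part falls slowly with frequency while the loss tangent stays
    nearly constant between the corner frequencies f1 and f2 (assumed values)."""
    w, w1, w2 = (2.0 * math.pi * x for x in (f_hz, f1, f2))
    return eps_inf + (delta_eps / math.log(w2 / w1)) * cmath.log((w2 + 1j * w) / (w1 + 1j * w))

if __name__ == "__main__":
    # Roughness penalty grows with frequency as the skin depth shrinks.
    for f in (1e9, 5e9, 10e9):
        print(f"{f/1e9:4.0f} GHz: skin depth = {skin_depth(f)*1e6:.3f} um, "
              f"roughness factor = {hammerstad_factor(f, 0.5e-6):.3f}")
    # Negative imaginary part (eps = eps' - j*eps'') indicates dielectric loss.
    print("eps(1 GHz) =", wideband_debye_eps(1e9))
```

Because the Debye model's real and imaginary parts are derived from a single analytic function of frequency, the resulting permittivity satisfies the Kramers-Kronig relations, which is exactly what prevents the noncausal time-domain waveforms mentioned above.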