Designing a non-ideal delay line (DL) with phase distortion for a transmitted-reference ultra-wideband system with an autocorrelation receiver is a significant technical challenge. Unlike existing empirical DL design methods, a semi-analytic approach is proposed via a Gaussian approximation of the conditional bit error rate (BER) expression, based on an investigation of the average-BER degradation caused by the group delay ripple range (GDRR) over independent Nakagami-m fading channels. This GDRR-based design method can directly evaluate the effect of phase distortion on system performance and determine the acceptable distortion level, trading off BER performance against system complexity.
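The core averaging step described above can be illustrated with a minimal sketch: a conditional BER in Gaussian-approximation form (here the generic Q-function expression Q(√(2γ)), a stand-in, since the paper's actual GDRR-dependent conditional BER is not given in the abstract) is averaged over Nakagami-m fading by Monte Carlo sampling of the Gamma-distributed channel power gain. The function name and SNR model are illustrative assumptions, not the authors' method.

```python
import math
import random

def q_function(x):
    """Gaussian Q-function, computed from the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def average_ber_nakagami(m, avg_snr_db, n_samples=100_000, seed=1):
    """Monte Carlo average of an illustrative Gaussian-approximated
    conditional BER, Q(sqrt(2*gamma)), over Nakagami-m fading.

    For Nakagami-m fading, the channel power gain is Gamma-distributed
    with shape m and unit mean, so the instantaneous SNR is
    gamma = gain * avg_snr.  This is a generic sketch, not the paper's
    GDRR-dependent BER expression.
    """
    rng = random.Random(seed)
    avg_snr = 10 ** (avg_snr_db / 10)  # linear average SNR
    total = 0.0
    for _ in range(n_samples):
        gain = rng.gammavariate(m, 1.0 / m)  # unit-mean power gain
        gamma = gain * avg_snr               # instantaneous SNR
        total += q_function(math.sqrt(2 * gamma))
    return total / n_samples
```

Deeper fading (smaller m) yields a higher average BER at the same average SNR, which is the kind of dependence the paper's semi-analytic BER evaluation would expose as a function of the GDRR.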
Date of Publication: 25 November 2011