In this paper, we present a theoretical analysis and an experimental demonstration of an on-chip Ku-band, nanosecond-scale time-stretching (TS) system implemented in a 130-nm IBM 8RF CMOS process. The theory developed here applies to general TS systems. We explain the impact of the time-bandwidth product (TBP) on practical design considerations and derive the error and distortion of a general TS system built around a dispersive delay line with a perfectly linear group delay and an all-pass amplitude characteristic. We also derive the time resolution of a general TS system using both the uncertainty principle and the short-time Fourier transform. This fundamental result enables a designer to understand the qualitative relationship between the TBP and the best possible resolution of the TS system. Finally, we experimentally demonstrate the on-chip TS system with nanosecond group-delay variance over a 12-16-GHz bandwidth. This demonstration indicates the potential for implementing more complex time-scaling signal-processing systems on chip, as well as a quantification of the error and distortion of such systems.
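To make the dispersive-delay-line model concrete, the following is a minimal numerical sketch, not the authors' design or analysis: it builds an all-pass filter whose group delay is perfectly linear over the 12-16-GHz band (nanosecond-scale delay spread, matching the figures quoted above) and applies it to a short Gaussian pulse on a mid-band carrier, showing the frequency-to-time mapping that underlies time stretching. All parameter names and values (fs, tau0, phi2, the pulse width) are illustrative assumptions.

```python
# Sketch only: an idealized dispersive delay line (|H| = 1, linear group delay)
# acting on a Ku-band Gaussian pulse. Parameters are illustrative, not from the paper.
import numpy as np

fs = 200e9                        # sample rate, Hz (assumed)
N = 1 << 16
t = (np.arange(N) - N // 2) / fs  # time axis centered at 0
w = 2 * np.pi * np.fft.fftfreq(N, d=1 / fs)

# Input: Gaussian pulse on a 14-GHz carrier (middle of the 12-16-GHz band)
T0 = 0.2e-9
x = np.exp(-t**2 / (2 * T0**2)) * np.cos(2 * np.pi * 14e9 * t)

# All-pass dispersive line: phase is quadratic about the carrier, so the group
# delay tau(w) = tau0 + phi2*(|w| - wc) is linear in frequency. phi2 = 1e-19 s^2/rad
# gives roughly 2.5 ns of delay spread across the 4-GHz band (nanosecond scale).
wc = 2 * np.pi * 14e9
tau0, phi2 = 2e-9, 1e-19
phase_pos = tau0 * np.abs(w) + 0.5 * phi2 * (np.abs(w) - wc)**2
H = np.exp(-1j * np.sign(w) * phase_pos)   # conjugate-symmetric -> real output

y = np.fft.ifft(np.fft.fft(x) * H).real

# RMS durations before/after: the linear-group-delay line stretches the envelope.
def rms_width(sig):
    p = np.abs(sig)**2
    p /= p.sum()
    mu = np.sum(t * p)
    return np.sqrt(np.sum((t - mu)**2 * p))

print(f"input RMS width : {rms_width(x) * 1e12:.1f} ps")
print(f"output RMS width: {rms_width(y) * 1e12:.1f} ps")
```

Running this shows the output envelope broadened by roughly the factor sqrt(1 + (phi2 / T0**2)**2) expected for a Gaussian pulse under quadratic spectral phase, which is the qualitative behavior (stretch set by dispersion versus input duration) that the TBP and resolution analysis in the paper makes precise.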