When a CW skywave signal is received on a linearly polarized antenna, polarization (Faraday) rotation produces a variation of received signal strength with radio frequency. The resulting dependence of received signal amplitude on radio frequency may impose a bandwidth limitation on pulsed signals where waveform preservation is important. A measure of this limitation, termed polarization bandwidth, is defined as the bandwidth over which the plane of polarization rotates 90°. Computer ray-tracing calculations were performed using a single Chapman-layer ionospheric model to determine the 1-hop polarization bandwidth as a function of geomagnetic azimuth and radio frequency. The polarization bandwidth was found to decrease with increasing radio frequency and with increasingly close alignment of the propagation path with the longitudinal component of the earth's magnetic field. Assuming a critical frequency of 9 MHz and a path length of 2000 km, the polarization bandwidth increased from a minimum of 140 kHz at 10.5 MHz, and from a minimum of 70 kHz at 17.5 MHz, as the propagation direction varied from geomagnetic north to east. A model for the 1-hop ionospheric signal channel is proposed whose parameters are the rate of change of polarization rotation with frequency and the phase-versus-frequency characteristic of the path. These two parameters are shown to be readily determined from FM-CW or equivalent oblique-path sounding records. Using this model, predictions are made of the effects of polarization rotation with frequency, and of ionospheric dispersion or phase distortion, on the envelope shape of short-pulse signals (of 1.5 to 50 μs duration). A pronounced distortion of the pulse envelope due to polarization rotation was observed when the signal bandwidth appreciably exceeded the "polarization bandwidth" for the path.
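The definition above can be sketched numerically: if Ω(f) is the Faraday rotation angle and dΩ/df its (locally constant) rate of change with frequency, the polarization bandwidth follows as B_p = (π/2)/|dΩ/df|, and a linearly polarized receive antenna sees a relative amplitude of |cos Ω| across the signal band. The rate value below is a hypothetical choice made only to reproduce the 140 kHz figure quoted for 10.5 MHz; it is not taken from the paper's ray-tracing results.

```python
import numpy as np

def polarization_bandwidth(domega_df):
    """Bandwidth (Hz) over which the polarization plane rotates 90 deg,
    given the rate of Faraday rotation with frequency, dOmega/df (rad/Hz)."""
    return (np.pi / 2) / abs(domega_df)

# Hypothetical rotation rate chosen so B_p matches the 140 kHz minimum
# quoted at 10.5 MHz for the assumed 2000 km, foF2 = 9 MHz path.
domega_df = (np.pi / 2) / 140e3          # rad/Hz (assumed, not measured)
bp = polarization_bandwidth(domega_df)   # 140 kHz by construction

# A linearly polarized antenna receives amplitude ~ |cos(Omega(f))|, so
# spectral components rotated near 90 deg from the antenna plane are
# strongly attenuated -- the origin of the envelope distortion when the
# signal bandwidth exceeds B_p.
f_offset = np.linspace(-200e3, 200e3, 5)        # offsets from carrier (Hz)
gain = np.abs(np.cos(domega_df * f_offset))     # relative received amplitude
```

On this simple model, a component offset by exactly B_p from the carrier is rotated a full 90° and is nulled on the receive antenna, which is why distortion becomes pronounced once the signal bandwidth appreciably exceeds B_p.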