The effect of the Lorentz polarization term on the vertical-incidence absorption of a radio wave in a deviating ionospheric layer is determined. This result is obtained by using a double-parabola approximation to the Chapman distribution of electron density as a function of height, and a single-parabola approximation to the height distribution of the product of the electron density and the collisional frequency. A constant times the logarithm of the reflection coefficient, divided by the product of the scale height and the collisional frequency at the level of maximum ionization, is obtained as a function of the ratio of the wave frequency to the vertical-incidence critical frequency. This is compared with a similar result for the Sellmeyer theory of dispersion as given by Hacke. The results for a type of "nondeviating region" absorption are compared for both theories. The expression for the apparent height of reflection using the Lorentz theory is derived and compared with that obtained by Hacke for the Sellmeyer theory. These latter results are given for use in the experimental determination of the scale height in the layer. The determination of the value of the collisional frequency at the level of maximum ionization is discussed for both theories.
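The absorption quantity described verbally above can be sketched in symbols. The notation here is an assumption chosen for illustration, not the paper's own:

```latex
% Hedged sketch of the stated functional relation. Assumed symbols:
%   |R|    = magnitude of the reflection coefficient,
%   H      = scale height of the layer,
%   \nu_m  = collisional frequency at the level of maximum ionization,
%   f      = wave frequency,
%   f_c    = vertical-incidence critical frequency,
%   C      = the unspecified constant mentioned in the abstract.
\[
  \frac{C \,\ln |R|}{H \,\nu_m} \;=\; F\!\left(\frac{f}{f_c}\right)
\]
% The abstract states that this dimensionless combination is obtained
% as a function F of f/f_c for the Lorentz theory and compared with
% Hacke's corresponding result for the Sellmeyer theory.
```

Since the left-hand side isolates the measurable quantity ln |R| against the layer parameters H and ν_m, tabulating F(f/f_c) for both theories is what permits the experimental determinations of scale height and of ν_m discussed in the closing sentences.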