Phase error in the aperture field of a microwave paraboloidal antenna degrades antenna gain in two ways: the asynchronism of partial field contributions arriving at an axial field point reduces the magnitude of the total field there, and the phase error may generate a cross-polarized component of the aperture field that further reduces the axial gain. Because of phase ripples in the field reflected from the subdish, a Cassegrainian-fed antenna may be considerably more susceptible to phase-error effects than conventional focal-point-fed antennas. Consequently, a two-part analysis was conducted to evaluate the importance of these phase-error effects in Cassegrainian systems. The feed-system fields were computed and a best-fit phase center was found. Then the axial gain was computed in terms of the feed-system fields. An expression for the phase-error loss was defined to evaluate the effects of diffractive phase ripple, feed-system misalignment, etc. Numerical analyses were carried out for a wide range of antenna parameters. It was concluded that, for a 19-wavelength subdish and a nearly symmetrical phase and amplitude feed pattern, the loss in axial gain due to diffractive phase error may be only a small fraction of a decibel.
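The first mechanism described above, out-of-phase partial field contributions reducing the coherent sum at an axial field point, can be illustrated numerically. The sketch below is not the paper's actual computation; it evaluates the standard phase-efficiency factor (the magnitude of the amplitude-weighted average of e^{jφ} over the aperture, squared) for an assumed sinusoidal phase ripple of the kind subdish diffraction might produce. The ripple amplitude and period are purely illustrative assumptions.

```python
import numpy as np

def phase_error_loss_db(amplitude, phase_rad):
    """Loss in axial gain (dB) from aperture phase error:
    ratio of the coherent (phase-perturbed) field sum to the
    ideal in-phase sum, expressed in decibels."""
    ideal = np.abs(amplitude).sum()
    actual = np.abs((amplitude * np.exp(1j * phase_rad)).sum())
    return -20.0 * np.log10(actual / ideal)

# Radial cut of a uniform-amplitude circular aperture; the weight r
# accounts for the annular area at each radius.
r = np.linspace(0.0, 1.0, 2001)
w = r

# Assumed diffractive phase ripple: 0.2 rad (~11.5 deg) peak,
# six cycles across the radius. Both values are hypothetical.
phi = 0.2 * np.sin(2.0 * np.pi * 6.0 * r)

loss = phase_error_loss_db(w, phi)
print(f"axial-gain loss: {loss:.3f} dB")
```

For a ripple this small the computed loss comes out well under a decibel, consistent with the abstract's conclusion that diffractive phase error may cost only a small fraction of a decibel of axial gain; larger ripple amplitudes raise the loss roughly with the mean-square phase error.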