Most radiometers utilize a calibration technique in which measurements of a known reference are subtracted from measurements of an unknown source so that common-mode bias errors are cancelled. When a radiometer is scanned over a varying scene, it produces a sequence of outputs, each proportional to the difference between the reference and the corresponding input. The reference averaging technique presented herein employs a simple digital algorithm which exploits the asymmetry between the time-variable scene inputs and the nominally constant reference input by averaging many reference measurements to decrease the statistical uncertainty in the reference value. This algorithm is, therefore, optimized by an asymmetric chopping sequence in which the scene is viewed for more than one-half of the duty cycle (unlike the analog Dicke technique). Reference averaging algorithms are well within the capabilities of small microprocessors. Although this paper develops the technique for microwave radiometry, it may be beneficial for any system which measures a large number of unknowns relative to a known reference in the presence of slowly varying common-mode errors.
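The reference-averaging idea can be sketched in a few lines (a minimal illustration only, not the paper's implementation; the function name, the tuple encoding of the chopping sequence, and the `n_avg` window parameter are assumptions introduced here):

```python
def reference_averaged_output(measurements, n_avg=32):
    """Subtract a running average of reference measurements from each
    scene measurement.

    `measurements` is an interleaved chopping sequence of
    ("ref", value) and ("scene", value) samples; `n_avg` is the number
    of recent reference samples averaged (hypothetical parameters).
    """
    ref_history = []
    outputs = []
    for kind, value in measurements:
        if kind == "ref":
            ref_history.append(value)
            ref_history = ref_history[-n_avg:]  # keep most recent n_avg
        elif ref_history:
            # Averaging N reference samples reduces the statistical
            # uncertainty of the reference estimate by roughly sqrt(N),
            # while slowly varying common-mode bias still cancels in
            # the subtraction.
            ref_avg = sum(ref_history) / len(ref_history)
            outputs.append(value - ref_avg)
    return outputs


# Asymmetric chopping: the scene occupies most of the duty cycle,
# with occasional looks at the nominally constant reference.
seq = [("ref", 10.0), ("scene", 12.0), ("scene", 15.0),
       ("ref", 10.0), ("scene", 11.0)]
print(reference_averaged_output(seq))  # → [2.0, 5.0, 1.0]
```

Because the reference is nominally constant, lengthening the averaging window trades response to reference drift against lower statistical uncertainty, which is why the asymmetric chopping sequence (more scene looks than reference looks) is advantageous here.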