The combined use of time-domain waveforms, detection probability, and false-alarm probability as criteria for selecting a sample period and quantization interval that minimize bit rate is illustrated. This is contrasted with the frequency-domain concepts and mean-square-error criteria usually employed in telemetry to choose the sample period and quantization interval. The method is applied to a sampled-data alarm system defined in the paper. Its essence is to choose a quantization interval that provides, at minimum bit rate, a prescribed number of nonzero samples following the receipt of an impulse input of specified strength, while maintaining preset detection and false-alarm probabilities. The effect of noise on the optimum choice is considered. The methodology is flow-charted in sufficient detail that the technique may be applied to other problems.
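The selection procedure described above can be sketched numerically. The following is a minimal illustration, not the paper's flow-charted method: it assumes an exponentially decaying impulse response, additive Gaussian noise, and threshold detection at the quantization interval. All parameter names (`A`, `tau`, `sigma`, `n_required`, etc.) and the specific response model are assumptions for illustration only.

```python
import math

def qfunc(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def design(A=1.0, tau=1.0, T=0.1, sigma=0.05,
           n_required=5, pd_min=0.99, pfa_max=1e-3,
           full_scale=1.0):
    """Search candidate quantization intervals q and return the choice
    giving the lowest bit rate that still (a) yields n_required nonzero
    samples after an impulse of strength A, (b) meets the detection
    probability floor, and (c) meets the false-alarm ceiling.
    Returns (q, bit_rate, bits_per_sample, nonzero, pd, pfa) or None."""
    best = None
    for i in range(1, 200):                      # coarse grid over q
        q = full_scale * i / 200
        # Assumed impulse response h(kT) = A * exp(-kT/tau); a sample is
        # "nonzero" if it quantizes above the first quantization level.
        nonzero = sum(1 for k in range(1000)
                      if A * math.exp(-k * T / tau) >= q)
        pd = qfunc((q - A) / sigma)   # P(first sample exceeds threshold q)
        pfa = qfunc(q / sigma)        # P(noise-only sample exceeds q)
        bits = max(1, math.ceil(math.log2(full_scale / q + 1)))
        rate = bits / T               # bits per second at sample period T
        if nonzero >= n_required and pd >= pd_min and pfa <= pfa_max:
            if best is None or rate < best[1]:
                best = (q, rate, bits, nonzero, pd, pfa)
    return best
```

A coarser quantization interval needs fewer bits per sample (lower rate) but loses trailing samples of the impulse response and, if pushed too far, the detection probability; the search exposes that trade-off directly, which is the essence of the time-domain criterion the abstract contrasts with mean-square-error design.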