A new theorem shows that additive quantizer noise decreases the mean-squared error of optimal and suboptimal linear estimators that use a threshold array. The initial rate of this noise benefit improves as the number of threshold sensors or quantizers in the array increases. The array sums the outputs of identical binary quantizers that each receive the same random input signal. The theorem further shows that zero-symmetric uniform quantizer noise gives the fastest initial decrease in mean-squared error among all zero-symmetric scale-family noise types with finite variance. These results hold for all bounded continuous signal densities and all zero-symmetric scale-family quantizer noise with finite variance.
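The noise benefit can be illustrated with a minimal Monte Carlo sketch. The setup below is an illustrative assumption, not the paper's construction: a random input uniform on [-1, 1], an array of identical binary (sign) quantizers with threshold zero, independent zero-symmetric uniform quantizer noise at each sensor, and a least-squares affine readout of the summed outputs standing in for the optimal linear estimator. With zero noise all quantizers agree, so the sum carries only one bit about the signal; independent noise decorrelates the quantizers and the summed output becomes a finer estimate, lowering the mean-squared error.

```python
import numpy as np

rng = np.random.default_rng(0)

def array_mse(noise_scale, n_quantizers=16, n_samples=200_000):
    """MSE of the best affine estimate of the signal s from the summed
    outputs of identical binary quantizers (threshold 0), each of which
    receives s plus independent zero-symmetric uniform quantizer noise.
    All parameter values here are illustrative assumptions."""
    s = rng.uniform(-1.0, 1.0, n_samples)            # bounded continuous input
    noise = rng.uniform(-noise_scale, noise_scale,
                        (n_quantizers, n_samples))   # per-sensor quantizer noise
    y = np.sign(s + noise).sum(axis=0)               # summed binary outputs
    # least-squares affine readout s_hat = a*y + b (proxy for the
    # optimal linear estimator)
    A = np.column_stack([y, np.ones(n_samples)])
    coef, *_ = np.linalg.lstsq(A, s, rcond=None)
    s_hat = A @ coef
    return float(np.mean((s - s_hat) ** 2))

mse_silent = array_mse(0.0)   # no quantizer noise: one-bit information
mse_noisy = array_mse(1.0)    # moderate uniform noise: lower MSE
print(mse_silent, mse_noisy)
```

In this sketch the noiseless MSE stays near the one-bit limit while moderate uniform noise drops it substantially, and the gap widens as `n_quantizers` grows, in line with the claimed dependence on array size.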