The problem of optimum analysis of overlapping signals by least-squares approximation techniques and by linear transformation processes, such as resolution enhancement by inverse convolution, is considered. It is shown that optimum accuracy is obtained by applying, before the least-squares fit, a transformation that converts the random noise into white noise. In particular, this implies that any process of artificial resolution enhancement entails a loss of accuracy. Examples from the field of spectroscopy demonstrate the achievable accuracy and the requirements with respect to signal-to-noise ratio.
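The prewhitening-then-fit procedure described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the two overlapping Gaussian lines, the exponential noise covariance, and all numerical parameters are assumptions chosen only to show the mechanics. The whitening step factors the known noise covariance C as C = L Lᵀ (Cholesky) and multiplies both the model and the data by L⁻¹, so the transformed noise is white and an ordinary least-squares fit of the whitened system is the optimum (generalized least-squares) fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two overlapping Gaussian lines on a common abscissa (assumed example).
x = np.linspace(-5.0, 5.0, 200)

def line(center, width):
    return np.exp(-0.5 * ((x - center) / width) ** 2)

A = np.column_stack([line(-0.5, 1.0), line(0.5, 1.0)])  # design matrix
true_amps = np.array([1.0, 0.7])                         # true amplitudes

# Colored noise with a known covariance C (assumed exponential correlation).
C = 0.01 * np.exp(-np.abs(np.subtract.outer(x, x)) / 0.5)
y = A @ true_amps + rng.multivariate_normal(np.zeros(x.size), C)

# Whitening: with C = L L^T, multiplying by L^{-1} turns the noise white.
L = np.linalg.cholesky(C)
A_white = np.linalg.solve(L, A)
y_white = np.linalg.solve(L, y)

# Ordinary least squares on the whitened system = optimum (GLS) estimate.
amps_gls, *_ = np.linalg.lstsq(A_white, y_white, rcond=None)

# Naive fit that ignores the noise correlation, for comparison.
amps_ols, *_ = np.linalg.lstsq(A, y, rcond=None)

print("whitened fit:", amps_gls)
print("naive fit:   ", amps_ols)
```

By the Gauss–Markov theorem, the whitened fit has the smallest variance among linear unbiased estimators, which is the sense in which prewhitening yields "optimum accuracy"; the naive fit remains unbiased but is generally noisier.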