DOI: http://dx.doi.org/10.1063/1.362617
An algorithm for the enhancement of digital images is described. The algorithm is based on an analogy between a macroscopic system composed of many particles and a digital image composed of many pixels. Under this analogy the intensity in a digital image fluctuates, and the algorithm drives these fluctuations toward a minimum, so that an enhanced image may be regarded as an image in an equilibrium state; this yields a quantitative criterion for stopping the enhancement process. The algorithm may serve as the first stage of a computer-aided vision system, the next stage being image segmentation, which identifies the various patterns forming the image and is described in a forthcoming paper. © 1996 American Institute of Physics.
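The abstract does not give the paper's actual enhancement operator or fluctuation measure, so the following is only a hedged sketch of the stopping idea it describes: iterate a smoothing step (here a simple four-neighbour averaging, an assumption standing in for the paper's method) and track a fluctuation measure (here the total squared intensity difference between adjacent pixels, also an assumption), halting once the fluctuation stops decreasing appreciably, i.e. once the image has reached an approximate "equilibrium".

```python
import numpy as np

def fluctuation(img):
    # Stand-in "fluctuation" measure (assumption, not from the paper):
    # total squared intensity difference between adjacent pixels.
    dx = np.diff(img, axis=1)
    dy = np.diff(img, axis=0)
    return float((dx**2).sum() + (dy**2).sum())

def enhance(img, tol=1e-3, max_iter=100):
    """Iterate a smoothing step until the fluctuation measure stops
    decreasing appreciably -- an 'equilibrium' stopping criterion."""
    img = img.astype(float)
    f_prev = fluctuation(img)
    for _ in range(max_iter):
        # One diffusion-like step: damped four-neighbour average
        # with edge padding (assumed operator, for illustration only).
        p = np.pad(img, 1, mode="edge")
        avg = 0.25 * (p[:-2, 1:-1] + p[2:, 1:-1]
                      + p[1:-1, :-2] + p[1:-1, 2:])
        img = 0.5 * img + 0.5 * avg
        f = fluctuation(img)
        if f_prev - f < tol * f_prev:  # relative decrease below tolerance
            break
        f_prev = f
    return img

rng = np.random.default_rng(0)
noisy = rng.normal(0.5, 0.1, (32, 32))
smoothed = enhance(noisy)
# Smoothing a noisy image reduces the fluctuation measure.
print(fluctuation(smoothed) < fluctuation(noisy))
```

The quantitative stopping test is the key point of the abstract: rather than running a fixed number of smoothing passes, the loop terminates when the fluctuation measure has effectively reached its minimum.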