It is well known that many natural and man-made objects, such as sand and brick, have colors very similar to that of human skin. For skin detection, it is therefore challenging to identify true skin regions while remaining robust to the distraction of these objects. In this paper, we present an on-line learning approach that models human skin by exploiting the similarity between neighboring pixels, and combines it with a region growing technique to accurately segment skin regions. By assuming that the colors of neighboring skin pixels in the YCbCr color space follow conditional Gaussian distributions, we devise an on-line update rule to efficiently estimate the parameters of these distributions. At inference time, our algorithm first evaluates a color distance map in RGB space to reliably place seeds for skin regions, and then segments the skin regions by iterative seed growing based on the learned skin models. Empirical evaluations demonstrate the efficacy of the proposed approach.
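The two main ingredients described above, an on-line update of Gaussian color statistics and seed-based region growing, can be sketched roughly as follows. This is a minimal illustration, not the paper's exact formulation: the names `OnlineGaussian` and `grow_skin_region`, the diagonal (per-channel) Gaussian over CbCr, and the fixed 3-sigma acceptance threshold are all assumptions; the paper instead models neighboring-pixel colors with conditional Gaussians and places seeds from an RGB color distance map.

```python
import numpy as np
from collections import deque

class OnlineGaussian:
    """Running estimate of a diagonal Gaussian over CbCr values,
    updated one sample at a time (Welford's on-line algorithm)."""
    def __init__(self, dim=2):
        self.n = 0
        self.mean = np.zeros(dim)
        self.m2 = np.zeros(dim)  # running sum of squared deviations

    def update(self, x):
        x = np.asarray(x, dtype=float)
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def var(self):
        return self.m2 / max(self.n - 1, 1)

def grow_skin_region(cbcr, seed, model, thresh=9.0):
    """Iteratively grow a skin region from `seed` over an HxWx2 CbCr image.
    A 4-connected neighbor joins the region if its squared normalized
    distance to the learned Gaussian is below `thresh` (~3 sigma/channel)."""
    h, w, _ = cbcr.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    var = model.var + 1e-8  # guard against zero variance
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                d2 = np.sum((cbcr[ny, nx] - model.mean) ** 2 / var)
                if d2 < thresh:
                    mask[ny, nx] = True
                    queue.append((ny, nx))
    return mask
```

In this simplified form, the on-line update makes the skin model adaptive (new skin samples refine the mean and variance without revisiting old pixels), while the growing step keeps segmentation spatially coherent rather than classifying each pixel in isolation.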