
A generative model of human hair for hair sketching

2 Author(s)
Hong Chen and Song-Chun Zhu, Dept. of Statistics and Computer Science, University of California, Los Angeles, CA, USA

Human hair is a very complex visual pattern whose representation is rarely studied in the vision literature, despite its important role in human recognition. In this paper, we propose a generative model for hair representation and hair sketching that is far more compact than the physically based models used in graphics. We decompose a color hair image into three bands: a color band (a) (via the Luv transform), a low-frequency band (b) for lighting variations, and a high-frequency band (c) for the hair pattern. We then propose a three-level generative model for the hair image (c). In this model, image (c) is generated by a vector field (d) that represents hair orientation, gradient strength, and direction, and this vector field is in turn generated by a hair sketch layer (e). We identify five types of primitives for the hair sketch, each specifying the orientations of the vector field on the two sides of the sketch. With the five-layer representation (a)-(e) computed, we can reconstruct vivid hair images and generate hair sketches. We test our algorithm on a large data set of hair images and report results in the experiments.
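As a rough illustration of the band decomposition described above, the split into a color band, a low-frequency lighting band, and a high-frequency hair-pattern band might be sketched as follows. The use of skimage/SciPy, the Gaussian low-pass split of the luminance channel, and the sigma value are assumptions for the sketch, not details taken from the paper.

```python
from scipy.ndimage import gaussian_filter
from skimage import color

def decompose_hair_image(rgb, sigma=8.0):
    """Illustrative three-band split of an RGB hair image.

    Returns (a) a chromatic band from the Luv transform, (b) a
    low-frequency band approximating lighting variation, and (c) a
    high-frequency residual carrying the hair texture pattern.
    The Gaussian cutoff `sigma` is an assumed parameter, not a value
    from the paper.
    """
    luv = color.rgb2luv(rgb)          # perceptual Luv color space
    luminance = luv[..., 0]           # L channel
    color_band = luv[..., 1:]         # (a) u, v chromaticity

    low_band = gaussian_filter(luminance, sigma=sigma)  # (b) smooth shading
    high_band = luminance - low_band                     # (c) hair pattern

    return color_band, low_band, high_band
```

A Gaussian blur is used here simply as one plausible way to separate slowly varying lighting from fine hair texture; the paper's own decomposition may differ.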

Published in:

2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005), Volume 2

Date of Conference:

20-25 June 2005