
A non-photorealistic rendering framework with temporal coherence for augmented reality

Authors: Jiajian Chen (School of Interactive Computing, Georgia Institute of Technology, Atlanta, GA, USA); G. Turk; B. MacIntyre

Many augmented reality (AR) applications require a seamless blending of real and virtual content as key to increased immersion and improved user experiences. Photorealistic and non-photorealistic rendering (NPR) are two ways to achieve this goal. In contrast to photorealistic rendering, NPR stylizes both the real and virtual content, making them visually indistinguishable from each other. Maintaining temporal coherence is a key challenge in NPR. We propose an NPR framework that supports temporal coherence by leveraging model-space information. Our system targets painterly rendering styles of NPR. The rendering framework creates coherent results in three major steps: tensor field creation, brush anchor placement, and brush stroke reshaping. To achieve temporal coherence in the final rendered results, we propose a new projection-based surface sampling algorithm that generates anchor points on model surfaces. The 2D projections of these samples are uniformly distributed in image space for optimal brush stroke placement. We also propose a general method for averaging various properties of brush stroke textures, such as their skeletons and colors, to further improve temporal coherence. We apply these methods to both static and animated models to create a painterly rendering style for AR. Compared with existing image-space algorithms, our method renders AR with NPR effects with a high degree of coherence.
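The sketch below is not the authors' implementation; it is a minimal illustration of the idea behind projection-based anchor sampling described in the abstract: candidate points on the model surface are accepted only if their 2D projections keep a minimum screen-space spacing, so the surviving anchors end up roughly uniform in image space. The function names, the greedy dart-throwing selection, and parameters such as `min_dist` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def project(points_3d, mvp, width, height):
    """Project Nx3 model-space points to pixel coordinates with a 4x4 MVP matrix."""
    homo = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    clip = homo @ mvp.T
    ndc = clip[:, :2] / clip[:, 3:4]             # perspective divide
    return (ndc * 0.5 + 0.5) * [width, height]   # NDC -> pixel coordinates

def sample_anchors(candidates_3d, mvp, width, height, min_dist=12.0):
    """Greedy dart throwing in image space: keep a candidate only if its
    projection is at least min_dist pixels from every accepted anchor."""
    pix = project(candidates_3d, mvp, width, height)
    accepted_3d, accepted_2d = [], []
    for p3, p2 in zip(candidates_3d, pix):
        if all(np.linalg.norm(p2 - q) >= min_dist for q in accepted_2d):
            accepted_3d.append(p3)
            accepted_2d.append(p2)
    return np.array(accepted_3d)

# Usage: random points on a unit sphere, viewed with an identity "camera".
rng = np.random.default_rng(0)
pts = rng.normal(size=(2000, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
anchors = sample_anchors(pts, np.eye(4), 640, 480)
print(len(anchors), "anchors kept")
```

Because the spacing test runs on the projected 2D positions rather than on the surface itself, brush strokes anchored at these points neither pile up nor leave gaps on screen, which is the property the abstract attributes to its sampling step.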

Published in:

2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)

Date of Conference:

5-8 Nov. 2012