Generating occlusion-free textures for virtual 3D model of urban facades by fusing image and laser street data


5 Author(s)
Karim Hammoudi ; Department of Computer Science, National University of Ireland Maynooth, Maynooth, Co. Kildare, Ireland ; Fadi Dornaika ; Bahman Soheilian ; Bruno Vallet

In this paper we present results of a work in progress that deals with the texturing of 3D urban facade models by fusing terrestrial multi-source data acquired by a Mobile Mapping System (MMS). Many current 3D urban facade models are textured using images that contain urban objects belonging to the street. These objects constitute occlusions, since they are located between the acquisition system and the facades. We show how georeferenced images and 3D point clouds acquired at street level by the MMS can be used to generate occlusion-free facade textures, and we describe a methodology for reconstructing texture parts of facades that are heavily occluded by wide frontal objects.
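The fusion described above relies on relating the georeferenced 3D point cloud to the georeferenced images. A minimal sketch of that geometric step, assuming a standard pinhole camera model with known MMS pose (the function name, matrices, and depth-test idea here are illustrative, not the authors' exact pipeline):

```python
import numpy as np

def project_points(points_world, R, t, K):
    """Project Nx3 georeferenced points into pixel coordinates and depths.

    R, t: world-to-camera rotation (3x3) and translation (3,), assumed
    known from the MMS georeferencing; K: 3x3 camera intrinsic matrix.
    Points that project closer to the camera than the facade surface
    could then be flagged as occluding objects (cars, poles, pedestrians).
    """
    cam = (R @ points_world.T).T + t       # world frame -> camera frame
    depths = cam[:, 2]                     # depth along the optical axis
    pix = (K @ cam.T).T
    pix = pix[:, :2] / pix[:, 2:3]         # perspective division
    return pix, depths

# Example: a camera at the origin looking down +Z with focal length 500 px
# and principal point (320, 240); a point on the optical axis 2 m away
# projects to the image center at depth 2.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
pix, depths = project_points(np.array([[0.0, 0.0, 2.0]]),
                             np.eye(3), np.zeros(3), K)
print(pix[0], depths[0])   # -> [320. 240.] 2.0
```

Comparing such per-pixel depths against the known facade geometry gives a simple occlusion mask; the occluded texture regions could then be filled from other viewpoints along the vehicle trajectory.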

Published in:

2012 IEEE Virtual Reality Workshops (VRW)

Date of Conference:

4-8 March 2012