A method for object reconstruction based on point-cloud data via 3D scanning

Authors: Fengxia Li; Rong Tang; Chen Liu; Haikun Yu — Sch. of Comp. Sci. & Tech., Beijing Inst. of Tech., Beijing, China

With the development of computer technology, reconstruction techniques based on point-cloud data from 3D scanners have come into wide use. However, the original point cloud is typically huge and redundant and may contain noise and holes, which slows the subsequent reconstruction process and reduces the accuracy of the reconstructed model. This paper proposes a method for reconstructing objects from 3D-scanned point-cloud data that uses memory-mapped files and OpenGL to accelerate point-cloud reading and rendering. To simplify the point cloud and reduce noise, the method first clusters the original point cloud and then processes each subclass by fitting lines with the nonlinear least-squares method. Holes in the original point cloud are patched by applying an RBF neural network (RBFNN) trained on points along each hole's boundary, finally yielding a simplified, complete point-cloud model of the object. Experimental results show that the proposed method greatly reduces the number of points while retaining the cloud's main features, and that the hole-patching error is small enough to meet engineering requirements.
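As a rough illustration of the hole-patching idea (not the authors' implementation, whose network architecture and training details are not given in the abstract), an RBF interpolant can be fit to points sampled on a hole's boundary and then evaluated at interior positions. The sketch below assumes the hole region can locally be treated as a height field z = f(x, y); the Gaussian kernel width `sigma` and the small regularization term are illustrative choices:

```python
import numpy as np

def rbf_patch(boundary_xy, boundary_z, query_xy, sigma=1.0):
    """Fill hole interior by Gaussian-RBF interpolation of boundary samples.

    boundary_xy : (n, 2) planar coordinates of points on the hole boundary
    boundary_z  : (n,)   height values at those points
    query_xy    : (m, 2) interior positions where the surface is predicted
    """
    # Pairwise distances between boundary points -> kernel matrix Phi.
    d = np.linalg.norm(boundary_xy[:, None, :] - boundary_xy[None, :, :], axis=-1)
    phi = np.exp(-(d / sigma) ** 2)
    # Solve Phi @ w = z; tiny ridge term keeps the system well conditioned.
    w = np.linalg.solve(phi + 1e-9 * np.eye(len(boundary_xy)), boundary_z)
    # Evaluate the learned surface at the query (interior) positions.
    dq = np.linalg.norm(query_xy[:, None, :] - boundary_xy[None, :, :], axis=-1)
    return np.exp(-(dq / sigma) ** 2) @ w

# Hypothetical boundary ring sampled from the plane z = x + y.
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.0]])
z = xy[:, 0] + xy[:, 1]
patched = rbf_patch(xy, z, np.array([[0.5, 0.5]]))
```

Because the interpolant passes (up to the ridge term) through the boundary samples, evaluating it back at the boundary reproduces the known heights, which gives a quick sanity check on the fit.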

Published in:

2010 International Conference on Audio, Language and Image Processing (ICALIP)

Date of Conference:

23-25 Nov. 2010