In this paper, we present a mobile phone-based 3D modeling framework. The proposed framework enables users to create 3D content, and to update existing content, via their mobile phones. In the proposed system, users take photos of an object with their mobile phones and annotate the image regions of the foreground object and the background through a touch-based user interface. The annotated photos are transmitted to a remote server over a wireless network, where a 3D reconstruction process is conducted. First, we recover the camera poses from the received photos. Then, the silhouettes of the object are obtained by a multi-view silhouette extraction method that exploits both color and spatial constraints. Finally, we reconstruct a visual hull with smooth surfaces through iterative deformation. The created 3D models are rendered on mobile phones, merged into the real scene, for mobile augmented reality applications.
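The core server-side step, building a visual hull from multi-view silhouettes, can be illustrated with a standard voxel-carving sketch: a voxel is kept only if it projects inside the silhouette in every view. This is a generic illustration, not the paper's exact method (the function name, grid parameters, and use of 3×4 projection matrices are assumptions, and the paper's iterative surface deformation is omitted).

```python
import numpy as np

def visual_hull(silhouettes, projections, grid_min, grid_max, resolution):
    """Carve a voxel visual hull from binary silhouette masks.

    silhouettes: list of (H, W) boolean masks, one per view
    projections: list of 3x4 camera projection matrices (assumed known
                 from the pose-recovery step)
    Returns a (resolution, resolution, resolution) boolean occupancy grid.
    """
    # Build a regular voxel grid over the object's bounding box.
    axes = [np.linspace(grid_min[i], grid_max[i], resolution) for i in range(3)]
    X, Y, Z = np.meshgrid(*axes, indexing="ij")
    pts = np.stack([X, Y, Z, np.ones_like(X)], axis=-1).reshape(-1, 4)

    occupied = np.ones(len(pts), dtype=bool)
    for mask, P in zip(silhouettes, projections):
        uvw = pts @ P.T                    # project voxel centers into the view
        uv = uvw[:, :2] / uvw[:, 2:3]      # perspective divide -> pixel coords
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        h, w = mask.shape
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        hit = np.zeros(len(pts), dtype=bool)
        hit[inside] = mask[v[inside], u[inside]]
        occupied &= hit                    # carve voxels outside this silhouette
    return occupied.reshape(resolution, resolution, resolution)
```

The resulting occupancy grid is the intersection of the silhouette cones; a surface could then be extracted (e.g., by marching cubes) and smoothed, which is where an iterative deformation such as the paper's would apply.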