
Efficient 3D Reconstruction Through Enhanced PatchMatch Techniques for Accelerated Point Cloud Generation


This method accelerates point cloud generation by enhancing PMVS, using SfM rectification, stereo pair selection, sparse matching, and depth map merging to produce a dense point cloud.


Abstract:

The demand for high-quality 3D models of buildings and urban landscapes has increased significantly in recent years. To meet this demand, researchers have turned to multi-view stereo (MVS) reconstruction methods that use low-altitude, multi-angle oblique aerial images. However, the MVS approach has limitations, such as matching failures or errors caused by the absence of texture on building surfaces, and low efficiency when generating point clouds from high-resolution aerial photos. To address these challenges, an accelerated point-cloud generation method based on PatchMatch (Ac-PMVS) was developed. The method enhances image-based MVS 3D reconstruction of urban buildings by improving the similarity measure used to match pixel blocks, adopting a sparse matching strategy, and generating depth maps at intervals. The resulting 3D models are of high quality and can be generated rapidly, as demonstrated on multi-angle aerial image sets captured by UAVs. In summary, the proposed method shows potential for meeting the increasing demand for high-quality 3D models of buildings and urban landscapes.
Published in: IEEE Access (Volume 12)
Page(s): 144588 - 144598
Date of Publication: 16 September 2024
Electronic ISSN: 2169-3536
