
Depth Maps Restoration for Human Using RealSense



Abstract:

Recently, mobile devices such as the iPhone X have started to be equipped with depth cameras, and more applications based on captured depth maps are emerging. Among the many depth cameras on the market, Intel RealSense can capture depth information and is expected to be widely used in mobile devices and laptops. However, depth maps captured by RealSense always suffer from severe holes and noise, which makes them hard to use in real applications. In this paper, we propose a method to fill holes and remove noise in depth maps captured by RealSense. The method consists of two parts: human depth prediction and human depth optimization. First, we propose a two-stage stacked hourglass network to predict human part-segmentation and human depth simultaneously from an RGB image. Then we use the GradientFMM method to optimize the captured depth maps with the guidance of the above human depth prediction. The RGB image and depth maps mentioned above are captured by the same RealSense device. Furthermore, to show the effectiveness of the proposed method, we register and measure human 3D models based on the optimized depth maps. The experimental results show that our method can effectively restore human depth maps captured by RealSense.
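As an illustration of the optimization step described above, the sketch below fills zero-valued holes in a RealSense-style depth map and blends the result with a predicted human depth map at the hole pixels. This is not the paper's GradientFMM: OpenCV's Telea FMM inpainting is assumed as a stand-in, and the function name, the blending weight alpha, and the synthetic data are illustrative only.

```python
# Minimal sketch: fill holes (zero pixels) in a captured depth map using FMM
# inpainting, then blend with a predicted depth map at the hole pixels.
# This approximates guidance by a prediction; the paper's GradientFMM instead
# uses gradient information from the predicted human depth.
import cv2
import numpy as np

def fill_depth_holes(captured_depth: np.ndarray,
                     predicted_depth: np.ndarray,
                     alpha: float = 0.5) -> np.ndarray:
    """Fill zero-valued holes in `captured_depth` (uint16, millimetres) using
    FMM inpainting, blended with `predicted_depth` (same shape/units)."""
    hole_mask = (captured_depth == 0).astype(np.uint8)   # 1 where depth is missing

    # cv2.inpaint expects an 8-bit image, so normalise the valid depth range.
    valid = captured_depth[captured_depth > 0]
    d_min, d_max = int(valid.min()), int(valid.max())
    scale = max(d_max - d_min, 1)
    depth8 = np.clip((captured_depth.astype(np.float32) - d_min) / scale * 255,
                     0, 255).astype(np.uint8)

    # Telea FMM inpainting propagates surrounding valid depth into the holes.
    inpainted8 = cv2.inpaint(depth8, hole_mask, 3, cv2.INPAINT_TELEA)
    inpainted = inpainted8.astype(np.float32) / 255 * scale + d_min

    # At hole pixels, blend the FMM fill with the predicted depth (crude guidance).
    out = captured_depth.astype(np.float32)
    holes = hole_mask == 1
    out[holes] = alpha * inpainted[holes] + (1 - alpha) * predicted_depth[holes]
    return out.astype(np.uint16)

if __name__ == "__main__":
    # Tiny synthetic example: a 4x4 depth map with two holes and a flat prediction.
    captured = np.full((4, 4), 1500, dtype=np.uint16)
    captured[1, 1] = 0
    captured[2, 3] = 0
    predicted = np.full((4, 4), 1480, dtype=np.uint16)
    print(fill_depth_holes(captured, predicted))
```

The simple blend is only a placeholder for the guidance term; in practice one would also need the predicted human part-segmentation to restrict the optimization to human regions, as the abstract describes.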
Experimental results of our method on RealSense. For each group of 5 images, from left to right: input RGB image, depth image captured by RealSense, depth prediction r...
Published in: IEEE Access (Volume: 7)
Page(s): 112544 - 112553
Date of Publication: 12 August 2019
Electronic ISSN: 2169-3536
