Depth filter design by jointly utilizing spatial-temporal depth and texture information


Abstract:

In depth-based 3D video systems, noisy pixels in the depth map often introduce serious geometric distortions in the synthesized virtual view. To remove these noisy pixels, a spatial-temporal depth filter is developed in this paper that jointly utilizes depth and texture information in the spatial-temporal domain. A pixel vector is introduced that jointly considers the texture and the corresponding depth value of a pixel, with a weight between them, and pixel similarity is measured by the distance between the corresponding pixel vectors. The filtering process is performed in three steps. First, reference pixels for a to-be-filtered pixel are selected in the spatial-temporal domain based on the similarity of their pixel vectors. Second, the most relevant pixels are identified among the reference pixels; different identification algorithms are applied in smooth and edge regions, based on a pixel-vector classification. Finally, a median filter over the identified pixels produces the result for the to-be-filtered pixel. Experimental results demonstrate that the proposed filter effectively removes noisy pixels and improves the temporal consistency of the depth map, so that good synthesis results based on the filtered depth can be achieved.
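
As a rough illustration of the three-step process described in the abstract, the following Python sketch filters a single depth pixel. The array names, the texture/depth weight, the window size, and the single distance threshold are assumptions made for illustration only; in particular, the paper identifies relevant pixels differently in smooth and edge regions, which the sketch collapses into one threshold.

```python
import numpy as np

def filter_depth_pixel(textures, depths, t, y, x,
                       radius=3, weight=0.5, dist_thresh=10.0):
    """Sketch of spatial-temporal filtering of one depth pixel.

    textures, depths : arrays of shape (frames, H, W) holding luma and
        depth values; (t, y, x) indexes the to-be-filtered pixel.
    weight      : hypothetical balance between texture and depth in the
        pixel vector (the paper defines its own weighting).
    dist_thresh : hypothetical similarity threshold standing in for the
        paper's region-dependent (smooth vs. edge) identification step.
    """
    frames, h, w = depths.shape
    # Pixel vector of the to-be-filtered pixel: texture plus weighted depth.
    center = np.array([textures[t, y, x], weight * depths[t, y, x]])

    relevant = []
    # Step 1: collect reference pixels in a spatial-temporal window
    # (spatial neighborhood in the current and adjacent frames).
    for f in range(max(0, t - 1), min(frames, t + 2)):
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                yy, xx = y + dy, x + dx
                if 0 <= yy < h and 0 <= xx < w:
                    cand = np.array([textures[f, yy, xx],
                                     weight * depths[f, yy, xx]])
                    # Step 2: keep only pixels whose vectors are close to
                    # the center vector (simplified relevance test).
                    if np.linalg.norm(cand - center) <= dist_thresh:
                        relevant.append(depths[f, yy, xx])

    # Step 3: the median of the identified depths replaces the noisy value.
    return np.median(relevant) if relevant else depths[t, y, x]
```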
Date of Conference: 17-19 June 2015
Date Added to IEEE Xplore: 06 August 2015
Electronic ISBN: 978-1-4799-5865-8

Conference Location: Ghent, Belgium