
Deep Unrolled Graph Laplacian Regularization for Robust Time-of-Flight Depth Denoising


Abstract:

Depth images captured by Time-of-Flight (ToF) sensors are subject to severe noise. Recent approaches based on deep neural networks achieve good depth denoising performance on synthetic data, but their application to real-world data is limited, due to the complexity of actual depth noise characteristics and the difficulty of acquiring ground truth. In this paper, we propose a novel ToF depth denoising network based on unrolled graph Laplacian regularization to "robustify" the network against both noise complexity and dataset deficiency. Unlike previous schemes that are ignorant of the underlying ToF imaging mechanism, we formulate a fidelity term in the optimization problem that adapts to the probabilistic distribution of depth with spatially varying noise variance. We then add quadratic graph Laplacian regularization as the smoothness prior, leading to a maximum a posteriori problem that is optimized efficiently by solving a linear system of equations. We unroll the solution into iterative filters so that the parameters used in the optimization and graph construction are amenable to data-driven tuning. Because the resulting network is built on domain knowledge of the ToF imaging principle and the graph prior, it is robust against overfitting to synthetic training data. Experimental results demonstrate that the proposed method outperforms existing schemes in ToF depth denoising on the synthetic FLAT dataset and generalizes well to the real Kinectv2 dataset.
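To make the optimization step concrete, the sketch below illustrates one graph-Laplacian-regularized denoising pass of the general form described in the abstract: minimize (x − y)ᵀW(x − y) + μ xᵀLx, where W is a diagonal fidelity weighting reflecting spatially varying noise variance and L is a graph Laplacian, giving the closed-form linear system (W + μL)x = Wy. This is a minimal fixed-parameter illustration only; the paper's network unrolls this solution and learns the graph weights and parameters from data, and the function names (`glr_denoise`, `grid_laplacian`), the 4-connected grid graph, and the choice W = diag(1/σ²) are assumptions made here for demonstration.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve


def grid_laplacian(h, w):
    """Combinatorial graph Laplacian of a 4-connected h x w pixel grid
    with unit edge weights (illustrative choice, not the learned graph)."""
    n = h * w
    rows, cols, vals = [], [], []
    for y in range(h):
        for x in range(w):
            i = y * w + x
            for dy, dx in ((0, 1), (1, 0)):   # right and down neighbors
                yy, xx = y + dy, x + dx
                if yy < h and xx < w:
                    j = yy * w + xx
                    rows += [i, j]; cols += [j, i]; vals += [-1.0, -1.0]
    A = sp.coo_matrix((vals, (rows, cols)), shape=(n, n)).tocsr()
    D = diags(np.asarray(-A.sum(axis=1)).ravel())  # degree matrix
    return D + A                                   # L = D - W_adj


def glr_denoise(depth, sigma, mu=1.0):
    """One graph Laplacian regularization pass:
    solve (W + mu * L) x = W y with W = diag(1 / sigma^2),
    i.e., a fidelity term weighted by spatially varying noise variance."""
    h, w = depth.shape
    y = depth.ravel()
    W = diags(1.0 / (sigma.ravel() ** 2))
    L = grid_laplacian(h, w)
    x = spsolve((W + mu * L).tocsc(), W @ y)
    return x.reshape(h, w)


# Toy usage: a depth ramp with noise variance that grows with depth,
# loosely mimicking ToF behavior (hypothetical test data).
h, w = 32, 32
clean = np.tile(np.linspace(1.0, 3.0, w), (h, 1))
sigma = 0.01 * clean
noisy = clean + sigma * np.random.randn(h, w)
denoised = glr_denoise(noisy, sigma, mu=5.0)
```

In the unrolled network of the paper, the quantities fixed by hand above (the edge weights defining L, the fidelity weights, and μ) become learnable, and the linear solve is approximated by iterative filtering layers that can be trained end to end.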
Published in: IEEE Signal Processing Letters ( Volume: 32)
Page(s): 821 - 825
Date of Publication: 07 February 2025

