Iterative Approach to Reconstructing Neural Disparity Fields From Light-Field Data


Abstract:

This study proposes a neural disparity field (NDF), an implicit, continuous representation of scene disparity built on a neural field, together with an iterative approach to the inverse problem of reconstructing the NDF from light-field (LF) data. The NDF enables seamless and precise characterization of disparity variations in three-dimensional scenes and can be discretized at arbitrary resolution, overcoming the limitations of traditional disparity maps, which are prone to sampling errors and interpolation inaccuracies. The proposed NDF network architecture combines hash encoding with multilayer perceptrons (MLPs) to capture texture-level disparity detail, enhancing its ability to represent the geometric information of complex scenes. By leveraging the spatial-angular consistency inherent in LF data, a differentiable forward model that generates the central-view image from the LF data is developed. Based on this forward model, an optimization scheme for the inverse problem of NDF reconstruction is established using differentiable propagation operators. An iterative method is then adopted to solve this scheme and reconstruct the NDF; it requires no training datasets and applies to LF data captured by various acquisition methods. Experimental results demonstrate that the proposed method reconstructs high-quality NDFs from LF data and that high-resolution disparity can be effectively recovered from the NDF, confirming its capability for implicit, continuous representation of scene disparity.
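The abstract compresses three ingredients: a coordinate network that represents disparity continuously, a differentiable forward model that warps sub-aperture views to the central view via spatial-angular consistency, and a per-scene iterative optimization. As illustration only, the following is a minimal PyTorch sketch of that pipeline under stated assumptions: `NDF`, `warp_to_center`, and `reconstruct_ndf` are hypothetical names; the paper's multiresolution hash encoding is replaced here by a simple Fourier-feature encoding; and the L1 photometric loss and Adam optimizer are common stand-ins, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NDF(nn.Module):
    """Toy neural disparity field: maps continuous 2D coordinates in
    [-1, 1]^2 to scalar disparity. The paper pairs a multiresolution hash
    encoding with MLPs; a Fourier-feature encoding stands in for it here."""
    def __init__(self, n_freqs=8, hidden=64):
        super().__init__()
        self.register_buffer("freqs", 2.0 ** torch.arange(n_freqs) * torch.pi)
        self.mlp = nn.Sequential(
            nn.Linear(4 * n_freqs, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, xy):                        # xy: (N, 2)
        ang = xy[..., None] * self.freqs          # (N, 2, n_freqs)
        enc = torch.cat([ang.sin(), ang.cos()], dim=-1).flatten(-2)
        return self.mlp(enc).squeeze(-1)          # (N,) disparities

def warp_to_center(view, disp, du, dv):
    """Differentiable forward model (sketch): re-sample one sub-aperture
    view at coordinates shifted by disparity * angular offset, so that,
    under spatial-angular consistency, the result matches the central view.
    view: (1, C, H, W); disp: (H, W) in pixels; (du, dv): angular offset."""
    H, W = disp.shape
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, H),
                            torch.linspace(-1, 1, W), indexing="ij")
    gx = xs + disp * du * 2.0 / (W - 1)   # pixel shift -> normalized coords
    gy = ys + disp * dv * 2.0 / (H - 1)
    grid = torch.stack([gx, gy], dim=-1).unsqueeze(0)     # (1, H, W, 2)
    return F.grid_sample(view, grid, align_corners=True)

def reconstruct_ndf(views, offsets, center, iters=2000, lr=1e-3):
    """Per-scene iterative reconstruction: plain gradient descent on the
    photometric residual through the forward model; no training dataset.
    views: list of (1, C, H, W) sub-aperture images; offsets: their (du, dv)."""
    _, _, H, W = center.shape
    ndf = NDF()
    opt = torch.optim.Adam(ndf.parameters(), lr=lr)
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, H),
                            torch.linspace(-1, 1, W), indexing="ij")
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
    for _ in range(iters):
        disp = ndf(coords).reshape(H, W)          # query the field densely
        loss = sum(F.l1_loss(warp_to_center(v, disp, du, dv), center)
                   for v, (du, dv) in zip(views, offsets))
        opt.zero_grad()
        loss.backward()
        opt.step()
    return ndf
```

Because the field is continuous, the returned `ndf` can afterwards be queried on a denser coordinate grid than the input views; this is the sense in which the abstract's claim of discretizing disparity at arbitrary resolution operates.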
Published in: IEEE Transactions on Computational Imaging (Volume: 11)
Page(s): 410 - 420
Date of Publication: 29 January 2025
