Abstract:
We investigate how real-time, 360° view synthesis can be achieved on current virtual reality hardware from a single panoramic image input. We introduce a lightweight method to automatically convert a single panoramic input into a multi-cylinder image representation that supports real-time, free-viewpoint view synthesis for virtual reality. We apply an existing convolutional neural network trained on pinhole images to a cylindrical panorama with wrap padding to ensure agreement between the left and right edges. The network outputs a stack of semi-transparent panoramas at varying depths, which can be rendered efficiently and composited with the standard "over" blending operator. Quantitative experiments and a user study show that the method produces convincing parallax and fewer artifacts than a textured mesh representation.
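To illustrate the two operations the abstract names, the sketch below shows circular (wrap) padding of a cylindrical panorama along its width before running a CNN, and back-to-front "over" compositing of a stack of semi-transparent layers. This is not the authors' code; the tensor layouts, padding width, and function names are illustrative assumptions.

```python
# Minimal sketch (assumed shapes and names, not the paper's implementation).
import torch
import torch.nn.functional as F

def wrap_pad(pano: torch.Tensor, pad: int) -> torch.Tensor:
    """Circularly pad a (N, C, H, W) panorama along the width dimension
    so the network sees matching content across the left/right seam."""
    return F.pad(pano, (pad, pad, 0, 0), mode="circular")

def composite_over(layers_rgba: torch.Tensor) -> torch.Tensor:
    """Composite a stack of semi-transparent layers, shape (D, 4, H, W),
    ordered far-to-near, with the standard 'over' operator."""
    out = torch.zeros_like(layers_rgba[0, :3])
    for rgba in layers_rgba:            # iterate far -> near
        rgb, alpha = rgba[:3], rgba[3:4]
        out = rgb * alpha + out * (1.0 - alpha)
    return out
```

In practice the padded panorama would be fed to the pretrained pinhole CNN, the padding cropped from its output, and the resulting depth-layered panoramas composited per view as above.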
Date of Conference: 17-21 October 2022
Date Added to IEEE Xplore: 27 December 2022
Print on Demand (PoD) ISSN: 1554-7868