
Robust Saliency-Driven Quality Adaptation for Mobile 360-Degree Video Streaming


Abstract:

Mobile 360-degree video streaming has grown significantly in popularity, but its quality of experience (QoE) suffers from insufficient and variable wireless network bandwidth. Saliency-driven 360-degree streaming has recently overcome the buffer-size limitation of head movement trajectory (HMT)-driven solutions and thus strikes a better balance between video quality and rebuffering. However, inaccurate network estimation and intrinsic saliency bias still challenge saliency-based streaming approaches, limiting further QoE improvement. To address these challenges, we design RoSal360, a robust saliency-driven quality adaptation algorithm for 360-degree video streaming. Specifically, we present a practical, tile-size-aware deep neural network (DNN) model with a decoupled self-attention architecture to accurately and efficiently predict the transmission time of video tiles. Moreover, we design a reinforcement learning (RL)-driven online correction algorithm to robustly compensate for improper quality allocations caused by saliency bias. Through extensive prototype evaluations over real wireless networks, including commodity WiFi, 4G/LTE, and 5G links in the wild, RoSal360 significantly enhances video quality and reduces the rebuffering ratio, thereby improving viewer QoE relative to state-of-the-art algorithms.
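To make the predictor idea concrete, below is a minimal PyTorch sketch of a tile-size-aware transmission-time model with a decoupled self-attention architecture. This is not the paper's implementation: all module names, dimensions, and input encodings are illustrative assumptions, and "decoupled" is interpreted here as separate attention branches over the throughput history and the per-tile sizes that are fused only at the regression head.

# Hypothetical sketch of a tile-size-aware transmission-time predictor.
# Assumptions (not from the paper): 8 throughput samples of history,
# 24 tiles per segment, and a mean-pooled history context per tile.

import torch
import torch.nn as nn

class DecoupledAttnPredictor(nn.Module):
    def __init__(self, hist_len=8, n_tiles=24, d_model=32, n_heads=4):
        super().__init__()
        # Embed scalar throughput samples and scalar tile sizes.
        self.hist_embed = nn.Linear(1, d_model)
        self.tile_embed = nn.Linear(1, d_model)
        # Decoupled self-attention: each sequence attends only
        # within itself, never across the two modalities.
        self.hist_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.tile_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Fuse pooled history context with each tile feature and
        # regress a per-tile transmission time.
        self.head = nn.Sequential(
            nn.Linear(2 * d_model, d_model), nn.ReLU(),
            nn.Linear(d_model, 1),
        )

    def forward(self, thrpt_hist, tile_sizes):
        # thrpt_hist: (batch, hist_len) recent throughput samples (Mbps)
        # tile_sizes: (batch, n_tiles) requested tile sizes (MB)
        h = self.hist_embed(thrpt_hist.unsqueeze(-1))  # (B, H, d)
        t = self.tile_embed(tile_sizes.unsqueeze(-1))  # (B, T, d)
        h, _ = self.hist_attn(h, h, h)
        t, _ = self.tile_attn(t, t, t)
        ctx = h.mean(dim=1, keepdim=True).expand(-1, t.size(1), -1)
        # Predicted per-tile transmission time, in seconds.
        return self.head(torch.cat([t, ctx], dim=-1)).squeeze(-1)

model = DecoupledAttnPredictor()
pred = model(torch.rand(2, 8), torch.rand(2, 24))  # shape (2, 24)

Keeping the two attention branches separate lets the history branch run once per adaptation decision while the tile branch scales with tile count, which is one plausible reading of why decoupling helps efficiency; the paper's actual design may differ.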
Published in: IEEE Transactions on Mobile Computing (Volume: 23, Issue: 2, February 2024)
Page(s): 1312 - 1329
Date of Publication: 09 January 2023

