Robust Saliency-Driven Quality Adaptation for Mobile 360-Degree Video Streaming

IEEE Transactions on Mobile Computing (2024)

Abstract
Mobile 360-degree video streaming has grown significantly in popularity, but its quality of experience (QoE) suffers from insufficient and variable wireless network bandwidth. Recently, saliency-driven 360-degree streaming has overcome the buffer-size limitation of head-movement-trajectory (HMT)-driven solutions and thus strikes a better balance between video quality and rebuffering. However, inaccurate network estimation and intrinsic saliency bias still challenge saliency-based streaming approaches, limiting further QoE improvement. To address these challenges, we design RoSal360, a robust saliency-driven quality adaptation algorithm for 360-degree video streaming. Specifically, we present a practical, tile-size-aware deep neural network (DNN) model with a decoupled self-attention architecture that accurately and efficiently predicts the transmission time of video tiles. Moreover, we design a reinforcement learning (RL)-driven online correction algorithm to robustly compensate for improper quality allocations caused by saliency bias. Through extensive prototype evaluations over real wireless networks, including commodity WiFi, 4G/LTE, and 5G links in the wild, RoSal360 significantly enhances video quality and reduces the rebuffering ratio, thereby improving viewer QoE compared to state-of-the-art algorithms.
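The abstract mentions a "decoupled self-attention architecture" for tile transmission-time prediction. A minimal sketch of the decoupling idea — attending over tiles within a segment (spatial) separately from attending over segments per tile (temporal) — is shown below. This is an illustrative toy in NumPy, not the paper's actual model: the feature layout, the absence of learned projection weights, and all names are assumptions for illustration only.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # single-head scaled dot-product self-attention over the
    # second-to-last axis (learned Q/K/V projections omitted)
    d = x.shape[-1]
    scores = x @ x.swapaxes(-1, -2) / np.sqrt(d)
    return softmax(scores) @ x

def decoupled_attention(features):
    # features: (segments, tiles, dim) — hypothetical per-tile inputs
    # such as encoded tile size and recent throughput samples.
    # Spatial pass: attend across tiles within each segment.
    spatial = self_attention(features)
    # Temporal pass: attend across segments for each tile.
    temporal = self_attention(spatial.swapaxes(0, 1)).swapaxes(0, 1)
    return temporal

rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 6, 8))  # 4 segments, 6 tiles, 8-dim features
out = decoupled_attention(feats)
print(out.shape)  # (4, 6, 8)
```

Decoupling the two attention passes keeps the cost at O(tiles² + segments²) per feature instead of O((segments·tiles)²) for full joint attention, which is the usual motivation for such factorized designs.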
Keywords
Quality adaptation, saliency, network estimation, 360-degree video streaming