Meta360: Exploring User-Specific and Robust Viewport Prediction in 360-Degree Videos through Bi-Directional LSTM and Meta-Adaptation

2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)

Abstract
Viewport prediction is a critical component of virtual reality (VR) video streaming and directly impacts user experience in adaptive streaming. However, most existing algorithms treat users as homogeneous and overlook variations in user behavior and video content; they also often struggle with long-term prediction and intense head movement. Our work highlights the importance of accounting for these behavioral variations and of leveraging advanced techniques for robust viewport prediction in VR video streaming. We first address these limitations by conducting a comprehensive feature analysis of existing datasets to uncover distinctive user behaviors. Building on these findings, we propose a novel approach that combines Bidirectional Long Short-Term Memory (BiLSTM) networks with meta-learning. The BiLSTM architecture effectively captures long-term dependencies, strengthening the robustness of viewport prediction, especially under long prediction horizons and intense movement, while meta-learning enables personalized adaptation to individual users' viewing behaviors. Through extensive evaluations on diverse datasets, our algorithm, Meta360, demonstrates superior accuracy and robustness compared to state-of-the-art methods.
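To make the described pipeline concrete, the sketch below shows one plausible shape of a BiLSTM viewport predictor with a simple per-user adaptation loop (MAML-style fine-tuning on a user's recent traces). It is not the authors' released implementation; all module names, window sizes, and hyperparameters (e.g., a 90-frame history, a 30-frame horizon, yaw/pitch inputs) are illustrative assumptions.

```python
# Minimal sketch, assuming (yaw, pitch) head-orientation traces sampled per frame.
import copy
import torch
import torch.nn as nn

class BiLSTMViewportPredictor(nn.Module):
    def __init__(self, in_dim=2, hidden=128, horizon=30):
        super().__init__()
        # Bidirectional LSTM over the orientation history; horizon = future steps to predict.
        self.lstm = nn.LSTM(in_dim, hidden, num_layers=2,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, horizon * in_dim)
        self.horizon, self.in_dim = horizon, in_dim

    def forward(self, history):                # history: (B, T, in_dim)
        feats, _ = self.lstm(history)          # (B, T, 2*hidden)
        out = self.head(feats[:, -1])          # features at the last observed step
        return out.view(-1, self.horizon, self.in_dim)

def adapt_to_user(model, support_x, support_y, steps=5, lr=1e-3):
    """Clone the meta-trained model and fine-tune it on one user's recent traces."""
    user_model = copy.deepcopy(model)
    opt = torch.optim.SGD(user_model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(user_model(support_x), support_y)
        loss.backward()
        opt.step()
    return user_model

if __name__ == "__main__":
    # Toy usage: 90-frame orientation history -> 30-frame prediction for one user.
    model = BiLSTMViewportPredictor()
    hist = torch.randn(4, 90, 2)               # synthetic traces stand in for real data
    target = torch.randn(4, 30, 2)
    personalized = adapt_to_user(model, hist, target)
    print(personalized(hist).shape)            # torch.Size([4, 30, 2])
```

The design point illustrated here is the split between a shared meta-trained predictor and a cheap per-user fine-tuning step, which is how the abstract's "personalized adaptation to individual users' viewing behaviors" would typically be realized.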
Keywords
Human-centered computing,Human computer interaction (HCI),Interaction paradigms,Virtual reality,Computing methodologies,Artificial intelligence,Computer vision,Computer vision problems