Designing a user-centric framework for perceptually efficient streaming of 360° edited videos

IQSP (2022)

Abstract
In the last few years, the popularity of immersive applications has increased significantly thanks to the introduction of powerful imaging and display devices. The most popular immersive media are 360-degree videos, which provide the sensation of immersion. Naturally, these videos require significantly more data, which is a challenge for streaming applications. In this work, our goal is to design a perceptually efficient streaming protocol based on edited versions of the original content. More specifically, we propose to use visual attention and semantic analysis to perform automatic perceptual editing of 360-degree videos and to design an efficient Adaptive Bit Rate (ABR) streaming scheme. The proposed scheme takes advantage of the fact that movies are made of a sequence of different shots, separated by cuts. Cuts can be used to attract viewers' attention to important events and objects. In this paper, we report the first stage of this scheme: the content analysis used to select temporal and spatial candidate cuts. For this, we manually selected candidate cuts from a set of 360-degree videos and analyzed the users' quality of experience (QoE). Then, we computed their salient areas and analyzed whether these areas are good candidates for video cuts.
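As a rough illustration of the content analysis described in the abstract, the sketch below is not the authors' implementation but one plausible way to obtain temporal cut candidates from abrupt color-histogram changes and spatial cut candidates from the most salient point of an equirectangular frame. OpenCV's contrib saliency module, the histogram-correlation threshold, and the equirectangular-to-angle mapping are assumptions made for this example.

```python
# Minimal sketch (illustrative only): temporal and spatial cut-candidate analysis.
# Assumes opencv-contrib-python (for cv2.saliency) and NumPy; thresholds are
# placeholders, not values from the paper.
import cv2
import numpy as np

def temporal_cut_candidates(path, hist_corr_threshold=0.6):
    """Flag frame indices where the colour histogram changes sharply (possible shot boundaries)."""
    cap = cv2.VideoCapture(path)
    prev_hist, candidates, idx = None, [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hist = cv2.calcHist([frame], [0, 1, 2], None, [8, 8, 8],
                            [0, 256, 0, 256, 0, 256])
        hist = cv2.normalize(hist, hist).flatten()
        if prev_hist is not None:
            corr = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CORREL)
            if corr < hist_corr_threshold:  # low correlation -> candidate temporal cut
                candidates.append(idx)
        prev_hist, idx = hist, idx + 1
    cap.release()
    return candidates

def spatial_cut_candidate(equirect_frame):
    """Return (yaw, pitch) in degrees of the most salient point of an equirectangular frame."""
    sal = cv2.saliency.StaticSaliencySpectralResidual_create()
    ok, sal_map = sal.computeSaliency(equirect_frame)
    if not ok:
        return None
    h, w = sal_map.shape
    y, x = np.unravel_index(np.argmax(sal_map), sal_map.shape)
    yaw = (x / w) * 360.0 - 180.0    # column -> longitude in [-180, 180]
    pitch = 90.0 - (y / h) * 180.0   # row -> latitude in [-90, 90]
    return yaw, pitch
```

In such a pipeline, `temporal_cut_candidates("video.mp4")` would return frame indices to review as candidate cut points, while the per-frame salient direction could seed a viewport-centered spatial cut; the paper's actual candidate selection relied on manual annotation and QoE analysis rather than this heuristic.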
Keywords
framework, user-centric, perceptually-efficient