Using Multimodal Input in Augmented Virtual Teleportation

2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW 2022)

Abstract
Augmented Reality (AR) and Virtual Reality (VR) can create compelling emotional collaborative experiences, but very few studies have explored the importance of sharing a user's live environment and physiological cues. In this PhD thesis, I am investigating how to use scene reconstruction and emotion recognition to enhance shared collaborative AR/VR experiences. I have developed a framework with two main components: 1) live scene capture for real-time environment reconstruction, and 2) sharing of multimodal input such as gaze, gesture, and physiological cues. The main novelty of the research is that it is one of the first systems for real-time sharing of both environment and emotion cues, and it provides significant insight into how to create, measure, and share remote collaborative experiences. The research will be useful in multiple application domains, such as remote assistance, tourism, training, and entertainment. It will also enable the creation of interfaces that automatically adapt to the user's emotional needs and environment and provide a better collaborative experience.
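The abstract gives no implementation details, so purely as an illustration, the sketch below shows one way the second component, a shared multimodal cue stream, might be represented: timestamped frames bundling gaze and physiological samples, serialized for transmission alongside the reconstructed scene. All names here (MultimodalFrame, GazeCue, PhysiologicalCue, encode_frame, and their fields) are hypothetical assumptions, not taken from the thesis.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class GazeCue:
    # Gaze direction as a unit vector in the sender's head coordinate frame.
    direction: tuple  # (x, y, z)

@dataclass
class PhysiologicalCue:
    # Example physiological signals; real sensors and units would vary.
    heart_rate_bpm: float
    skin_conductance_us: float  # electrodermal activity, microsiemens

@dataclass
class MultimodalFrame:
    # One timestamped bundle of cues from one collaborator.
    timestamp: float
    user_id: str
    gaze: GazeCue
    physiology: PhysiologicalCue

def encode_frame(frame: MultimodalFrame) -> bytes:
    """Serialize a cue frame as JSON for sending over the collaboration link."""
    return json.dumps(asdict(frame)).encode("utf-8")

if __name__ == "__main__":
    frame = MultimodalFrame(
        timestamp=time.time(),
        user_id="local-user",
        gaze=GazeCue(direction=(0.0, 0.0, 1.0)),
        physiology=PhysiologicalCue(heart_rate_bpm=72.0, skin_conductance_us=4.2),
    )
    print(encode_frame(frame))
```

One plausible rationale for a lightweight frame format like this is that cue data is tiny compared with scene geometry, so it can be streamed at sensor rate on a separate channel without competing with the environment-reconstruction bandwidth.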
Keywords
Human-centered computing, Human-computer interaction (HCI), Interaction paradigms, Virtual reality, Human-centered computing, Interaction design, Interaction design process and methods