PhysioHMD: a conformable, modular toolkit for collecting physiological data from head-mounted displays.

UbiComp '18: The 2018 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Singapore, October 2018.

Abstract
Virtual and augmented reality headsets are unique in that they have access to our facial area: an area that presents an excellent opportunity for always-available input and insight into the user's state. Their position on the face makes it possible to capture bio-signals as well as facial expressions. This paper introduces PhysioHMD, a modular software and hardware interface built for collecting affect and physiological data from users wearing a head-mounted display. The PhysioHMD platform is a flexible architecture that enables researchers and developers to aggregate and interpret signals in real time, and to use them to develop novel, personalized interactions and evaluate virtual experiences. It offers an interface that is not only easy to extend but is also complemented by a suite of tools for testing and analysis. We hope that PhysioHMD can become a universal, publicly available testbed for VR and AR researchers.
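To make the described architecture concrete, the sketch below shows one way a real-time aggregation layer for HMD-mounted physiological sensors might look. This is not the actual PhysioHMD API; all class names, channel names, and sensor placements are hypothetical, and hardware reads are simulated with random values.

```python
# Minimal sketch (hypothetical, not the PhysioHMD API): aggregating several
# physiological signal streams from an HMD face interface into a rolling,
# time-stamped buffer that downstream analysis or interaction code could poll.
import random
import time
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SignalSample:
    """One reading from a single sensor channel (e.g. EMG, EDA, PPG)."""
    channel: str
    timestamp: float
    value: float


@dataclass
class SignalAggregator:
    """Collects samples from multiple channels and keeps a rolling window."""
    window_seconds: float = 5.0
    buffer: List[SignalSample] = field(default_factory=list)

    def push(self, sample: SignalSample) -> None:
        self.buffer.append(sample)
        cutoff = sample.timestamp - self.window_seconds
        # Drop samples that have fallen out of the rolling window.
        self.buffer = [s for s in self.buffer if s.timestamp >= cutoff]

    def latest_by_channel(self) -> Dict[str, float]:
        """Return the most recent value seen on each channel."""
        latest: Dict[str, float] = {}
        for s in self.buffer:  # samples arrive in time order, so last wins
            latest[s.channel] = s.value
        return latest


def read_fake_sensors(channels: List[str]) -> List[SignalSample]:
    """Stand-in for hardware reads; a real sensor driver would replace this."""
    now = time.time()
    return [SignalSample(c, now, random.random()) for c in channels]


if __name__ == "__main__":
    agg = SignalAggregator(window_seconds=2.0)
    channels = ["emg_cheek", "eda_forehead", "ppg_temple"]  # hypothetical placements
    for _ in range(10):
        for sample in read_fake_sensors(channels):
            agg.push(sample)
        print(agg.latest_by_channel())
        time.sleep(0.1)
```

In a setup like this, interaction or evaluation code would poll the aggregator (or subscribe to it) rather than talk to individual sensors, which is one way to keep the hardware modules swappable as the paper's modular design suggests.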
Keywords
Affect recognition, virtual reality, augmented reality, physiological signals, BCI, behavioural measures