User-state sensing for virtual health agents and telehealth applications.

MMVR (2013)

Abstract
Nonverbal behaviors play a crucial role in shaping outcomes in face-to-face clinical interactions. Experienced clinicians use nonverbals to foster rapport and "read" their clients to inform diagnoses. The rise of telemedicine and virtual health agents creates new opportunities, but it also strips away much of this nonverbal channel. Recent advances in low-cost computer vision and sensing technologies have the potential to address this challenge by learning to recognize nonverbal cues from large datasets of clinical interactions. These techniques can enhance both telemedicine and the emerging technology of virtual health agents. This article describes our current research in addressing these challenges in the domain of PTSD and depression screening for U.S. Veterans. We describe our general approach and report on our initial contribution: the creation of a large dataset of clinical interview data that facilitates the training of user-state sensing technology.