Two-Dimensional Video-Based Analysis Of Human Gait Using Pose Estimation

PLOS Computational Biology (2021)

Abstract
Author summary: There is growing interest among clinicians and researchers in using novel pose estimation algorithms that automatically track human movement to analyze human gait. Gait analysis is routinely conducted in designated laboratories with specialized equipment. Pose estimation, by contrast, relies on digital videos that can be recorded with household devices such as a smartphone. These new techniques therefore make it possible to move beyond the laboratory and perform gait analysis in other settings such as the home or clinic. Before such techniques are adopted, we identify a critical need to compare outcome parameters against three-dimensional motion capture and to evaluate how camera viewpoint affects outcome parameters. We used simultaneous motion capture and left- and right-side video recordings of healthy human gait and calculated spatiotemporal gait parameters and lower-limb joint angles. We find that our workflow estimates spatiotemporal gait parameters, together with hip and knee angles, with the accuracy and precision needed to detect changes in the gait pattern. We demonstrate that the position of the participant relative to the camera affects spatial measures such as step length, and we discuss the limitations of the current approach.

Human gait analysis is often conducted in clinical and basic research, but many common approaches (e.g., three-dimensional motion capture, wearables) are expensive, immobile, data-limited, and require expertise. Recent advances in video-based pose estimation suggest potential for gait analysis using two-dimensional video collected from readily accessible devices (e.g., smartphones). To date, several studies have extracted features of human gait using markerless pose estimation. However, we currently lack an evaluation of video-based approaches against a dataset of human gait covering a wide range of gait parameters on a stride-by-stride basis, as well as a workflow for performing gait analysis from video.
Here, we compared spatiotemporal and sagittal kinematic gait parameters measured with OpenPose (open-source video-based human pose estimation) against simultaneously recorded three-dimensional motion capture from overground walking of healthy adults. When assessing all individual steps in the walking bouts, we observed mean absolute errors between motion capture and OpenPose of 0.02 s for temporal gait parameters (i.e., step time, stance time, swing time and double support time) and 0.049 m for step lengths. Accuracy improved when spatiotemporal gait parameters were calculated as individual participant mean values: mean absolute error was 0.01 s for temporal gait parameters and 0.018 m for step lengths. The greatest difference in gait speed between motion capture and OpenPose was less than 0.10 m s⁻¹. Mean absolute errors of sagittal-plane hip, knee and ankle angles between motion capture and OpenPose were 4.0, 5.6 and 7.4 degrees, respectively. Our analysis workflow is freely available, involves minimal user input, and does not require prior gait analysis expertise. Finally, we offer suggestions and considerations for future applications of pose estimation for human gait analysis.
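The abstract reports error two ways: pooled over every individual step, and after first averaging within each participant (which reduces the error, since random per-step deviations cancel). A minimal sketch of that two-level comparison, using hypothetical step-time data (the paper's actual dataset and variable names are not given in the abstract):

```python
import numpy as np

# Hypothetical per-step step times (s) for two participants, one array per
# measurement system; these values are illustrative only.
mocap_steps = {
    "P01": np.array([0.52, 0.55, 0.53, 0.54]),
    "P02": np.array([0.60, 0.58, 0.61, 0.59]),
}
openpose_steps = {
    "P01": np.array([0.50, 0.57, 0.52, 0.56]),
    "P02": np.array([0.62, 0.57, 0.60, 0.61]),
}

def stepwise_mae(a, b):
    """Mean absolute error over all individual steps, pooled across participants."""
    diffs = np.concatenate([np.abs(a[p] - b[p]) for p in a])
    return float(diffs.mean())

def participant_mean_mae(a, b):
    """MAE after first averaging each participant's steps; random per-step
    deviations partly cancel, so this is typically smaller."""
    return float(np.mean([abs(a[p].mean() - b[p].mean()) for p in a]))

print(stepwise_mae(mocap_steps, openpose_steps))         # pooled per-step error
print(participant_mean_mae(mocap_steps, openpose_steps)) # error of participant means
```

With these toy numbers the participant-mean error is smaller than the pooled per-step error, mirroring the pattern the abstract reports (0.02 s per step vs. 0.01 s per participant mean).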
Keywords
gait analysis,pose estimation,marker-less motion capture,kinematics,OpenPose