OpenCap: Human movement dynamics from smartphone videos

bioRxiv (2023)

Abstract
Measures of human movement dynamics can predict outcomes like injury risk or musculoskeletal disease progression. However, these measures are rarely quantified in large-scale research studies or clinical practice due to the prohibitive cost, time, and expertise required. Here we present and validate OpenCap, an open-source platform for computing both the kinematics (i.e., motion) and dynamics (i.e., forces) of human movement using videos captured from two or more smartphones. OpenCap leverages pose estimation algorithms to identify body landmarks from videos; deep learning and biomechanical models to estimate three-dimensional kinematics; and physics-based simulations to estimate muscle activations and musculoskeletal dynamics. OpenCap's web application enables users to collect synchronous videos and visualize movement data that is automatically processed in the cloud, thereby eliminating the need for specialized hardware, software, and expertise. We show that OpenCap accurately predicts dynamic measures, like muscle activations, joint loads, and joint moments, which can be used to screen for disease risk, evaluate intervention efficacy, assess between-group movement differences, and inform rehabilitation decisions. Additionally, we demonstrate OpenCap's practical utility through a 100-subject field study, where a clinician using OpenCap estimated musculoskeletal dynamics 25 times faster than a laboratory-based approach at less than 1% of the cost. By democratizing access to human movement analysis, OpenCap can accelerate the incorporation of biomechanical metrics into large-scale research studies, clinical trials, and clinical practice.

Analyzing how humans move, how we coordinate our muscles, and what forces act on the musculoskeletal system is important for studying neuro-musculoskeletal conditions. Traditionally, measuring these quantities requires expensive laboratory equipment, a trained expert, and hours of analysis.
Thus, high-quality measures of human movement are rarely incorporated into clinical practice and large-scale research studies. The advent of computer vision methods for locating human joints from standard videos offers a promising alternative to laboratory-based movement analysis. However, it is unclear whether these methods provide sufficient information for informing biomedical research and clinical practice. Here, we introduce OpenCap, an open-source, web-based software tool for computing the motion (e.g., joint angles) and the musculoskeletal forces underlying human movement (e.g., joint forces) from smartphone videos. OpenCap combines advances in computer vision, machine learning, and musculoskeletal simulation to make movement analysis widely available without specialized hardware, software, or expertise. We validate OpenCap against laboratory-based measurements and show its usefulness for applications including screening for disease risk, evaluating intervention efficacy, and informing rehabilitation decisions. Finally, we highlight how OpenCap enables large-scale studies of human movement in real-world settings.
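The kinematics step described above rests on multi-view geometry: the same body landmark detected in two synchronized smartphone videos can be triangulated into a 3D position. A minimal sketch of two-view linear (DLT) triangulation is shown below; the calibration matrices and camera placement are hypothetical illustration values, not OpenCap's actual implementation.

```python
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D landmark from two views.

    P1, P2: 3x4 camera projection matrices.
    uv1, uv2: pixel coordinates (u, v) of the landmark in each view.
    Returns the estimated 3D point in world coordinates.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Solve A X = 0 via SVD; the solution is the right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical calibration: camera 1 at the origin, camera 2 shifted 1 m along x.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known landmark into both views, then recover it.
point = np.array([0.2, 0.1, 3.0])
h1 = P1 @ np.append(point, 1.0)
h2 = P2 @ np.append(point, 1.0)
uv1 = h1[:2] / h1[2]
uv2 = h2[:2] / h2[2]

recovered = triangulate_point(P1, P2, uv1, uv2)
print(np.allclose(recovered, point, atol=1e-6))  # True (noise-free case)
```

With noisy detections from real pose estimators, the same linear system is solved in a least-squares sense, and downstream biomechanical models further constrain the reconstruction; this sketch covers only the geometric core.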
Keywords
smartphone videos, human movement