Automatic scoring of medical students' clinical notes to monitor learning in the workplace.

Medical Teacher (2014)

Abstract
Background: Educators need efficient and effective means to track students' clinical experiences in order to monitor their progress toward competency goals.

Aim: To validate an electronic scoring system that rates medical students' clinical notes for relevance to priority topics of the medical school curriculum.

Method: The Vanderbilt School of Medicine Core Clinical Curriculum enumerates 25 core clinical problems (CCPs) that graduating medical students must understand. Medical students upload clinical notes pertinent to each CCP to a web-based dashboard, but criteria for determining the relevance of a note, and consistent uploading practices by students, are lacking. The Vanderbilt Learning Portfolio (VLP) system automates both tasks by rating a note's relevance to each CCP and uploading the note to the student's electronic dashboard. We validated this electronic scoring system by comparing the relevance of 265 clinical notes written by third-year medical students to each of the 25 CCPs as scored by VLP versus an expert panel of raters.

Results: For 16 of the 25 clinical problems, we established a threshold score that yielded a 75% positive predictive value for relevance relative to expert opinion.

Discussion: Automated scoring of students' clinical notes provides a novel, efficient, and standardized means of tracking students' progress toward institutional competency goals.
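The abstract does not describe the underlying scoring algorithm, but the calibration step reported in the Results (choosing, per CCP, a score threshold that achieves 75% positive predictive value against expert ratings) can be illustrated. Below is a minimal Python sketch under stated assumptions: it presumes that for each CCP we have a list of automated relevance scores and matching binary expert-panel judgments per note. The function `select_threshold` and the toy data are hypothetical, not taken from the paper.

```python
from typing import Optional

def select_threshold(scores: list[float],
                     expert_relevant: list[bool],
                     target_ppv: float = 0.75) -> Optional[float]:
    """Return the lowest score threshold whose positive predictive value
    (precision) against expert ratings meets the target, or None if no
    threshold reaches it for this CCP."""
    for t in sorted(set(scores)):
        predicted = [s >= t for s in scores]  # notes flagged as relevant
        tp = sum(p and e for p, e in zip(predicted, expert_relevant))
        fp = sum(p and not e for p, e in zip(predicted, expert_relevant))
        if tp + fp == 0:
            continue  # no notes flagged at this threshold
        if tp / (tp + fp) >= target_ppv:
            return t
    return None

# Hypothetical usage for one CCP: VLP-style scores vs. expert labels.
scores = [0.91, 0.40, 0.75, 0.62, 0.88, 0.15]
expert = [True, False, True, False, True, False]
print(select_threshold(scores, expert))  # prints 0.62 in this toy example
```

Scanning thresholds from lowest to highest returns the most permissive cutoff that still meets the target precision, which matches the paper's framing of a per-problem threshold; for the 9 CCPs where no threshold achieved 75% PPV, a function like this would return None.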