Beyond multiple choice exams: using computerized lexical analysis to understand students' conceptual reasoning in STEM disciplines

FIE '09: Proceedings of the 39th IEEE Frontiers in Education Conference (2009)

Cited by 7 | Viewed 2
Abstract
Constructed response questions, in which students must use their own language to explain a phenomenon, create more meaningful opportunities for instructors to identify their students' learning obstacles than multiple choice questions do. However, the realities of typical large-enrollment undergraduate classes restrict the options faculty have for moving toward more learner-focused instruction. We are exploring the use of computerized lexical analysis of students' writing in large-enrollment undergraduate biology and geology courses. We have created libraries that categorize student responses with 90% accuracy, and these categories can be used to predict expert ratings of student responses with accuracy approaching the inter-rater reliability among expert raters. These techniques also provide insight into students' use of analogical thinking, a fundamental part of scientific modeling, and they have the potential to improve assessment practices across STEM disciplines.
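The abstract does not name the lexical analysis software or the contents of the term libraries the authors built, so the following is only a minimal Python sketch of the general technique: matching category term lists against a constructed response to produce category labels that could then feed a rating model. The category names, term lists, and the `categorize` helper are hypothetical illustrations, not the authors' actual libraries.

```python
import re

# Hypothetical category libraries: each category maps to terms whose
# presence in a student's response signals that category. Real
# libraries would be built and validated against expert-coded data.
CATEGORY_TERMS = {
    "energy": {"energy", "atp", "calorie", "calories"},
    "matter": {"mass", "atom", "atoms", "molecule", "molecules", "carbon"},
    "analogy": {"like", "similar to", "as if"},
}

def categorize(response: str) -> set[str]:
    """Return the set of categories whose terms appear in the response."""
    text = response.lower()
    words = set(re.findall(r"[a-z]+", text))
    matched = set()
    for category, terms in CATEGORY_TERMS.items():
        for term in terms:
            # Multi-word terms are matched as substrings; single words
            # are matched against the tokenized response.
            hit = term in text if " " in term else term in words
            if hit:
                matched.add(category)
                break
    return matched

# Example: a constructed response to a question such as "Where does
# the mass of a growing plant come from?"
print(categorize("The plant gains mass from carbon dioxide, and it "
                 "stores energy as if it were a battery."))
# -> {'energy', 'matter', 'analogy'}
```

In a workflow like the one the abstract describes, the resulting category indicators would serve as features for predicting expert ratings of each response, for instance via an ordinal regression fit to a sample of expert-scored answers.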
Keywords
expert raters, computerized lexical analysis, multiple choice exams, conceptual reasoning, STEM disciplines, conceptual barriers, analogical thinking, assessment practices, lexical analysis software, constructed response questions, large-enrollment undergraduate classes, expert ratings, assessment, student responses, biology, chemistry, cognition, inter-rater reliability, multiple choice questions, lexical analysis