An Experience Of Automated Assessment In A Large-Scale Introduction Programming Course

Computer Applications in Engineering Education (2021)

Cited by 15 | Views: 1
Abstract
The 2020 pandemic imposed new demands on teaching practices to support students' distance learning. In this context, automated assessment (AA) is a pivotal resource that offers immediate, automatic feedback on students' programming tasks. Although the literature provides several contributions regarding the AA of Programming Exercises (PEs), very few works discuss the automatic generation of personalized PEs. This study reports our experience in applying a new proposal for AA-PE in an Introduction to Programming (IP) course for a large group of students. The proposal's key feature is the ability to apply AA-PE and parameterized unified exams across different programming languages by using the open-source tools MCTest and Moodle (with the virtual programming lab [VPL] plugin). During the first quarter of 2019, teachers of 19 of 44 IP-FF (face-to-face) classes adopted our approach as a component of their pedagogical intervention. These classes achieved a higher pass rate (67.5%) than those that did not adopt our AA solution (59.1%), while the standard deviations were nearly identical (22.5% and 21.3%, respectively). Additionally, preliminary results revealed a strong linear correlation (r = .93) between the pass rate and the average grade on the AA-PE. In IP-BL (blended learning), two classes used our method in the exams, with 171 students and a pass rate of 70.4%. These results corroborate previous findings that continuous assessment combined with immediate feedback can contribute to students' learning.
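The abstract's key feature is the parameterized generation of programming exercises, where each student receives a distinct variant that can still be graded automatically against an expected-output oracle. The sketch below is a minimal, hypothetical illustration of that idea; it is not the authors' MCTest/VPL implementation, and all function names and parameter ranges are invented for illustration.

```python
import random

def generate_exercise(seed):
    """Generate one parameterized variant of a simple exercise.

    Hypothetical illustration: each student (seed) receives different
    numeric parameters, so answers cannot be copied verbatim, while
    the bundled test cases enable automated grading.
    """
    rng = random.Random(seed)  # deterministic per student/seed
    a, b = rng.randint(2, 9), rng.randint(10, 99)
    statement = (f"Write a program that reads an integer n and "
                 f"prints {a} * n + {b}.")
    # (input, expected output) pairs used as the grading oracle
    test_cases = [(n, a * n + b) for n in (0, 1, 5)]
    return statement, test_cases

def grade(submission, test_cases):
    """Auto-grade: fraction of test cases the submitted function passes."""
    passed = sum(1 for n, expected in test_cases
                 if submission(n) == expected)
    return passed / len(test_cases)
```

Because the generator is seeded, the same student always sees the same variant, while different seeds yield different parameters; grading reduces to running the submission against the stored test cases and reporting the pass fraction as immediate feedback.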
Keywords
automated assessment, computer science education, CS1, parameterized assessment