Eindhoven University of Technology

Summary of the project

This project investigated the possibility of formulating a new generation of multiple-choice questions (MCQs) for the automatic assessment of all learning competencies in the Biomechanics course (8TB00). It addressed the shortcomings of traditional open-question exams, which require manual marking and are prone to marking bias.

The project developed two types of MCQs:

  1. MCQs for single-strategy exercises: These exercises involve a reasoning node (R), a calculus node (C), and an answer node (A), together referred to as an R-C-A block. The project designed MCQs to assess both lower-order competencies (remembering facts) and higher-order competencies (understanding, applying, analyzing) within these exercises.
  2. MCQs for sequential-strategy exercises: These exercises involve a chain of multiple R-C-A blocks. The project explored the hypothesis that such exercises can be reduced to a system of independent R-C-A blocks, allowing automatic assessment through a set of extended MCQs.
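The two exercise types above can be sketched as a small data model, where a sequential-strategy exercise is just a chain of independently scored blocks. This is an illustrative sketch only; the class and field names (`RCABlock`, `Exercise`) are hypothetical and not part of the project's materials:

```python
from dataclasses import dataclass, field


@dataclass
class RCABlock:
    """One reasoning (R) -> calculus (C) -> answer (A) unit, each node assessed by an MCQ."""
    reasoning_key: str  # correct option for the reasoning MCQ
    calculus_key: str   # correct option for the calculus MCQ
    answer_key: str     # correct option for the answer MCQ

    def score(self, reasoning: str, calculus: str, answer: str) -> int:
        """Count correct responses within this block (0..3)."""
        return sum([
            reasoning == self.reasoning_key,
            calculus == self.calculus_key,
            answer == self.answer_key,
        ])


@dataclass
class Exercise:
    """A single-strategy exercise holds one block; a sequential-strategy
    exercise is treated as a chain of independent R-C-A blocks."""
    blocks: list[RCABlock] = field(default_factory=list)

    def score(self, responses: list[tuple[str, str, str]]) -> int:
        """Sum block scores; responses is one (r, c, a) tuple per block."""
        return sum(b.score(*resp) for b, resp in zip(self.blocks, responses))


# Example: a two-block sequential exercise with fully correct responses
ex = Exercise([RCABlock("B", "C", "A"), RCABlock("D", "A", "C")])
print(ex.score([("B", "C", "A"), ("D", "A", "C")]))  # 6
```

Because each block is scored independently, the chain decomposes exactly as the hypothesis describes, which is what makes fully automatic assessment possible.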

Aim of the project

The primary aim of this project was to create a fully digital, automatically assessed exam for the Biomechanics course. This reduces the workload of educators, minimizes marking bias, and provides students with faster feedback. The project also investigated the potential of MCQs to assess higher-order thinking skills, challenging the traditional view that MCQs are suitable only for assessing lower-order competencies.

Results and learnings

This grant laid the foundation of a robust framework for the automatic digital assessment of the Biomechanics course (8TB00). Students' performance on the exam shows that it can perform as well as traditional manual assessment, with added benefits unique to digital formats. Crucially, this initiative helped start the conversation and change perceptions about automatic digital assessment for mechanics-based courses.

There are several avenues to further develop and innovate automated digital assessments, drawing inspiration from other fields, such as medicine:

  1. Refine the R-C-A blocks while creating more sophisticated distractor designs, to evaluate learning objectives and student competencies at even finer levels.
  2. Hybridise Extended Multiple-Choice Questions to include a mix of Simple Multiple-Choice subquestions and other formats (e.g., Jumbled Sentence Questions, or interactive questions that may require Artificial Intelligence).
  3. Introduce more complex weighting and penalty systems in grading, for instance assigning greater weight to correct answers at higher cognitive levels and varying penalties with the severity of errors.
  4. Generate automated feedback for each student, based both on their performance on each question and on the exam as a whole, automatically compiling tailored responses to each correct or incorrect answer (potentially via AI).
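A weighting-and-penalty scheme of the kind described could look like the following sketch. The cognitive-level names, weights, and penalty values are illustrative assumptions, not the project's actual grading rules:

```python
# Hypothetical weights per cognitive level (higher-order answers count more)
LEVEL_WEIGHTS = {"remember": 1.0, "understand": 1.5, "apply": 2.0, "analyze": 2.5}

# Hypothetical penalty fractions per distractor severity (harsher for severe errors)
SEVERITY_PENALTIES = {"minor": 0.25, "moderate": 0.5, "severe": 1.0}


def grade(responses: list[dict]) -> float:
    """Grade on a 0-10 scale. Each response dict has keys 'level',
    'correct' (bool) and, when incorrect, the 'severity' of the chosen
    distractor."""
    total = 0.0
    maximum = 0.0
    for r in responses:
        weight = LEVEL_WEIGHTS[r["level"]]
        maximum += weight
        if r["correct"]:
            total += weight
        else:
            # Deduct a fraction of the weight, scaled by error severity
            total -= weight * SEVERITY_PENALTIES[r["severity"]]
    # Clamp at zero and normalize to a 0-10 scale
    return max(total, 0.0) / maximum * 10


marks = grade([
    {"level": "remember", "correct": True},
    {"level": "apply", "correct": True},
    {"level": "analyze", "correct": False, "severity": "moderate"},
])
print(round(marks, 2))  # 3.18
```

The key design choice in such a scheme is that an "analyze"-level mistake with a severe distractor costs more than a "remember"-level slip, so the grade reflects which competencies broke down, not just how many answers were wrong.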


For more information, please contact:

Vito Conte
Assistant Professor
Gemini zuid 4.127
+31 40 247 5637

Tags

Completed
Online exams
Automated feedback/assessment