Multiple-choice tests are actually making students worse at knowing what they don't know.
March 26, 2026
Original Paper
Assessment Format Matters: Evidence for Differences in Metacognitive Resolution Between Multiple-Choice and Open-Ended Exams
EdArXiv · v8n5p_v1
The Takeaway
Open-ended questions force students to confront gaps in their knowledge, while multiple-choice formats provide recognition cues that inflate confidence. The result is a metacognitive failure: students believe they understand the material much better than they actually do. Generative tasks, by contrast, provide more diagnostic feedback about what has genuinely been learned.
From the abstract
Assessment format may influence not only students’ performance but also how they monitor and evaluate their own learning. This study examined how multiple-choice and open-ended questions are associated with different components of metacognitive monitoring in a real university exam context. A sample of 150 undergraduate students completed an exam including both formats and provided self-assessments (SSA) and confidence judgments (JC) for each section. Results showed that students achieved higher