Open Access Research article

Comparison between Long-Menu and Open-Ended Questions in computerized medical assessments. A randomized controlled trial

Thomas Rotthoff1*, Thomas Baehring1, Hans-Dieter Dicken2, Urte Fahron3, Bernd Richter1, Martin R Fischer4 and Werner A Scherbaum1,2

Author Affiliations

1 Department of Endocrinology, Diabetes and Rheumatology, University Hospital, Moorenstr. 5, 40225 Duesseldorf, Germany

2 German Diabetes Centre at Duesseldorf University, Auf'm Hennekamp, 40225 Duesseldorf, Germany

3 Central Institute of Clinical Chemistry and Laboratory Medicine, University Hospital Duesseldorf, Moorenstr. 5, 40225 Duesseldorf, Germany

4 University of Munich Hospital, Medizinische Klinik Innenstadt, Schwerpunkt Medizindidaktik, Ziemssenstr. 1, 80336 Munich, Germany

BMC Medical Education 2006, 6:50  doi:10.1186/1472-6920-6-50

Published: 10 October 2006

Abstract

Background

Long-menu questions (LMQs) are regarded as an alternative to open-ended questions (OEQs) in computerized assessment. So far, this question type and its influence on examination scores have not been studied sufficiently, yet the growing use of computerized assessments will also increase the use of this question type.

Using a summative online key feature (KF) examination, we evaluated whether LMQs are comparable to OEQs with regard to difficulty, performance and response times. We also evaluated the suitability of the examination content for LMQs.

Methods

We randomized 146 fourth-year medical students into two groups. For this study we created seven peer-reviewed KF cases with a total of 25 questions. All questions had the same content in both groups, but nine questions differed in answer type: group A answered these nine questions in LM format, group B in OE format. In addition to the LM answer, group A could give an OE answer if the appropriate answer was not included in the list.

Results

The average number of correct answers for LMQs and OEQs showed no significant difference (p = 0.93). Among all 630 LM answers, only one correct term (0.32%) was not included in the list of answers. The response time for LMQs did not significantly differ from that of OEQs (p = 0.65).

Conclusion

LMQs and OEQs do not differ significantly in difficulty or performance. Compared with standard multiple-choice questions (MCQs), the response time for LMQs and OEQs is longer, probably because they require active problem-solving skills and more practice. LMQs correspond more closely to short-answer questions (SAQs) than to OEQs and should only be used when the answers can be phrased clearly, using only a few precise synonyms.

LMQs can decrease cueing effects and significantly simplify the scoring in computerized assessment.