Does Educator Training or Experience Affect the Quality of Multiple-Choice Questions?

Rationale and Objectives

Physicians receive little training in proper multiple-choice question (MCQ) writing methods. Well-constructed MCQs follow rules that ensure a question tests what it is intended to test; questions that break these rules are described as “flawed.” We examined whether the prevalence of flawed questions differed significantly between question writers with and without prior training in question writing, and between writers with different levels of educator experience.

Materials and Methods

We assessed 200 unedited MCQs from a question bank for our senior medical student radiology elective: 50 questions each were written by faculty with previous training in MCQ writing, other faculty, residents, and medical students. Two readers independently scored each question for the presence of 11 distinct flaws described in the literature.

Results

Questions written by faculty with MCQ writing training had significantly fewer errors: a mean of 0.4 errors per question compared to means of 1.5–1.7 errors per question for the other groups (P < .001). There were no significant differences in the total number of errors between the untrained faculty, residents, and students (P values .35–.91). Among trained faculty, 17 of 50 questions (34%) were flawed, whereas other faculty wrote 38 of 50 (76%) flawed questions, residents 37 of 50 (74%), and students 44 of 50 (88%). The trained question writers’ stronger performance was manifested chiefly in a reduced frequency of five specific errors.

Conclusions

Faculty with training in effective MCQ writing made fewer errors in MCQ construction. Educator experience alone had no effect on the frequency of flaws; faculty without dedicated training, residents, and students performed similarly.

Physicians, including those working in academic settings, are rarely trained to write multiple-choice examinations properly. However, this skill set has become much more relevant in recent years. With the transition to the new written format of radiology board certification examinations, the development of more rigorous self-assessment requirements for maintenance of certification examinations, and the greater inclusion of radiology in integrated medical student curricula, multiple-choice radiology questions are in great demand.

Well-constructed multiple-choice questions (MCQs) follow a set of parameters that ensure the question tests what it is intended to test. Questions that violate these widely agreed-upon rules are described in the education literature as flawed. In simple terms, a flawed question tends to test how good a test taker someone is rather than the relevant knowledge intended, which can disadvantage some students. Previous literature examining MCQs has revealed that such mistakes are common within continuing medical education (CME) materials and on health care sciences examinations.

Materials and Methods

Study Design

MCQ Flaws

Data Collection

Statistical Analysis

Results

Table 1

Prevalence of Multiple Choice Question Flaws for Different Educator Groups

| Group | Number of Questions | Number of Flaws Overall | Number of Flawed Questions | Mean Errors per Question |
|---|---|---|---|---|
| Trained Faculty | 50 | 19 | 17 (34%) | 0.4 |
| Faculty | 50 | 74 | 38 (76%) | 1.5 |
| Residents | 50 | 76 | 37 (74%) | 1.5 |
| Students | 50 | 85 | 44 (88%) | 1.7 |
| Totals | 200 | 254 | 136 (68%) | 1.3 |
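
As a quick arithmetic check (not part of the paper), the short Python sketch below recomputes Table 1’s derived columns from the raw counts; each printed value matches the published figure after rounding.

```python
# Illustrative check of Table 1 (not the authors' code).
# name: (questions written, total flaws, flawed questions)
groups = {
    "Trained Faculty": (50, 19, 17),
    "Faculty": (50, 74, 38),
    "Residents": (50, 76, 37),
    "Students": (50, 85, 44),
    "Totals": (200, 254, 136),
}

for name, (n_questions, n_flaws, n_flawed) in groups.items():
    mean_errors = n_flaws / n_questions        # e.g., 19/50 = 0.38, reported as 0.4
    pct_flawed = 100 * n_flawed / n_questions  # e.g., 17/50 = 34%
    print(f"{name}: {mean_errors:.1f} errors/question, {pct_flawed:.0f}% flawed")
```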

Table 2

Distribution of Each Multiple Choice Question Flaw Across Different Educator Groups (Number of Occurrences)

| Question Flaw | Core Faculty | Faculty | Residents | Students | Totals | Statistical Difference, P Value∗ |
|---|---|---|---|---|---|---|
| (1) Content not important | 0 | 4 | 7 | 11 | 22 | .004 |
| (2) Unfocused stem | 0 | 20 | 22 | 15 | 57 | .001 |
| (3) Negative phrasing | 1 | 15 | 16 | 8 | 40 | .001 |
| (4) Multiple answer options | 0 | 9 | 7 | 9 | 25 | .019 |
| (5) Too many or few answer options | 0 | 9 | 4 | 8 | 21 | .013 |
| (6) Superfluous information | 0 | 1 | 0 | 11 | 12 | .001 |
| (7) Unequal option length | 6 | 4 | 7 | 11 | 28 | .231 |
| (8) Absolute or vague terms | 0 | 3 | 5 | 3 | 11 | .181 |
| (9) Grammatical clues | 2 | 2 | 0 | 0 | 4 | .255 |
| (10) Logical clues | 0 | 1 | 1 | 3 | 5 | .275 |
| (11) Convergence | 10 | 6 | 7 | 6 | 29 | .631 |
| Totals | 19 | 74 | 76 | 85 | 254 | .0001 |
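
The Statistical Analysis section is not reproduced above, so the exact test behind these P values is not visible here. As an illustration only, the sketch below assumes each flaw occurs at most once per question and applies a chi-square test of independence to presence/absence counts across the four groups of 50 questions; for the rows checked, this yields P values in close agreement with the published ones.

```python
# Minimal sketch (not the authors' code) of the per-flaw group comparisons.
# Assumptions: each flaw occurs at most once per question; 50 questions per
# group; chi-square test of independence on a 2 x 4 presence/absence table.
from scipy.stats import chi2_contingency

GROUP_SIZE = 50  # questions per educator group

# Occurrences per group (Core Faculty, Faculty, Residents, Students), from Table 2.
flaw_counts = {
    "(1) Content not important": [0, 4, 7, 11],
    "(2) Unfocused stem": [0, 20, 22, 15],
    "(7) Unequal option length": [6, 4, 7, 11],
    "(11) Convergence": [10, 6, 7, 6],
}

for flaw, with_flaw in flaw_counts.items():
    without_flaw = [GROUP_SIZE - n for n in with_flaw]
    chi2, p, dof, _ = chi2_contingency([with_flaw, without_flaw])
    print(f"{flaw}: chi2 = {chi2:.2f}, df = {dof}, P = {p:.3f}")
```

With expected counts this small in some rows, Fisher’s exact test would arguably be more appropriate; the chi-square approximation is shown here only because it tracks the published values.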

Discussion

Conclusions

References

  • 1. Initial Certification: Diagnostic Radiology. Available at: http://www.theabr.org//ic-dr-landing . Accessed May 18, 2015.

  • 2. Madewell J.E., Hattery R.R., Thomas S.R., et al.: American Board of Radiology: maintenance of certification. Radiology 2005; 234: pp. 17-25.

  • 3. ABMS Evidence Library. Available at: http://www.abms.org/initiatives/committing-to-physician-quality-improvement/evidence-library/ . Accessed May 20, 2015.

  • 4. Maintenance of Certification. Available at: http://www.theabr.org/moc-gen-landing . Accessed May 20, 2015.

  • 5. Straus C.M., Webb E.M., Kondo K.L., et al.: Medical student radiology education: summary and recommendations from a national survey of medical school and radiology department leadership. J Am Coll Radiol 2014; 11: pp. 606-610. Epub Apr 6, 2014.

  • 6. McCoubrie P.: Improving the fairness of multiple-choice questions: a literature review. Med Teach 2004; 26: pp. 709-712.

  • 7. Haladyna T.M., Downing S.M., Rodriguez M.C.: A review of multiple-choice item-writing guidelines for classroom assessment. Appl Meas Educ 2002; 15: pp. 309-334.

  • 8. Collins J.: Education techniques for lifelong learning: writing multiple-choice questions for continuing medical education activities and self-assessment modules. Radiographics 2006; 26: pp. 543-551.

  • 9. Downing S.M.: Construct-irrelevant variance and flawed test questions: do multiple-choice item-writing principles make a difference?. Acad Med 2002; 77: pp. S103-S104.

  • 10. Downing S.M.: The effect of violating standard item writing principles on test and students: the consequences of using flawed test items on achievement examinations in medical education. Adv Health Sci Educ Theory Pract 2005; 10: pp. 133-143.

  • 11. Brunnquell A., Degirmenci U., Kreil S., et al.: Web-based application to eliminate five contraindicated multiple-choice question practices. Eval Health Prof 2011; 34: pp. 226-238.

  • 12. Case S.M.: The use of imprecise terms in examination questions: how frequent is frequently?. Acad Med 1994; 69: pp. S4-S6.

  • 13. Case S.M., Swanson D.B., Becker D.F.: Verbosity, window dressing, and red herrings: do they make a better test item?. Acad Med 1996; 71: pp. S28-S30.

  • 14. DiSantis D.J., Ayoob A.R., Williams L.E.: Journal club: prevalence of flawed multiple-choice questions in continuing medical education activities of major radiology journals. AJR Am J Roentgenol 2015; 204: pp. 698-702.

  • 15. Stagnaro-Green A.S., Downing S.M.: Use of flawed multiple-choice items by the New England Journal of Medicine for continuing medical education. Med Teach 2006; 28: pp. 566-568.

  • 16. Tarrant M., Knierim A., Hayes S.K., et al.: The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments. Nurse Educ Pract 2006; 6: pp. 354-363. Epub Nov 13, 2006.

  • 17. Abdulghani H.M., Ahmad F., Irshad M., et al.: Faculty development programs improve the quality of multiple choice questions items’ writing. Sci Rep 2015; 5: pp. 9556.

  • 18. Masters S.: Tips for constructing multiple choice exam questions [faculty development handout]. 2010. School of Medicine, University of California, San Francisco: San Francisco, CA.

  • 19. American Board of Radiology website: Item writers’ guide. 2009. Available at: www.aur.org/uploadedFiles/Alliances/AMSER/Educator_Resources/Student_Evaluation/ABR-Item-Writers-Guide.pdf . Accessed September 2, 2013.

  • 20. National Board of Medical Examiners: Technical item flaws. In: Item writing manual. 2002. National Board of Medical Examiners: Philadelphia, PA. Available at: www.nbme.org/PDF/ItemWriting_2003/2003IWGwhole.pdf . Accessed September 2, 2013.

  • 21. Bloom B.S.: Taxonomy of educational objectives: cognitive domain. 1984. Longman: New York.
