The American Board of Radiology’s (ABR) new Core Exam is not working, at least not as well as it needs to. Having helped prepare candidates (RG), studied for and taken the exam (WK), and talked with hundreds of candidates who have taken it (RG and WK), we believe that one aspect of the exam, its validity, can be significantly enhanced.
Just as we expect candidates for board certification and practicing radiologists to measure up to high standards, so we should subject the board exam to continuous scrutiny, seeking opportunities to rectify errors and enhance the exam’s overall quality, with a view to better promoting excellence in radiology practice and the care of patients. Here we focus on two exam parameters: validity and timing.
The Exam
Taken 36 months after the beginning of radiology residency, the Core Exam is administered over 2 days at either the Chicago or Tucson exam center. According to the board, it “tests knowledge and comprehension of anatomy, pathophysiology, all aspects of diagnostic radiology, and physics concepts important for diagnostic radiology.”
Eighteen categories are included: breast, cardiac, gastrointestinal, interventional, musculoskeletal, neuroradiology, nuclear, pediatric, reproductive or endocrinology, thoracic, urinary, vascular, computed tomography, magnetic resonance, radiography or fluoroscopy, ultrasound, physics, and safety. The exam is administered twice per year, in June and November.
Validity
Simply put, the validity of a test is the extent to which it accurately assesses what it is intended to assess. A valid test is one whose results actually mean something, whereas an invalid test is one whose results fail to tell us much—or positively mislead us—about whatever the test is designed to assess.
A famous example of a test with poor validity was the Scholastic Aptitude Test. After decades of research failed to support the hypothesis that the test actually provided a useful assessment of scholastic aptitude, in the 1990s the test’s owners finally changed its name simply to SAT. The test is not completely useless, but SAT scores predict less than 20% of college performance.
Face Validity
Construct Validity
Content Validity
Concurrent Validity
Discriminative Validity
Predictive Validity
Timing
Conclusion