
Do Residency Selection Factors Predict Radiology Resident Performance?

Rationale and Objectives

The purpose of our study was to determine which information in medical student residency applications predicts radiology residency success, as defined by objective clinical performance data.

Materials and Methods

We performed a retrospective cohort study of residents who entered our institution’s residency program through the National Resident Matching Program as postgraduate year 2 residents and completed the program within the past 2 years. The variables examined were medical school grades, selection to the Alpha Omega Alpha (AOA) Honor Society, United States Medical Licensing Examination (USMLE) scores, publication in peer-reviewed journals, and whether the applicant came from a peer institution. Clinical performance was determined by calculating each resident’s cumulative major discordance rate for on-call cases for which the resident provided a preliminary interpretation. A major discordance was defined as a difference between the preliminary resident interpretation and the final attending interpretation that could immediately impact patient care. Multivariate logistic regression was performed to identify significant predictors.
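The per-resident metric described here can be sketched in a few lines. The record layout (a resident identifier paired with a major-discordance flag) is a hypothetical assumption for illustration, not the study's actual data structure:

```python
# Minimal sketch of the cumulative major discordance rate described above.
# The (resident_id, is_major_discordance) record layout is a hypothetical
# assumption; the study's actual data pipeline is not described in detail.

def major_discordance_rate(reads):
    """Map each resident to (major discordant reads) / (total reads)."""
    totals, majors = {}, {}
    for resident, is_major in reads:
        totals[resident] = totals.get(resident, 0) + 1
        majors[resident] = majors.get(resident, 0) + int(is_major)
    return {r: majors[r] / totals[r] for r in totals}

reads = [("A", False), ("A", True), ("A", False), ("A", False), ("B", False)]
rates = major_discordance_rate(reads)
print(rates)  # {'A': 0.25, 'B': 0.0}
```

At the study's scale (67,145 on-call reads across 27 residents) the same aggregation applies unchanged.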

Results

Twenty-seven residents provided preliminary reports on call for 67,145 studies. The mean major discordance rate was 1.08% (range 0.34%–2.54%). Higher USMLE Step 1 scores, publication before residency, and election to the AOA Honor Society were all statistically significant predictors of lower major discordance rates (P = 0.01, P = 0.01, and P < 0.001, respectively).

Conclusions

Overall resident performance was excellent. Nevertheless, several factors help identify the better-performing residents: higher USMLE Step 1 scores, one to two publications during medical school, and election to AOA in the junior year of medical school.

Introduction

Radiology remains a competitive specialty and has been bolstered by the addition of interventional radiology as an independent residency. As such, there continues to be a large, competitive pool of applicants. Predicting which of these students will make the best radiology residents therefore remains one of the most daunting tasks for program directors and residency selection committees.

Applicants to radiology residency programs use the Electronic Residency Application Service (ERAS) to submit their applications and supporting documents to their selected programs. This information includes demographic data; objective data, including medical school transcripts, election to the Alpha Omega Alpha (AOA) Honor Society, United States Medical Licensing Examination (USMLE) Step 1 and 2 scores, and publications in peer-reviewed journals; and subjective data, including letters of recommendation, the dean’s letter, and the applicant’s personal statement. Residency program directors and selection committees review the information provided through ERAS to help identify which applicants they think will become the best radiology residents. The importance assigned to each piece of information varies between residencies. Grantham surveyed radiology program directors and found that an overwhelming majority considered medical school grades, class rank, and selection to AOA to be very important factors, whereas roughly half emphasized USMLE scores. Only a handful of studies have investigated whether these variables can predict future success in a radiology residency. In one such study, the authors were unable to demonstrate any statistically significant value of USMLE scores in predicting performance on the American Board of Radiology (ABR) written and oral examinations. Another group of investigators found that medical school grades in some preclinical and clinical courses and USMLE scores could predict success on the ABR examinations but did not predict performance during radiology residency rotations.


Materials and Methods


TABLE 1

Examples of Significant Discordances

Actual Significant Discordances Reported During Study Period

| Subspecialty | Preliminary Interpretation | Final Interpretation |
| --- | --- | --- |
| Neuroradiology | No acute intracranial injury | Acute right hemispheric subdural hematoma |
| Pediatrics | Appendix remains compressible and within upper limits of normal in caliber | Acute appendicitis |
| Thoracic | No central or segmental pulmonary embolism | Segmental left upper lobe pulmonary embolism |
| Abdominal imaging | No filling defect in collecting systems | Acute left ovarian vein thrombosis |
| Nuclear medicine | Radiotracer uptake within distal sigmoid and rectum suggestive of bleeding source | No abnormal tracer activity within GI tract to identify active GI bleed during image acquisition |
| Musculoskeletal | No acute fracture | Displaced acute comminuted intertrochanteric fracture with varus deformity of the left hip |

GI, gastrointestinal.


IRB Statement


Results


TABLE 2

Resident Descriptors

| Resident Characteristic | Value |
| --- | --- |
| Sex | 21 M / 6 F |
| Graduate of US medical school | 26/27 (96%) |
| MD degree | 27/27 (100%) |
| MD/PhD | 0/27 (0%) |
| Elected to Alpha Omega Alpha (AOA) Honor Society, junior year | 14 (52%) |
| Elected to AOA Honor Society, senior year | 5 (19%) |
| Medical school does not have AOA | 2 (7%) |
| Not elected to AOA | 6 (22%) |
| USMLE Step 1, mean (standard deviation) | 248.1 (10.1) |
| USMLE Step 2, mean (standard deviation) | 246.6 (19.3) |
| Recruited from a peer medical school | 9/27 (33%) |
| Published during medical school | 17/27 (63%) |

USMLE, United States Medical Licensing Examination.


TABLE 3

Resident Volume and Discordance Data

| Study Type | No. of Studies | Median per Resident (SD) | Range | No. of Discordances | Discordance Rate |
| --- | --- | --- | --- | --- | --- |
| Total | 67,145 | 2,341 (721) | 1,498–4,870 | 710 | 1.06% |
| CT | 36,682 | 1,303 (280) | 824–2,055 | 307 | 0.84% |
| Radiographs | 17,831 | 599 (424) | 276–2,352 | 274 | 1.54% |
| Ultrasound | 6,853 | 245 (60) | 169–471 | 37 | 0.54% |
| MRI | 4,941 | 173 (48) | 112–289 | 89 | 1.80% |
| Nuclear medicine | 476 | 17 (9) | 6–48 | 2 | 0.42% |

CT, computed tomography; MRI, magnetic resonance imaging.
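As a sanity check, the overall discordance rate in Table 3 is the pooled ratio of discordances to studies:

```python
# Pooled major discordance rate from the Table 3 totals.
discordances, studies = 710, 67_145
rate_pct = 100 * discordances / studies
print(f"{rate_pct:.2f}%")  # 1.06%
```

The 1.08% figure in the abstract is presumably the unweighted mean of the 27 per-resident rates, which need not equal this pooled, volume-weighted value.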


TABLE 4

Univariate Analysis of Major Discordance Rates

| Variable | Category | n | Major Discordance Rate (%) |
| --- | --- | --- | --- |
| Elected to AOA | Junior year | | 0.88 |
| | Senior year | | 1.18 |
| | N/A (school has no AOA) | | 1.33 |
| | Not elected | | 1.44 |
| USMLE Step 1 | <230 | 2 | 1.92 |
| | 230–249 | 10 | 1.17 |
| | >249 | 15 | 0.91 |
| USMLE Step 2 | <230 | 3 | 1.44 |
| | 230–249 | 10 | 1.10 |
| | >249 | 14 | 0.98 |
| Accepted from peer school | Yes | | 1.09 |
| | No | | 1.07 |
| Published during medical school | Yes | | 1.03 |
| | No | | 1.17 |
| Number of publications | 0 | 10 | 1.17 |
| | 1–2 | 10 | 0.92 |
| | ≥3 | 7 | 1.17 |
| Third-year surgery grade | Honors | 20 | 1.07 |
| | High pass | 5 | 0.92 |
| | Pass | 2 | 1.53 |
| Third-year medicine grade | Honors | 14 | 1.00 |
| | High pass | 9 | 1.24 |
| | Pass | 4 | 0.99 |
| Third-year OB/Gyn grade | Honors | 9 | 1.08 |
| | High pass | 12 | 1.02 |
| | Pass | 6 | 1.21 |

AOA, Alpha Omega Alpha; USMLE, United States Medical Licensing Examination.


TABLE 5

Pairwise Comparison of Positive Results From Multivariate Logistic Analysis of Major Discordance Rates

| Variable | Comparison | Odds Ratio | 95% Confidence Interval |
| --- | --- | --- | --- |
| USMLE Step 1 score | <230 vs. >249 | 1.97 | 1.28–3.04 |
| | 230–249 vs. >249 | 1.47 | 1.11–1.95 |
| Election to AOA | Not elected or not available vs. junior AOA | 1.38 | 1.00–1.89 |
| | Senior AOA vs. junior AOA | 1.53 | 1.07–2.18 |
| Publications | Zero publications vs. 1–2 publications | 1.85 | 1.25–2.73 |
| | ≥3 publications vs. 1–2 publications | 1.52 | 1.04–2.23 |
| Third-year medicine grade | High pass or pass vs. honors | 0.674 | 0.47–0.954 |

AOA, Alpha Omega Alpha; USMLE, U.S. Medical Licensing Examination.
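Odds ratios with 95% confidence intervals like those in Table 5 are conventionally obtained by exponentiating the logistic regression coefficients and their Wald limits. The sketch below shows that transformation with illustrative inputs; the coefficient and standard error are made-up values, not the study's fitted estimates:

```python
import math

def odds_ratio_with_ci(beta, se, z=1.96):
    """Exponentiate a logistic regression coefficient and its Wald
    limits to get an odds ratio with a 95% confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Illustrative inputs only (not the study's fitted values).
or_, lo, hi = odds_ratio_with_ci(beta=0.678, se=0.22)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 1.97 (95% CI 1.28-3.03)
```

An odds ratio above 1 with a confidence interval excluding 1 (as in most Table 5 rows) indicates significantly higher odds of a major discordance for the first group in the comparison.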


Discussion


References

  • 1. National Resident Matching Program: Results and Data: 2017 Main Residency Match. Washington, DC: National Resident Matching Program; 2017.

  • 2. Grantham J.R.: Radiology resident selection: results of a survey. Invest Radiol 1993; 8: pp. 99-101.

  • 3. Gunderman R.B., Jackson V.P.: Are NBME examination scores useful in selecting radiology residency candidates? Acad Radiol 2000; 7: pp. 603-606.

  • 4. Boyse T.D., Patterson S.K., Cohan R.H., et al.: Does medical school performance predict radiology resident performance? Acad Radiol 2002; 9: pp. 437-445.

  • 5. Branstetter B.F., Morgan M.B., Nesbit C.E., et al.: Preliminary reports in the emergency department: is a subspecialist radiologist more accurate than a radiology resident? Acad Radiol 2007; 14: pp. 201-206.

  • 6. Burish M.J., Fredericks C.A., Engstrom J.W., et al.: Predicting success: what medical student measures predict resident performance in neurology? Clin Neurol Neurosurg 2015; 135: pp. 69-72.

  • 7. Van Meter M., Williams M., Banuelos R., et al.: Does the National Resident Match Program rank list predict success in emergency medicine residency programs? J Emerg Med 2017; 52: pp. 77-82.

  • 8. Wagner J.G., Schneberk T., Zobrist M., et al.: What predicts performance? A multicenter study examining the association between resident performance, rank list position, and United States Medical Licensing Examination Step 1 scores. J Emerg Med 2017; 52: pp. 332-340.

  • 9. McCaskill Q.E., Kirk J.J., Barata D.M., et al.: USMLE Step 1 scores as a significant predictor of future board passage in pediatrics. Ambul Pediatr 2007; 7: pp. 192-195.

  • 10. Kanna B., Gu Y., Akhuetie J., et al.: Predicting performance using background characteristics of international medical graduates in an inner-city university-affiliated internal medicine residency training program. BMC Med Educ 2009; 9: pp. 42.

  • 11. Spurlock D.R., Holden C., Hartranft T.: Using United States Medical Licensing Examination (USMLE) examination results to predict later in-training examination performance among general surgery residents. J Surg Educ 2010; 67: pp. 452-456.

  • 12. Tolan A.M., Kaji A.H., Quach C., et al.: The Electronic Residency Application Service application can predict Accreditation Council for Graduate Medical Education competency-based surgical resident performance. J Surg Educ 2010; 67: pp. 444-448.

  • 13. Kron I.L., Kaiser D.L., Nolan S.P., et al.: Can success in the surgical residency be predicted from pre-residency evaluation? Ann Surg 1985; 202: pp. 694-695.

  • 14. Erlandson E.E., Calhoun J.G., Barrack F.M., et al.: Resident selection: applicant selection criteria compared with performance. Surgery 1982; 92: pp. 270-275.

  • 15. Daly K.A., Levine S.C., Adams G.L.: Predictors for resident success in otolaryngology. J Am Coll Surg 2006; 202: pp. 649-654.

  • 16. Barzansky B., Etzel S.I.: Medical schools in the United States, 2012–2013. JAMA 2013; 310: pp. 2319-2327.

  • 17. Brothers T.E., Wetherholt S.: Importance of the faculty interview during the resident application process. J Surg Educ 2007; 64: pp. 378-385.

  • 18. Adusumilli S., Cohan R.H., Korobikin M., et al.: Correlation between radiology resident rotation performance and examination scores. Acad Radiol 2000; 7: pp. 920-926.

  • 19. Dudek N.L., Marks M.B., Regehr G.: Failure to fail: the perspectives of clinical supervisors. Acad Med 2005; 80: pp. S84-S87.

  • 20. Collins J.: Evaluation of residents, faculty, and program. Acad Radiol 2003; 10: pp. S35-S43.

  • 21. Alderson P.O., Becker G.J.: The new requirements and testing for American Board of Radiology certification in diagnostic radiology. Radiology 2008; 248: pp. 707-709.

  • 22. Ruchman R.B., Jaeger J., Wiggins E.F., et al.: Preliminary radiology resident interpretations versus final attending radiologist interpretations and the impact on patient care in a community hospital. Am J Roentgenol 2007; 189: pp. 523-526.

  • 23. Cooper V.F., Goodhartz L.A., Nemcek A.A., et al.: Radiology resident interpretations of on-call imaging studies: the incidence of major discrepancies. Acad Radiol 2008; 15: pp. 1198-1204.

  • 24. Ruutiainen A.T., Scanlon M.H., Itri J.N.: Identifying benchmarks for discrepancy rates in preliminary interpretations provided by radiology trainees at an academic institution. J Am Coll Radiol 2011; 8: pp. 644-648.

  • 25. Sistrom C., Deitte L.: Factors affecting attending agreement with resident early readings of computed tomography and magnetic resonance imaging of the head, neck, and spine. Acad Radiol 2008; 15: pp. 934-941.

  • 26. Wildenberg J.C., Chen P., Scanlon M.H., et al.: Attending radiologist variability and its effect on radiology resident discordance rates. Acad Radiol 2017; 24: pp. 694-699.

This post is licensed under CC BY 4.0 by the author.