
Effecting Change Based on What Program Directors Think

Within the discussion of the recent radiology program directors’ surveys [1], the statement was made that “recent data show that although citizen international medical graduates (IMGs) do not perform as well as noncitizen IMGs and US medical doctors (MDs), noncitizen IMGs have better patient outcomes than both citizen IMGs and US MDs.” The recent studies referenced included a review of United States Medical Licensing Examination Step 1 and Step 2 scores from 1995 to 2004, a 1981 study of IMGs taking the internal medicine certifying examination, a survey of internal medicine board examination scores of Caribbean medical school graduates from 1984 to 1987, a 2010 retrospective assessment of IMGs caring for patients with congestive heart failure or myocardial infarction, and a 2002 analysis of internal medicine in-training examination results. It is difficult to connect any of these manuscripts to modern performance in diagnostic radiology residency programs.

My own anecdotal experience is that IMGs, DOs, and postmatch applicants turned away from surgical specialties have been exemplary radiology trainees. Until there is evidence that these candidates perform differently as radiologists when compared to traditional allopathic physicians while controlling for residency program quality/performance, I struggle to appreciate the significance of this applicant trait.

An additional concern in reading the survey results is that 91% of respondents rated the prior oral exam as superior to the new Core examination in testing readiness for clinical practice. If the overseers of resident training come out so strongly in favor of the exam of the past, is it reasonable to suggest that returning to the oral exam structure may be a more suitable alternative than advocating for a nationally standardized core curriculum aimed at decoding multiple-choice items, as favored by 98% of survey respondents? The change was rooted in the sentiment that “the greater degree to which the new examinations emulate the clinical practice of radiology will more than balance any loss experienced by eliminating the oral examination” [2]. Perhaps my supposition that the ship has sailed was premature [3].

I am reminded of a soft drink company that decided to change its long-utilized recipe in 1985 only to revert to its classic formulation 3 months later. While I doubt that company officials waited until 91% of those consuming the product preferred the taste of the prior version before reverting, the question of appropriate process improvement comes to mind given the findings here. The American Board of Medical Specialties implores its member boards to require that certified physicians become versed and practiced in quality improvement methodologies, and the ABR embraces this mission by insisting that its diplomates possess a modicum of knowledge dedicated to noninterpretive skills that include many of these principles. One such exercise is the Plan-Do-Study-Act cycle, which we have witnessed the ABR put into action regarding Maintenance of Certification (MOC). A well-planned initiative to evaluate cognitive expertise was executed, but subsequent study compelled the board to act by modifying its MOC process this year. The new initial certification exams were likewise thoroughly planned and executed, but study of the initial rounds suggests some divergence from the intent of the change. It is unsettling that resident education could devolve into an arrangement whereby the infusion of medical knowledge takes place via a centralized cyber professor when the prior strategy of candidate-to-expert-panel, face-to-face assessment is viewed by radiology educators as a superior means to judge readiness for clinical practice.

A message of gratitude is owed to the Association of Program Directors in Radiology for a comprehensive survey and much food for thought regarding diagnostic radiology education. I hope that future iterations of the survey include an attempt to capture the pulse of members with respect to other hot-button issues such as shifts in resident overnight call autonomy, program correlates with board examination performance, and job market perspectives.

References

  • 1. Rozenshtein A., Heitkamp D.E., Muhammad T.H., et al.: “What program directors think” III: results of the 2014/2015 annual surveys of the Association of Program Directors in Radiology (APDR). Acad Radiol 2016; pp. 861-869.

  • 2. Alderson P.O., Becker G.J.: The new requirements and testing for American Board of Radiology certification in diagnostic radiology. Radiology 2008; pp. 707-709.

  • 3. Pfeifer C.M.: Changes to radiology: simpler is better. Acad Radiol 2015; 22: pp. 1326-1327.

This post is licensed under CC BY 4.0 by the author.