Musculoskeletal Ultrasound Training for Radiology Residents

Rationale and Objectives

A prospective randomized study was conducted to assess whether an electronic learning module is as effective as a didactic lecture for teaching musculoskeletal ultrasound to radiology residents.

Materials and Methods

Thirty-three residents were randomized into a module group versus a didactic group. All residents took a written “pretest” to assess baseline knowledge. Subsequently, the 17 residents in the didactic group attended a live didactic session delivered by a subspecialist musculoskeletal radiology faculty member. The 16 residents in the module group completed an electronic learning module containing content similar to that of the live didactic session. Finally, all residents completed a written “posttest,” which served as the outcome measure.

Results

Mean score improved significantly between pre- and posttest by 10.6 ± 11.2% in the didactic group (DG; P = 0.002) and 14.0 ± 8.2% in the module group (MG; P < 0.001), with a nonsignificant difference between groups (P = 0.4). Mean pretest scores (75.6 ± 9.4% DG and 73.7 ± 9.2% MG, P = 0.6) and posttest scores (86.2 ± 9.7% DG and 87.7 ± 5.2% MG, P = 0.5) were not significantly different. The adjusted mean difference in posttest scores between groups was −1.9% (95% confidence interval: −7.2 to 3.5%).
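As a rough check, the unadjusted between-group posttest comparison can be recomputed from the summary statistics reported above. The sketch below (Python with SciPy; an assumption of this sketch, not the authors' stated method) applies Welch's two-sample t-test to the reported means, standard deviations, and n = 16 per group. Note that it will not exactly reproduce the paper's −1.9% figure, which was adjusted for pretest score.

```python
# Unadjusted recomputation from summary statistics (mean +/- SD, n = 16 per
# group). The paper's adjusted difference (-1.9%) accounts for pretest score,
# so this simple two-sample comparison only approximates the reported values.
import math
from scipy import stats

m1, s1, n1 = 86.2, 9.7, 16   # didactic group posttest score, %
m2, s2, n2 = 87.7, 5.2, 16   # module group posttest score, %

# Welch's t-test directly from summary statistics
t, p = stats.ttest_ind_from_stats(m1, s1, n1, m2, s2, n2, equal_var=False)

# 95% CI for the unadjusted difference in means (Welch-Satterthwaite df)
se = math.sqrt(s1**2 / n1 + s2**2 / n2)
df = se**4 / ((s1**2 / n1)**2 / (n1 - 1) + (s2**2 / n2)**2 / (n2 - 1))
lo = (m1 - m2) + stats.t.ppf(0.025, df) * se
hi = (m1 - m2) + stats.t.ppf(0.975, df) * se

print(f"diff = {m1 - m2:.1f}%, 95% CI ({lo:.1f}, {hi:.1f}), p = {p:.2f}")
```

The unadjusted difference (−1.5%) and its wide confidence interval are consistent with the study's conclusion that any advantage of either training method is small relative to the between-resident variability.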

Conclusion

If didactic training was better than electronic module training, the difference was relatively small (<5%). A similar web-based, interactive module could be employed to teach American Board of Radiology Core Examination and Accreditation Council for Graduate Medical Education (ACGME) Diagnostic Radiology Milestone musculoskeletal ultrasound learning objectives to radiology residents. An electronic module could reduce the faculty time invested in musculoskeletal ultrasound training and be more widely available to residents.

Introduction

Ultrasound is a dynamic, safe, relatively low-cost, portable, and rapidly growing cross-sectional imaging modality, comprising nearly 25% of all imaging studies performed globally. In the US Medicare/Medicaid patient population, musculoskeletal (MSK) ultrasound volume increased by 316% from 2000 to 2009, with approximately 234,000 reimbursed MSK ultrasound procedures in 2009, 38.9% of which were performed by radiologists. Accordingly, the new American Board of Radiology (ABR) Diagnostic Radiology Core Examination covers multiple MSK ultrasound topics. As detailed in the ABR Core Examination study guide, these topics include normal anatomy, pathology, and American College of Radiology (ACR) Appropriateness Criteria for effective clinical use of ultrasound and its integration with complementary cross-sectional imaging studies. In addition, the 2012 ACGME and ABR Diagnostic Radiology Milestones Project includes “recognition of suboptimal images” and “application of physical principles to optimize image quality” within the expected resident competencies. To address these requirements for radiology residents, generalized curricular content guidelines were recently published in the Society of Radiologists in Ultrasound Resident Curriculum based on expert opinion; but to date, no prospective studies have evaluated specific educational modalities for teaching MSK ultrasound to radiology residents, and training methods remain nonstandardized at the national level.

Point-of-care ultrasound is rapidly finding diagnostic and therapeutic niches within numerous medical specialties, and research on emerging curricula is growing in kind. A review of recent literature on the development of ultrasound curricula for physicians in training reveals studies spanning multiple disciplines just within the past 2 years: emergency medicine, rheumatology, sports medicine, general surgery, physical medicine and rehabilitation, neurosurgery, and anesthesia. Many of these curricula contain both didactic and simulation-based, “hands-on” teaching sessions, allowing tailored education that focuses on the trainee within a realistic and safe environment. Even so, data to support specific training methods for ultrasound remain limited and application dependent.

Subjects and Methods

Figure 1, Screenshot from the electronic musculoskeletal ultrasound module. An interactive and dynamic toolbar was integrated into the module (screen upper left), as were clickable multiple-choice questions targeted at the image-based anatomic, pathologic, and technical learning objectives (screen bottom center).

Figure 2, Study design flowchart. Thirty-three radiology residents were recruited and subsequently randomized by postgraduate-year (PGY) level into a module training or a didactic training group, with 2 weeks between the enrollment and the training sessions. In total, 16 participants in each group were included in data analysis. Please see the Subjects and Methods section for further details. *Six individuals of corresponding PGY level were randomly swapped between the two groups because of scheduling conflicts that emerged following initial randomization. †One individual arrived 30 minutes late to the didactic session and was excluded from data analysis.

Results

Table 1

Baseline Experience as Reported on Questionnaire #1

| Variable | Didactic (N = 16) | Module (N = 16) | P Value* |
|---|---|---|---|
| **Postgraduate year (PGY)** | | | 0.86 |
| PGY2 | 5 (31.2) | 5 (31.2) | |
| PGY3 | 4 (25.0) | 5 (31.2) | |
| PGY4 | 3 (18.8) | 4 (25.0) | |
| PGY5 | 4 (25.0) | 2 (12.5) | |
| **Experience level with musculoskeletal ultrasound†** | | | 0.17 |
| Some | 11 (68.8) | 15 (93.8) | |
| None | 5 (31.2) | 1 (6.2) | |
| **Time since last experience with musculoskeletal ultrasound** | | | 0.46 |
| Within 1 year | 9 (56.2) | 11 (68.8) | |
| >1 year | 0 (0.0) | 1 (6.2) | |
| Never | 7 (43.8) | 4 (25.0) | |

Data are n (%).

Table 2

Comparison of Test Scores Between Groups

| Variable | Lecture (N = 16) | Presentation (N = 16) | Mean Difference (95% CI)* | P Value |
|---|---|---|---|---|
| **Pretest score** | | | | |
| Total, % (out of 29) | 75.6 ± 9.4 | 73.7 ± 9.2 | 1.4 (−4.6, 7.4) | 0.64 |
| Anatomy questions, % (out of 7) | 81.2 ± 16.3 | 79.5 ± 22.7 | 0.9 (−12.6, 14.4) | 0.90 |
| Tech/IO/Physics questions, % (out of 11) | 71.6 ± 14.0 | 70.5 ± 13.5 | 0.6 (−9.0, 10.2) | 0.90 |
| Pathology questions, % (out of 11) | 76.1 ± 13.6 | 73.3 ± 10.2 | 2.5 (−6.2, 11.2) | 0.55 |
| **Posttest score** | | | | |
| Total, % (out of 29) | 86.2 ± 9.7 | 87.7 ± 5.2 | −1.9 (−7.2, 3.5) | 0.48 |
| Anatomy questions, % (out of 7) | 97.3 ± 5.8 | 97.3 ± 5.8 | −0.2 (−4.3, 4.0) | 0.93 |
| Tech/IO/Physics questions, % (out of 11) | 85.8 ± 12.0 | 84.1 ± 10.2 | 1.3 (−6.5, 9.1) | 0.74 |
| Pathology questions, % (out of 11) | 79.5 ± 13.5 | 85.2 ± 9.3 | −6.1 (−14.3, 2.1) | 0.14 |
| **Score difference (post − pre)** | | | | |
| Total, % (out of 29) | 10.6 ± 11.2 | 14.0 ± 8.2 | −3.3 (−10.4, 3.9) | 0.36 |
| Anatomy questions, % (out of 7) | 16.1 ± 16.4 | 17.9 ± 24.7 | −1.0 (−15.9, 13.8) | 0.89 |
| Tech/IO/Physics questions, % (out of 11) | 14.2 ± 16.6 | 13.6 ± 15.2 | 0.7 (−11.0, 12.4) | 0.90 |
| Pathology questions, % (out of 11) | 3.4 ± 14.4 | 11.9 ± 16.5 | −8.6 (−20.0, 2.8) | 0.13 |

Scores are mean ± standard deviation.

IO, image optimization; Tech, technique.

Figure 3, Comparison of mean pretest and posttest scores. *Mean score significantly improved between the pretest and the posttest by 10.6 ± 11.2% in the didactic group (P = 0.002) and by 14.0 ± 8.2% in the module group (P < 0.001). The mean pretest scores were not significantly different between groups (75.6 ± 9.4% vs. 73.7 ± 9.2%, P = 0.64). The mean posttest scores were also not significantly different between groups (86.2 ± 9.7% vs. 87.7 ± 5.2%, P = 0.48). Error bars represent one standard deviation from the mean.

Discussion

Conclusion

Appendix

Outline of Electronic Learning Module and Didactic Lecture Content

References

  • 1. Goldberg B.B.: International arena of ultrasound education. J Ultrasound Med 2003; 22: pp. 549-551.

  • 2. Sharpe R.E., Nazarian L.N., Parker L., et al.: Dramatically increased musculoskeletal ultrasound utilization from 2000 to 2009, especially by podiatrists in private offices. J Am Coll Radiol 2012; 9: pp. 141-146.

  • 3. ABR CORE examination study guide. Subsections “Ultrasound” and “Musculoskeletal.” Available at: http://www.theabr.org/sites/all/themes/abr-media/pdf/CORE_Exam_Study_Guide_FINAL(V10).pdf Accessed July 13, 2015.

  • 4. The diagnostic radiology milestones project, a joint initiative of the ACGME and the ABR. Subsection “Patient Care and Technical Skills.” Available at: https://www.acgme.org/acgmeweb/Portals/0/PDFs/Milestones/DiagnosticRadiologyMilestones.pdf Accessed July 13, 2015.

  • 5. Dubinsky T.J., Garra B.S., Reading C., et al.: Society of radiologists in ultrasound resident curriculum. Ultrasound Q 2013; 29: pp. 275-291. Available at: http://journals.lww.com/ultrasound-quarterly/Fulltext/2013/12000/Society_of_Radiologists_in_Ultrasound_Resident.1.aspx

  • 6. Lewiss R.E., Pearl M., Nomura J.T., et al.: CORD-AEUS: consensus document for the emergency ultrasound milestone project. Acad Emerg Med 2013; 20: pp. 740-745.

  • 7. Kissin E.Y., Niu J., Balint P., et al.: Musculoskeletal ultrasound training and competency assessment program for rheumatology fellows. J Ultrasound Med 2013; 32: pp. 1735-1743.

  • 8. Finnoff J., Lavallee M.E., Smith J.: Musculoskeletal ultrasound education for sports medicine fellows: a suggested/potential curriculum by the American Medical Society for Sports Medicine. Br J Sports Med 2010; 44: pp. 1144-1148.

  • 9. Mollenkopf M., Tait N.: Is it time to include point-of-care ultrasound in general surgery training? A review to stimulate discussion. ANZ J Surg 2013; 83: pp. 908-911.

  • 10. Finnoff J.T., Smith J., Nutz D.J., et al.: A musculoskeletal ultrasound course for physical medicine and rehabilitation residents. Am J Phys Med Rehabil 2010; 89: pp. 56-69.

  • 11. Muns A., Muhl C., Haase R., et al.: A neurosurgical phantom-based training system with ultrasound simulation. Acta Neurochir (Wien) 2014; 156: pp. 1237-1243.

  • 12. Adhikary S.D., Hadzic A., McQuillan P.M.: Simulator for teaching hand-eye coordination during ultrasound-guided regional anaesthesia. Br J Anaesth 2013; 111: pp. 844-845.

  • 13. McCoy C.E., Menchine M., Anderson C., et al.: Prospective randomized crossover study of simulation vs. didactics for teaching medical students the assessment and management of critically ill patients. J Emerg Med 2011; 40: pp. 448-455.

  • 14. Daniels K., Arafeh J., Clark A., et al.: Prospective randomized trial of simulation versus didactic teaching for obstetrical emergencies. Simul Healthc 2010; 5: pp. 40-45.

  • 15. Bruppacher H.R., Alam S.K., LeBlanc V.R., et al.: Simulation-based training improves physicians’ performance in patient care in high-stakes clinical setting of cardiac surgery. Anesthesiology 2010; 112: pp. 985-992.

  • 16. Gaba D.M.: Improving anesthesiologists’ performance by simulating reality. Anesthesiology 1992; 76: pp. 491-494.

  • 17. Alba G.A., Kelmenson D.A., Noble V.E., et al.: Faculty staff-guided versus self-guided ultrasound training for internal medicine residents. Med Educ 2013; 47: pp. 1099-1108.

  • 18. Sarwani N., Tappouni R., Flemming D.: Use of a simulation laboratory to train radiology residents in the management of acute radiologic emergencies. AJR Am J Roentgenol 2012; 199: pp. 244-251.

  • 19. Gaca A.M., Frush D.P., Hohenhaus S.M., et al.: Enhancing pediatric safety: using simulation to assess radiology resident preparedness for anaphylaxis from intravenous contrast media. Radiology 2007; 245: pp. 236-244.

  • 20. Sica G.T., Barron D.M., Blum R., et al.: Computerized realistic simulation: a teaching module for crisis management in radiology. AJR Am J Roentgenol 1999; 172: pp. 301-304.

  • 21. Tubbs R.J., Murphy B., Mainiero M.B., et al.: High-fidelity medical simulation as an assessment tool for radiology residents’ acute contrast reaction management skills. J Am Coll Radiol 2009; 6: pp. 582-587.

  • 22. Lerner C., Gaca A.M., Frush D.P., et al.: Enhancing pediatric safety: assessing and improving resident competency in life-threatening events with a computer-based interactive resuscitation tool. Pediatr Radiol 2009; 39: pp. 703-709.

  • 23. Wang C.L., Schopp J.G., Petscavage J.M., et al.: Prospective randomized comparison of standard didactic lecture versus high-fidelity simulation for radiology resident contrast reaction management training. AJR Am J Roentgenol 2011; 196: pp. 1288-1295.

  • 24. Wang C.L., Schopp J.G., Kani K., et al.: Prospective randomized study of contrast reaction management curricula: computer-based interactive simulation versus high-fidelity hands-on simulation. Eur J Radiol 2013; 82: pp. 2247-2252.

  • 25. Petscavage J.M., Wang C.L., Schopp J.G., et al.: Cost analysis and feasibility of high-fidelity simulation based radiology contrast reaction curriculum. Acad Radiol 2011; 18: pp. 107-112.

  • 26. Grover S., Currier P.F., Elinoff J.M., et al.: Improving residents’ knowledge of arterial and central line placement with a web-based curriculum. J Grad Med Educ 2010; 2: pp. 548-554.

  • 27. Kerfoot B.P., Baker H., Jackson T.L., et al.: A multi-institutional randomized controlled trial of adjuvant Web-based teaching to medical students. Acad Med 2006; 81: pp. 224-230.

  • 28. Satterwhite T., Son J., Carey J., et al.: Microsurgery education in residency training: validating an online curriculum. Ann Plast Surg 2012; 68: pp. 410-414.

  • 29. Moreno-Ger P., Torrente J., Bustamante J., et al.: Application of a low-cost web-based simulation to improve students’ practical skills in medical education. Int J Med Inform 2010; 79: pp. 459-467.

  • 30. Lin J., Weadock W.: Musculoskeletal ultrasound. Available at: http://www.med.umich.edu/rad/muscskel/mskus/index.html Accessed July 13, 2015.

  • 31. Nevid J.S.: Teaching the millennials. APS Obs 2011; 24: pp. 53-56.

This post is licensed under CC BY 4.0 by the author.