
Utility of Interobserver Agreement Statistics in Establishing Radiology Resident Learning Curves During Self-directed Radiologic Anatomy Training

Rationale and Objectives

The aim of the study was to ascertain the learning curves of radiology residents when first introduced to an anatomic structure on magnetic resonance imaging (MRI) to which they had not previously been exposed.

Materials and Methods

The iliolumbar ligament is a good marker for testing the learning curves of radiology residents because the ligament is not part of routine lumbar MRI reporting and shows high variability in detection. Four radiologists, three residents without previous training and one mentor, studied standard axial T1- and T2-weighted images from routine lumbar MRI examinations. Each radiologist had to identify the iliolumbar ligament while blinded to the others' findings. Interobserver agreement analyses, namely Cohen and Fleiss κ statistics, were performed for successive groups of 20 cases to evaluate the self-learning curve of the residents.

Results

Mean κ values of resident–mentor pairs were 0.431, 0.608, 0.604, 0.826, and 0.963 in the analysis of successive groups (P < .001). The results indicate that concordance between the experienced and inexperienced radiologists started weak (κ < 0.5) and gradually became almost perfect (κ > 0.8). Therefore, a junior radiology resident can gain enough experience to identify a rather ambiguous anatomic structure on routine MRI after a few minutes of instruction by a mentor and independent study of approximately 80 cases.

Conclusions

Implementing this methodology will help radiology educators obtain more concrete ideas on the optimal time and effort required for supported self-directed visual learning processes in resident education.

Radiology is a specialty with a pivotal role in the diagnosis and treatment of patients. Because of the multifaceted and rapidly evolving nature of the specialty, radiology also requires training and learning throughout the entire career.

Radiology residency is an apprenticeship during which the resident learns through observation, instruction, implementation of knowledge into practice, and continuous repetition. Radiology residency education has three components: curriculum, instruction, and assessment. The curriculum is the sum of all knowledge and skills that residents are required to master to achieve clinical competency. Instruction is how that sum of knowledge and abilities is transferred from the teacher to the learner. From a behaviorist point of view, instruction is the way information is presented to the learner and how this knowledge is practiced by the learner to perfection. Assessment, in addition to being criterion referenced, objectively determines what and how well the resident has learned.


Figure 1

T1-weighted axial magnetic resonance image and schematic representation of the iliolumbar ligament, seen as a double hypointense band (arrowheads) originating from the transverse processes of the L5 vertebra and inserting into the posteromedial aspect of the iliac crest.


Materials and Methods


Table 1

Scale for Interpretation of κ Values

| κ | Agreement |
| --- | --- |
| <0 | Less than chance agreement |
| 0.01–0.20 | Slight agreement |
| 0.21–0.40 | Fair agreement |
| 0.41–0.60 | Moderate agreement |
| 0.61–0.80 | Substantial agreement |
| 0.81–0.99 | Almost perfect agreement |
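For readers who want to reproduce the pairwise statistic, a minimal Python sketch is given below: it computes Cohen's κ for one resident–mentor pair from binary "identified / not identified" calls and maps the value onto the scale in Table 1. The ratings shown are hypothetical illustrations, not the study data.

```python
# Minimal sketch (hypothetical ratings, not study data): Cohen's kappa for one
# resident-mentor pair, with the verbal interpretation from Table 1.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for two raters scoring the same cases."""
    n = len(rater_a)
    # Observed agreement: fraction of cases on which the two raters agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))
    return 1.0 if p_e == 1.0 else (p_o - p_e) / (1.0 - p_e)

def interpret_kappa(kappa):
    """Map a kappa value onto the scale in Table 1."""
    if kappa < 0:
        return "less than chance agreement"
    for upper, label in [(0.20, "slight"), (0.40, "fair"), (0.60, "moderate"),
                         (0.80, "substantial"), (1.00, "almost perfect")]:
        if kappa <= upper:
            return f"{label} agreement"

# Hypothetical calls for 20 cases (1 = iliolumbar ligament identified, 0 = not identified).
resident = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1]
mentor   = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 1]
k = cohen_kappa(resident, mentor)
print(f"kappa = {k:.3f} ({interpret_kappa(k)})")  # ~0.69, substantial agreement
```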


Results


Table 2

Interobserver Agreement for Resident–Experienced Radiologist Pairs in Groups of 20 Cases (P < .001 for Each Pair per Case Group)

| Pairs vs. cases | 1–20 | 21–40 | 41–60 | 61–80 | 81–100 |
| --- | --- | --- | --- | --- | --- |
| Resident 1–mentor | 0.464 | 0.583 | 0.588 | 0.789 | 0.889 |
| Resident 2–mentor | 0.427 | 0.655 | 0.636 | 0.899 | 1.000 |
| Resident 3–mentor | 0.401 | 0.587 | 0.588 | 0.789 | 1.000 |
| Mean κ values | 0.431 | 0.608 | 0.604 | 0.826 | 0.963 |

Table 3

Cumulative Interobserver Agreement for Resident–Experienced Radiologist Pairs (P < .001 for Each Pair per Case Group)

| Pairs vs. cases | 1–20 | 1–40 | 1–60 | 1–80 | 1–100 |
| --- | --- | --- | --- | --- | --- |
| Resident 1–mentor | 0.464 | 0.528 | 0.552 | 0.605 | 0.653 |
| Resident 2–mentor | 0.427 | 0.543 | 0.576 | 0.646 | 0.706 |
| Resident 3–mentor | 0.401 | 0.497 | 0.530 | 0.587 | 0.659 |
| Mean κ values | 0.431 | 0.523 | 0.553 | 0.613 | 0.672 |
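Tables 2 and 3 slice the same 100 cases in two ways: non-overlapping groups of 20 and cumulative runs from case 1. The sketch below, assuming simulated ratings and scikit-learn's cohen_kappa_score, shows how both sets of κ values can be produced; the improving-agreement pattern is built in for illustration only and is not the study data.

```python
# Sketch of the Table 2 (per-group) and Table 3 (cumulative) bookkeeping on the same
# 100 cases, using simulated ratings in which the resident improves over time.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
n_cases, group_size = 100, 20
mentor = rng.integers(0, 2, n_cases)                       # 1 = ligament identified
# Assumed learning effect: the resident contradicts the mentor less often in later cases.
disagree = rng.random(n_cases) < np.linspace(0.4, 0.02, n_cases)
resident = np.where(disagree, 1 - mentor, mentor)

for start in range(0, n_cases, group_size):
    end = start + group_size
    group_k = cohen_kappa_score(resident[start:end], mentor[start:end])   # Table 2 style
    cumulative_k = cohen_kappa_score(resident[:end], mentor[:end])        # Table 3 style
    print(f"cases {start + 1:3d}-{end:3d}: group kappa = {group_k:.3f}, "
          f"cumulative kappa (1-{end}) = {cumulative_k:.3f}")
```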

Figure 2, Interobserver agreement for resident–experienced radiologist pairs, reflecting the gradual learning process of the residents. Cohen κ values > 0.6 indicate substantial agreement, whereas κ values > 0.8 indicate almost perfect agreement.

Figure 3, Cumulative interobserver agreement (Cohen κ) for resident–experienced radiologist pairs reflecting the gradual learning process of the residents.


Table 4

Fleiss κ Values for the Residents as a Group, Cumulatively and in Successive Groups of 20 Cases (P < .001 in Each)

| Cases (cumulative) | 1–20 | 1–40 | 1–60 | 1–80 | 1–100 |
| --- | --- | --- | --- | --- | --- |
| Residents | 0.775 | 0.890 | 0.883 | 0.922 | 0.927 |

| Cases (per group) | 1–20 | 21–40 | 41–60 | 61–80 | 81–100 |
| --- | --- | --- | --- | --- | --- |
| Residents | 0.775 | 0.826 | 0.874 | 0.863 | 0.874 |
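Unlike the pairwise Cohen κ in Tables 2 and 3, Table 4 measures how uniformly the three residents responded as a group, for which Fleiss' κ is used. A possible per-block computation, assuming statsmodels and hypothetical 0/1 calls, is sketched below.

```python
# Sketch of the intragroup analysis behind Table 4: Fleiss' kappa across the three
# residents, computed per 20-case block. The 0/1 calls are hypothetical, not study data.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

rng = np.random.default_rng(1)
n_cases, resident_err = 100, (0.10, 0.08, 0.12)  # assumed per-resident disagreement rates
consensus = rng.integers(0, 2, n_cases)          # 1 = ligament present on a reference reading
# ratings[i, j] = 1 if resident j identified the ligament in case i, else 0.
ratings = np.column_stack([
    np.where(rng.random(n_cases) < p, 1 - consensus, consensus) for p in resident_err
])

for start in range(0, n_cases, 20):
    block = ratings[start:start + 20]
    counts, _ = aggregate_raters(block)          # 20 x n_categories table of rating counts
    print(f"cases {start + 1}-{start + 20}: Fleiss kappa = {fleiss_kappa(counts):.3f}")
```

Here aggregate_raters simply converts the case-by-rater label matrix into the case-by-category count table that fleiss_kappa expects.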

Figure 4, Intragroup agreement of residents (Fleiss' κ analysis) reflecting the gradual increase in uniformity of responses by the residents.


Discussion


References

  • 1. Corry C.A.: The future of recruitment and selection in radiology. Is there a role for assessment of basic visuospatial skills?. Clin Radiol 2011; 66: pp. 481-483.

  • 2. Harvey J.A., Nicholson B.T., Rochman C.M., et. al.: A milestone-based approach to breast imaging instruction for residents. J Am Coll Radiol 2014; 11: pp. 600-605.

  • 3. Williamson K.B., Gunderman R.B., Cohen M.D., et. al.: Learning theory in radiology education. Radiology 2004; 233: pp. 15-18.

  • 4. Kelly A., Thapa M.M.: Educational curriculum, assessment, research and outcomes: past, present and future directions. Acad Radiol 2013; 20: pp. 261-262.

  • 5. Yablon C.M., Jacobson J.A., Flemming D.J., et. al.: Radiology fellowship with a focus on musculoskeletal imaging: current challenges and future directions. AJR Am J Roentgenol 2013; 200: pp. 379-382.

  • 6. D’Agostino M.A., Maillefert J.F., Said-Nahal R., et. al.: Detection of small joint synovitis by ultrasonography: the learning curve of rheumatologists. Ann Rheum Dis 2004; 63: pp. 1284-1287.

  • 7. Rozycki G.S., Ballard R.B., Feliciano D.V., et. al.: Surgeon-performed ultrasound for the assessment of truncal injuries: lessons learned from 1540 patients. Ann Surg 1998; 224: pp. 557-567.

  • 8. Röthlin M.A., Näf R., Amgwerd M., et. al.: Ultrasound in blunt abdominal and thoracic trauma. J Trauma 1993; 34: pp. 488-495.

  • 9. Hughes R.J., Saifuddin A.: Numbering of lumbosacral transitional vertebrae on MRI: role of the iliolumbar ligaments. AJR Am J Roentgenol 2006; 187: pp. 59-66.

  • 10. Viera A.J., Garrett J.M.: Understanding interobserver agreement: the kappa statistic. Fam Med 2005; 37: pp. 360-363.

  • 11. Geertzen J.: Inter-rater agreement with multiple raters and variables.2012. Retrieved April 7, 2014, from https://mlnl.net/jg/software/ira/

  • 12. Fleiss J.L., Davies M.: Jackknifing functions of multinomial frequencies, with an application to a measure of concordance. Am J Epidemiol 1982; 115: pp. 841-845.

  • 13. Davis M.H., Karunathilake I., Harden R.M.: AMEE education guide no. 28: the development and role of departments of medical education. Med Teach 2005; 27: pp. 665-675.

  • 14. Albanese M.A., Dottl S., Nowacek G.A.: Offices of research in medical education: accomplishments and added value contributions. Teach Learn Med 2001; 13: pp. 258-267.

  • 15. Kelly A.M.: Evaluating and writing education papers compared with noneducation papers. Acad Radiol 2012; 19: pp. 1100-1109.

  • 16. Cook D.A., Bordage G., Schmidt H.G.: Description, justification and clarification: a framework for classifying the purposes of research in medical education. Med Educ 2008; 42: pp. 128-133.

  • 17. Parker L., Nazarian L.N., Carrino J.A., et. al.: Musculoskeletal imaging: Medicare use, costs, and potential for cost substitution. J Am Coll Radiol 2008; 5: pp. 182-188.

  • 18. Findlater G.S., Kristmundsdottir F., Parson S.H., et. al.: Development of a supported self-directed learning approach for anatomy education. Anat Sci Educ 2012; 5: pp. 114-121.

  • 19. Cake M.A.: Deep dissection: motivating students beyond rote learning in veterinary anatomy. J Vet Med Educ 2006; 33: pp. 266-271.

  • 20. Gillingwater T.H.: The importance of exposure to human material in anatomical education: a philosophical perspective. Anat Sci Educ 2008; 1: pp. 264-266.

  • 21. Wilkerson L., Irby D.M.: Strategies for improving teaching practices: a comprehensive approach to faculty development. Acad Med 1998; 73: pp. 387-396.

  • 22. Premkumar K., Pahwa P., Banerjee A., et. al.: Does medical training promote or deter self-directed learning? A longitudinal mixed-methods study. Acad Med 2013; 88: pp. 1754-1764.

  • 23. Berbaum K.S., Smoker W.R., Smith W.L.: Measurement and prediction of diagnostic performance during radiology training. AJR Am J Roentgenol 1985; 145: pp. 1305-1311.

  • 24. Rumack C.M.: American diagnostic radiology residency and fellowship programmes. Ann Acad Med Singapore 2011; 40: pp. 126-131.

  • 25. American College of Radiology Diagnostic in-training exam (ACR DXIT). Retrieved May 1, 2015, from http://www.acr.org/Education/Exams-Certifications/DXIT-TXIT/DXIT

  • 26. Bailey J.H., Steele J.L., Gunderman R.B.: Monotonic responses in radiology education evaluations. Acad Radiol 2014; 21: pp. 424-425.

  • 27. Jamieson S.: Likert scales: how to (ab)use them. Med Educ 2004; 38: pp. 1217-1218.

This post is licensed under CC BY 4.0 by the author.