Evaluation of Radiology Teachers' Performance and Identification of the “Best Teachers” in a Residency Program

Rationale and Objectives

Radiology teachers are well trained in their specialty; however, when they work in academic institutions, faculty development and promotion through the education pathway tend to be based on their teaching knowledge and skills. The aim of this study is to assess the psychometric properties of the Medicina Universidad Católica—Radiology 32 items (MEDUC-RX32) questionnaire, an instrument designed to evaluate the performance of postgraduate radiology teachers and to identify the best teachers.

Materials and Methods

A mixed methodology was used, including qualitative and quantitative phases. The psychometric properties of the MEDUC-RX32 survey were assessed by factor analysis (validity) and by the Cronbach alpha coefficient and G coefficient (reliability). The residents assessed their teachers and simultaneously voted for the “best teacher”; these votes were used as the gold standard for constructing receiver operating characteristic (ROC) curves comparing the votes with the global score.

Results

A total of 28 residents answered 164 surveys. The global score was 6.23 ± 0.8 (on a scale from 1 to 7). The factor analysis showed six domains of the residents’ perception: (1) tutorial teaching, feedback, and independent learning; (2) communication and teamwork; (3) learning objectives; (4) respectful behavior; (5) radiological report; and (6) teaching and care support. The tutors’ strengths were related to respectful behavior and teamwork. The instrument is highly reliable, with a Cronbach alpha of 0.937 and a G coefficient of 0.831 (with a minimum of 8 residents). At a cutoff of 5.94, the MEDUC-RX32 global score has a sensitivity of 91.7% and a specificity of 83.3% for identifying tutors who received at least one “best teacher” vote, with an area under the receiver operating characteristic curve of 0.931.

Conclusions

The MEDUC-RX32 instrument is a multidimensional, valid, and highly reliable method to evaluate radiology teachers, and it identifies teachers with excellence in tutorial teaching in a postgraduate radiology program.

Introduction

Performance assessment of medical teachers constitutes an important tool in medical education. There are several instruments for clinical teacher evaluation, which include the assessment of skills as a curriculum planner, facilitator, and resource developer. The Maastricht Clinical Teaching Questionnaire is considered the most methodologically rigorous questionnaire in terms of its development and psychometric validation, and it is oriented toward in-hospital teaching. Other tools include the Student Evaluation of Teaching in Outpatient Clinics, focused on postgraduate ambulatory teaching, the Stanford Faculty Development Program questionnaire, and the University of Michigan Global Rating Scale. There are also two questionnaires developed in our institution: Medicina Universidad Católica—30 items (MEDUC-30) and Medicina Universidad Católica—Postgraduate 14 (MEDUC-PG14), for the assessment of undergraduate and postgraduate clinical teachers, respectively.

Despite the quality of the work being performed on faculty evaluation in postgraduate specialties, as far as we know, no validated instrument with this objective is available in the radiology field. In fact, the attributes related to “best teachers” are not yet established in radiology education. Therefore, a multidisciplinary working group was established to design and test an instrument (Medicina Universidad Católica—Radiology 32 items [MEDUC-RX32]) to assess radiology faculty teaching performance in the postgraduate setting.

Methods

Qualitative and Quantitative (Mixed) Research Methodology

Figure 1, Development stages of Medicina Universidad Católica—Radiology 32 items (MEDUC-RX32). (Color version of the figure is available online).

Table 1

MEDUC-RX32 Questionnaire (English Translation/Original Questions in Spanish) and the Mean Score by Item

| # | Item (English translation) | Original question (Spanish) | Mean ± SD |
| --- | --- | --- | --- |
| 1 | He/she asked me questions that stimulated my critical thinking. | Él/ella me hizo preguntas que estimularon mi pensamiento crítico. | 6.26 ± 1.01 |
| 2 | He/she had a respectful demeanor with all members of the healthcare team. | Él/ella tuvo un trato deferente y respetuoso con todos los integrantes del equipo de salud. | 6.78 ± 0.66 |
| 3 | He/she reviewed the cost-benefit issues related to the use of different imaging modalities in patient management. | Él/ella planteó los problemas relacionados con el costo/beneficio del uso de técnicas de imagen en las decisiones relativas al manejo de los pacientes. | 6.38 ± 0.88 |
| 4 | He/she promptly reported critical radiological findings to the members of the treating team. | Él/ella informó oportunamente al equipo tratante en casos con hallazgos radiológicos de urgencia. | 6.62 ± 0.70 |
| 5 | He/she stressed the importance of my radiological reports as a fundamental step in the diagnostic and therapeutic process of patient care. | Él/ella recalcó la importancia de mis informes radiológicos como parte fundamental del proceso diagnóstico y terapéutico. | 6.37 ± 0.90 |
| 6 | He/she demonstrated with his/her performance in the workplace that the radiologist is an important member of the healthcare team. | Él/ella me demostró con su forma de trabajar que el radiólogo es parte importante del equipo tratante. | 6.53 ± 0.66 |
| 7 | He/she collaborated in the management of the clinical workload during times of high demand. | Él/ella colaboró con el manejo de la carga asistencial de la rotación en períodos de mayor demanda laboral. | 6.39 ± 1.47 |
| 8 | He/she informed me of my strengths and weaknesses in a timely manner. | Él/ella me comunicó mis avances y debilidades, sin esperar el fin de la rotación. | 5.59 ± 1.44 |
| 9 | He/she helped me to gradually develop my independent clinical decision-making ability. | Él/ella me ayudó a desarrollar gradualmente mi capacidad de tomar decisiones clínicas en forma autónoma. | 6.23 ± 0.80 |
| 10 | He/she valued my questions and opinions as relevant contributions to the radiological diagnostic process. | Él/ella valoró mis preguntas y aportes como contribuciones al proceso de diagnóstico radiológico. | 6.37 ± 0.90 |
| 11 | He/she progressively gave me the responsibility of creating the radiological report. | Él/ella me traspasó progresivamente la responsabilidad de redactar el informe radiológico. | 6.22 ± 1.15 |
| 12 | He/she stimulated me to independently update my knowledge and practices. | Él/ella me estimuló a actualizar mis conocimientos y prácticas por cuenta propia. | 6.30 ± 0.85 |
| 13 | He/she maintained a respectful demeanor toward the ordering physician and his/her diagnostic hypotheses, even if they were incorrect. | Él/ella mantuvo un comportamiento respetuoso con el médico solicitante del examen y sus hipótesis diagnósticas, aunque ellas fueran erradas. | 6.73 ± 0.49 |
| 14 | He/she taught me that teamwork is fundamental in order to do my job properly (MRTs, RNs, and administrative staff). | Él/ella me enseñó que para realizar bien mi trabajo debo trabajar en equipo (tecnólogos, enfermeros, técnicos paramédicos y administrativos). | 6.67 ± 0.61 |
| 15 | He/she had the ability to tailor the teaching to my level of knowledge. | Él/ella tuvo la habilidad de adecuar su docencia a la etapa de aprendizaje en la que me encuentro. | 6.44 ± 1.02 |
| 16 | At the beginning of the rotation, he/she introduced me to the learning objectives of the rotation and the clinical and academic duties assigned to my level of training. | Al inicio de la rotación, él/ella me dio a conocer los objetivos de la rotación y las actividades docente-asistenciales correspondientes a mi nivel. | 5.91 ± 1.46 |
| 17 | He/she evaluated me according to the rotation learning objectives and considered my level of training. | Él/ella me evaluó de acuerdo a los objetivos de la rotación, considerando el nivel que me corresponde. | 6.23 ± 0.89 |
| 18 | He/she reviewed in detail the indications and techniques of relevant imaging studies. | Él/ella entregó indicaciones y técnicas específicas para la realización de determinados exámenes. | 6.29 ± 0.92 |
| 19 | He/she taught me how to write reports, highlighting my strengths and weaknesses during this process. | Él/ella me enseñó a redactar informes, indicando mis fortalezas y debilidades en su ejecución. | 5.98 ± 1.07 |
| 20 | He/she created an environment that allowed appropriate exposure, discussion, and follow-up of relevant cases reviewed during the clinical rotation. | Él/ella creó situaciones que permitieran la exposición, discusión y seguimiento de casos generados en las actividades asistenciales. | 6.29 ± 0.85 |
| 21 | He/she respected me as a person and a fellow physician. | Él/ella me respetó como persona y como médico. | 6.84 ± 0.63 |
| 22 | He/she made explicit the reasoning that led him/her to formulate the diagnostic hypothesis based on the image interpretation of the radiological examination. | Él/ella explicitó el razonamiento que le llevó a formular las hipótesis diagnósticas en base a las imágenes obtenidas durante el examen radiológico. | 6.56 ± 0.74 |
| 23 | He/she helped me to answer my questions and manage complex clinical cases on call. | Él/ella me ayudó a resolver mis dudas y manejar situaciones especialmente complejas durante los turnos de residencia. | 6.48 ± 0.97 |
| 24 | He/she acknowledged mistakes, both personal and those made by other physicians, as learning opportunities. | Él/ella asumió los errores propios y ajenos como oportunidades para el aprendizaje. | 6.53 ± 0.87 |
| 25 | He/she was open to answering questions and taught me in a non-threatening way. | Él/ella mostró apertura a recibir preguntas y me enseñó en forma no intimidante. | 6.55 ± 0.83 |
| 26 | He/she showed interest in my learning process. | Él/ella mostró interés por mi proceso de aprendizaje. | 6.31 ± 1.13 |
| 27 | He/she showed me how to give relevant information to the patient in a short, comprehensible, and respectful manner, prior to an examination or procedure. | Él/ella me mostró cómo entregar, en forma breve, comprensible y respetuosa, información al paciente sobre el examen o procedimiento que se le practicará. | 6.49 ± 0.88 |
| 28 | He/she encouraged me to interact with other physicians and healthcare professionals, establishing an appropriate interdisciplinary relationship. | Él/ella me alentó a interactuar con otros médicos y profesionales de la salud en la relación interdisciplinaria. | 6.56 ± 0.64 |
| 29 | He/she displayed enthusiasm while teaching, encouraging me to develop my own teaching skills. | Él/ella demostró entusiasmo por su labor docente, estimulándome a desarrollar mis habilidades docentes. | 6.15 ± 1.17 |
| 30 | He/she taught me the importance of presenting radiological cases in a clear and didactic manner for different types of audiences. | Él/ella me enseñó la importancia de presentar los casos radiológicos en forma clara y didáctica para diferentes audiencias. | 6.16 ± 1.13 |
| 31 | He/she oriented me in the appropriate way to search for web-based information to optimize the radiological interpretation of interesting or complex cases. | Él/ella me orientó en el empleo de métodos apropiados de búsqueda de información para optimizar la interpretación radiológica de casos interesantes o complejos. | 5.84 ± 1.36 |
| 32 | He/she gradually incorporated me in the image-guided procedures that are included in the rotation. | Él/ella me incorporó gradualmente en los procedimientos guiados por imagen que se incluyen en el programa de formación. | 6.03 ± 1.34 |

SD, standard deviation; MEDUC-RX32, Medicina Universidad Católica—Radiology 32 items.
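For readers who want to tabulate this kind of item-level summary from their own evaluation data, a minimal sketch follows. The file name, data-frame layout, and column names (`item_01` … `item_32`) are illustrative assumptions, not details taken from the study.

```python
import pandas as pd

# Assumed layout: one row per completed survey (164 in this study),
# one column per questionnaire item, each scored on the 1-7 scale.
responses = pd.read_csv("meduc_rx32_responses.csv")       # hypothetical file
item_cols = [f"item_{i:02d}" for i in range(1, 33)]       # item_01 ... item_32

# Per-item mean and standard deviation, analogous to Table 1.
item_summary = responses[item_cols].agg(["mean", "std"]).T
item_summary.columns = ["mean", "sd"]
print(item_summary.round(2))

# Global score: all item answers pooled across all surveys.
pooled = responses[item_cols].stack()
print(f"Global score: {pooled.mean():.2f} ± {pooled.std():.2f}")
```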

Subjects and Procedure

Statistical Analyses

Results

Best Teachers’ Attributes

Table 2

Attributes of the Best Radiology Teachers: Teachers’ and Residents’ Perspectives

Teaching Attributes Expressed by Teachers and Residents

| Teaching Attribute | Aspects Considered in Each Attribute |
| --- | --- |
| 1. He/she promotes proactivity. | Ability to stimulate the dynamism and initiative of residents to optimize the quality of the radiology reports. |
| 2. He/she establishes rapport with the professionals in the health system in which they work. | Ability to establish positive relationships with people from different specialties and hierarchical ranks. |
| 3. He/she promotes ethical performance at work. | Ability to communicate criteria within written radiological reports in a timely and useful manner for physicians and patients. |
| 4. He/she transmits awareness of the value of interdisciplinary collaboration within the specialty of Radiology. | Ability to demonstrate the relevance of knowledge from other areas of medicine to radiology and to generate learning in this regard. |
| 5. He/she collaborates with residents in periods of high pressure. | Shows initiative to support the residents’ clinical work at times of high demand. |
| 6. He/she delivers structured feedback. | Ability to transmit messages about the resident’s performance in an assertive and timely manner, ensuring that the message has been understood. |
| 7. He/she stimulates progressive decisional autonomy in residents. | Ability to “allow” the resident to gradually move from close supervision to advisory supervision. |
| 8. He/she sets healthcare/educational priorities under stress. | Ability to discriminate urgent cases from the other important cases, and to validate radiology reports more or less quickly as the situation requires. |
| 9. He/she works well in a team environment. | Ability to work with others to achieve common objectives that benefit all participants. |
| 10. He/she adjusts their teaching to the heterogeneity and variability of the resident groups. | Ability to evaluate the stage of progress and goals of each resident in order to offer them the appropriate training tools. |
| 11. He/she evaluates in an objective, consistent, and differentiated manner according to learning outcomes (competencies). | Ability to design and implement learning assessment instruments in line with the different objectives considered. |
| 12. He/she demonstrates the rapid integration of information for immediate diagnosis. | Ability to gather and integrate information with agility to support an immediate diagnosis. |
| 13. He/she has skills in performing examinations in direct contact with patients. | Ability to prepare, demonstrate, observe, and evaluate skills in interventional procedures. |
| 14. He/she teaches how to carry out imaging following appropriate protocols. | Teaches how to indicate to a medical technologist how certain required examinations should be performed. |
| 15. He/she teaches how to write and validate reports. | Teaches how to write useful, relevant, and timely reports. |
| 16. He/she uses day-to-day clinical cases for self-learning as well as for the learning of colleagues and residents. | Coordinates efforts to generate learning opportunities based on the unit’s experience and to share that knowledge. |
| 17. He/she promotes and values peer learning. | Recognizes the importance of the knowledge and practices that more advanced residents and fellows transmit to residents at a lower level of training. |
| 18. He/she transmits a “way of thinking” for approaching the examination and arriving at results. | He/she is able to explain the structure, or line of thought, that enables him/her to reach a correct or plausible diagnosis or diagnostic hypothesis. |
| 19. He/she displays availability for the resident on call. | Receives phone calls at night or in the early morning without showing discomfort and resolves situations that overload the resident on call. |
| 20. He/she shows openness to questions and teaches, in non-threatening ways, how to learn from one’s own and others’ mistakes. | Receives questions attentively; reveals mistakes and shows how one can learn from them, allowing the teacher to be seen as a human being. |
| 21. He/she models the doctor-patient relationship. | Demonstrates, through his/her actions and attitudes, the manner in which a radiologist should be. |

Teaching Attributes Expressed Exclusively by Teachers

| Teaching Attribute | Aspects Considered in Each Attribute |
| --- | --- |
| 22. He/she transmits a critical stance toward the diagnostic hypotheses received, with an attitude of respect for the colleagues who request imaging examinations. | Ability to comment on colleagues’ diagnostic hypotheses in a respectful manner and to try to understand the basis of their reasoning, using critical thinking that is technically sound. |
| 23. He/she promotes a teaching role in residents. | Creates situations in which the resident must teach others and share their knowledge (peer and near-peer teaching). |
| 24. He/she uses multimedia resources. | Knowledge and use of computer tools applied to image analysis, reporting, and teaching in Radiology. |

Teaching Attributes Expressed Exclusively by Residents

| Teaching Attribute | Aspects Considered in Each Attribute |
| --- | --- |
| 25. He/she shows empathy. | Ability to understand the residents, or “to be able to understand the place of his/her colleagues.” |
| 26. He/she stimulates self-learning and continuing education. | Encourages self-learning and highlights its value in the practice of Radiology. |
| 27. He/she shows respect for the residents as colleagues. | Establishes a cordial relationship with the resident(s), considered a relationship between colleagues. |

Application of the Instrument

Construct Validity and Internal Consistency

Figure 2, Scree plot from the factor analysis of Medicina Universidad Católica—Radiology 32 items (MEDUC-RX32). The y axis represents the eigenvalue as a function of the number of factors (x axis).
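A scree plot like the one in Figure 2 can be generated from the eigenvalues of the inter-item correlation matrix. The sketch below is a generic illustration rather than the authors’ code, and it reuses the hypothetical response file assumed earlier.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

responses = pd.read_csv("meduc_rx32_responses.csv")       # hypothetical file
item_cols = [f"item_{i:02d}" for i in range(1, 33)]

# Eigenvalues of the 32 x 32 inter-item correlation matrix, largest first.
corr = np.corrcoef(responses[item_cols].dropna().to_numpy(), rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]

# Scree plot: eigenvalue (y axis) versus factor number (x axis).
plt.plot(range(1, len(eigenvalues) + 1), eigenvalues, marker="o")
plt.axhline(1.0, linestyle="--")   # common eigenvalue-1 reference line (not necessarily the criterion used here)
plt.xlabel("Factor number")
plt.ylabel("Eigenvalue")
plt.title("Scree plot, MEDUC-RX32")
plt.show()
```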

Table 3

Domains, Items, and Scores

| Domain | Items | Residents’ RR (%) | Mean ± SD |
| --- | --- | --- | --- |
| 1. Tutorial teaching, feedback, and independent learning (11 items) | 1, 7, 8, 9, 10, 11, 12, 19, 20, 29, 31 | 95.86 | 6.15 ± 1.1 |
| 2. Communication and teamwork (7 items) | 3, 14, 15, 22, 26, 27, 28 | 95.06 | 6.49 ± 0.84 |
| 3. Learning objectives (4 items) | 4, 16, 17, 24 | 84.09 | 6.32 ± 0.98 |
| 4. Respectful treatment (4 items) | 2, 13, 21, 25 | 100 | 6.73 ± 0.65 |
| 5. Radiological report (3 items) | 5, 6, 18 | 96.97 | 6.4 ± 0.83 |
| 6. Teaching and care support (3 items) | 23, 30, 32 | 69.09 | 6.22 ± 1.15 |
| Global questionnaire (32 items) | 1–32 | 96.36 | 6.23 ± 0.8 |

RR, response rate; SD, standard deviation.
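The domain scores above are averages over each domain’s items. A sketch of that aggregation follows, using the item-to-domain mapping from Table 3 and the same hypothetical response file as before; the simple pooled mean is an assumption about how the aggregation was done, not a detail taken from the study.

```python
import pandas as pd

responses = pd.read_csv("meduc_rx32_responses.csv")       # hypothetical file

# Item-to-domain mapping taken from Table 3.
domains = {
    "Tutorial teaching, feedback, and independent learning": [1, 7, 8, 9, 10, 11, 12, 19, 20, 29, 31],
    "Communication and teamwork": [3, 14, 15, 22, 26, 27, 28],
    "Learning objectives": [4, 16, 17, 24],
    "Respectful treatment": [2, 13, 21, 25],
    "Radiological report": [5, 6, 18],
    "Teaching and care support": [23, 30, 32],
}

for name, items in domains.items():
    cols = [f"item_{i:02d}" for i in items]
    pooled = responses[cols].stack()          # pool all answers to the domain's items
    print(f"{name}: {pooled.mean():.2f} ± {pooled.std():.2f}")
```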

Reliability Analyses

Table 4

The Cronbach Alpha Coefficient of each Domain and the Global Questionnaire

| Domain | Cronbach Alpha |
| --- | --- |
| 1. Tutorial teaching, feedback, and independent learning | 0.941 |
| 2. Communication and teamwork | 0.847 |
| 3. Learning objectives | 0.786 |
| 4. Respectful treatment | 0.86 |
| 5. Radiology report | 0.731 |
| 6. Teaching and care support | 0.695 |
| Global | 0.937 |
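Cronbach’s alpha can be computed directly from the item scores with the classical formula α = k/(k − 1) · (1 − Σ σ²_item / σ²_total). The helper below is a generic sketch, again using the hypothetical response file, and is not the authors’ implementation.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Classical Cronbach alpha; rows are respondents, columns are items."""
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

responses = pd.read_csv("meduc_rx32_responses.csv")       # hypothetical file
item_cols = [f"item_{i:02d}" for i in range(1, 33)]

# Alpha for the full 32-item questionnaire (Table 4 reports 0.937);
# the same helper can be applied to each domain's columns.
alpha_global = cronbach_alpha(responses[item_cols].dropna().to_numpy())
print(f"Global alpha: {alpha_global:.3f}")
```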

Table 5

Estimated Variance Components of the Generalizability Study. The Total Variability Column Indicates Which Components of the Model Produce More Variability in the Results (Higher Percentages Correspond to More Variability)

(I:D) × (E:T) Design

| Effect | Degrees of Freedom | Variance Component | Total Variability (%) |
| --- | --- | --- | --- |
| T | 11 | 0.0572969 | 1.8 |
| E:T | 72 | 0.8402472 | 26.5 |
| D | 5 | 0.2945405 | 9.3 |
| I:D | 12 | 0.0674722 | 2.1 |
| TD | 55 | 0.0608304 | 1.9 |
| TI:D | 132 | 0.0407204 | 1.3 |
| ED:T | 360 | 0.6008304 | 18.9 |
| EI:TD | 864 | 1.2029321 | 38.0 |

T, tutors; E, students; D, domains; I, items.
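As background for Tables 5 and 6 (following Brennan, reference 23), the dependability coefficient Phi used in the decision study is, in general form, the tutor (universe-score) variance divided by itself plus the absolute error variance, where each remaining variance component is divided by the number of conditions over which scores are averaged. How the individual components combine depends on the exact D-study specification, which is not fully detailed in this excerpt; the generic form is:

```latex
\Phi \;=\; \frac{\sigma^{2}_{T}}{\sigma^{2}_{T} + \sigma^{2}_{\Delta}},
\qquad
\sigma^{2}_{\Delta} \;=\; \sum_{\alpha \neq T} \frac{\sigma^{2}_{\alpha}}{n_{\alpha}}
```

Here α ranges over the non-tutor components in Table 5 (E:T, D, I:D, TD, TI:D, ED:T, EI:TD), and n_α is the product of the numbers of sampled conditions of the facets appearing in α (for example, n_E for E:T and n_E·n_D·n_I for EI:TD).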

Table 6

Decision Study Results

| Students (n) | Generalizability Coefficient (Phi) |
| --- | --- |
| 20 | 0.92916 |
| 15 | 0.90772 |
| 10 | 0.86769 |
| 8 | 0.83103 |
| 7 | 0.76630 |
| 6 | 0.72400 |
| 5 | 0.69653 |
| 4 | 0.66300 |
| 3 | 0.62113 |
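The decision study in Table 6 varies only the number of students rating each tutor. A small helper for exploring such scenarios, implementing the generic Phi expression above, is sketched below. The variance components and facet sizes are placeholders to be replaced with a G-study’s own estimates; this is not an attempt to reproduce the published coefficients, whose exact D-study specification is not given in this excerpt.

```python
def phi_coefficient(var: dict, n_e: int, n_d: int, n_i: int) -> float:
    """Dependability (Phi) for tutors in an (I:D) x (E:T) design.

    var maps each G-study variance component to its estimate
    (T = tutors, E = students, D = domains, I = items per domain).
    """
    error = (
        var["E:T"] / n_e
        + var["D"] / n_d
        + var["I:D"] / (n_d * n_i)
        + var["TD"] / n_d
        + var["TI:D"] / (n_d * n_i)
        + var["ED:T"] / (n_e * n_d)
        + var["EI:TD"] / (n_e * n_d * n_i)
    )
    return var["T"] / (var["T"] + error)

# Purely illustrative component values -- substitute your own G-study estimates.
example = {"T": 0.50, "E:T": 0.30, "D": 0.10, "I:D": 0.05,
           "TD": 0.05, "TI:D": 0.05, "ED:T": 0.20, "EI:TD": 0.40}
for n_students in (3, 5, 8, 10, 15, 20):
    print(n_students, round(phi_coefficient(example, n_students, n_d=6, n_i=3), 3))
```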

Best Radiology Teacher

Figure 3, Receiver operating characteristic (ROC) curves of “best radiology teacher”: (a) 30% of votes as best teacher, (b) 50% of votes as best teacher, (c) 70% of votes as best teacher. (Color version of the figure is available online).
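The cutoff analysis behind Figure 3 (and the reported sensitivity of 91.7%, specificity of 83.3%, and area under the curve of 0.931 at a global score of 5.94) can be explored with standard ROC tooling. The sketch below is a generic illustration using scikit-learn; the `global_scores` and `got_best_teacher_vote` arrays are hypothetical inputs, not the study data, and Youden’s J is only one common way to choose a cutoff (the criterion actually used in the study is not specified in this excerpt).

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical inputs: one entry per tutor.
# global_scores: mean MEDUC-RX32 score received by the tutor (1-7 scale).
# got_best_teacher_vote: 1 if the tutor received at least one "best teacher" vote.
global_scores = np.array([6.5, 6.1, 5.7, 6.8, 5.9, 6.3])      # assumed data
got_best_teacher_vote = np.array([1, 1, 0, 1, 0, 1])          # assumed data

fpr, tpr, thresholds = roc_curve(got_best_teacher_vote, global_scores)
print("AUC:", roc_auc_score(got_best_teacher_vote, global_scores))

# Candidate cutoff maximizing Youden's J = sensitivity + specificity - 1.
j = tpr - fpr
best = j.argmax()
print(f"cutoff={thresholds[best]:.2f}  sensitivity={tpr[best]:.3f}  specificity={1 - fpr[best]:.3f}")
```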

Discussion

Conclusions

References

  • 1. Stalmeijer R.E., Dolmans D.H., Wolfhagen I.H., et. al.: The Maastricht Clinical Teaching Questionnaire (MCTQ) as a valid and reliable instrument for the evaluation of clinical teachers. Acad Med 2010; 85: pp. 1732-1738.

  • 2. Zuberi R.W., Bordage G., Norman G.R.: Validation of the SETOC instrument—Student Evaluation of Teaching in Outpatient Clinics. Adv Health Sci Educ Theory Pract 2007; 12: pp. 55-69.

  • 3. Litzelman D.K., Stratos G.A., Marriott D.J., et. al.: Factorial validation of a widely disseminated educational framework for evaluating clinical teachers. Acad Med 1998; 73: pp. 688-695.

  • 4. Williams B.C., Litzelman D.K., Babbott S.F., et. al.: Validation of a global measure of faculty’s clinical teaching performance. Acad Med 2002; 77: pp. 177-180.

  • 5. Skeff K.M., Stratos G.A., Berman J., et. al.: Improving clinical teaching. Evaluation of a national dissemination program. Arch Intern Med 1992; 152: pp. 1156-1161.

  • 6. Bitran M., Mena B., Riquelme A., et. al.: [An instrument in Spanish to evaluate the performance of clinical teachers by students]. Rev Med Chil 2010; 138: pp. 685-693.

  • 7. Pizarro M., Solis N., Rojas V., et. al.: [Development of MEDUC-PG14 survey to assess postgraduate teaching in medical specialties]. Rev Med Chil 2015; 143: pp. 1005-1014.

  • 8. Huete A., Rojas V., Herrera C., et. al.: Desarrollo y validación del instrumento MEDUC-RX32, para la evaluación de docentes de programas de la especialidad de postítulo en radiología. Rev Chil Radiol 2014; 20: pp. 75-80.

  • 9. McLean M., Cilliers F., Van Wyk J.M.: Faculty development: yesterday, today and tomorrow. Med Teach 2008; 30: pp. 555-584.

  • 10. Benor D.E.: Faculty development, teacher training and teacher accreditation in medical education: twenty years from now. Med Teach 2000; 22: pp. 503-512.

  • 11. Castanelli D., Kitto S.: Perceptions, attitudes, and beliefs of staff anaesthetists related to multi-source feedback used for their performance appraisal. Br J Anaesth 2011; 107: pp. 372-377.

  • 12. Hattie J., Timperley H.: The power of feedback. Rev Educ Res 2007; 77: pp. 81-112.

  • 13. Schartel S.A.: Giving feedback—an integral part of education. Best Pract Res Clin Anaesthesiol 2012; 26: pp. 77-87.

  • 14. Boerboom T.B., Dolmans D.H., Jaarsma A.D., et. al.: Exploring the validity and reliability of a questionnaire for evaluating veterinary clinical teachers’ supervisory skills during clinical rotations. Med Teach 2011; 33: pp. 84-91.

  • 15. Baker K.: Clinical teaching improves with resident evaluation and feedback. Anesthesiology 2010; 113: pp. 693-703.

  • 16. Steinert Y., Mann K., Centeno A., et. al.: A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach 2006; 28: pp. 497-526.

  • 17. Green M.E., Ellis C.L., Frémont P., et. al.: Faculty evaluation in departments of family medicine: do our universities measure up? Med Educ 1998; 32: pp. 597-606.

  • 18. O’Cathain A., Murphy E., Nicholl J.: Three techniques for integrating data in mixed methods studies. BMJ 2010; 341: pp. c4587.

  • 19. Field A.: Discovering statistics using SPSS for Windows. 2nd ed. London: Sage Publications Limited; 2005.

  • 20. Field A.: Discovering statistics using SPSS for Windows. 1st ed. London: Sage Publications Limited; 2000.

  • 21. Cattell R.B.: The scree test for a number of factors. Multivar Behav Res 1966; 1: pp. 245-276.

  • 22. Cronbach L.: Coefficient alpha and the internal structure of tests. Psychometrika 1951; 16: pp. 297-334.

  • 23. Brennan R.L.: Generalizability theory. 1st ed. New York: Springer; 2001.
