
Adaptive Tutorials Versus Web-Based Resources in Radiology: A Mixed Methods Comparison of Efficacy and Student Engagement

Rationale and Objectives

Diagnostic imaging is under-represented in medical curricula globally. Adaptive tutorials, online intelligent tutoring systems that provide a personalized learning experience, have the potential to bridge this gap. However, there is limited evidence of their effectiveness for learning about diagnostic imaging.

Materials and Methods

We performed a randomized, mixed methods crossover trial to determine the impact of adaptive tutorials on perceived engagement and on understanding of the appropriate use and interpretation of common diagnostic imaging investigations. Ninety-nine volunteer medical students (from years 1–4 of the six-year program), although concurrently engaged in different blocks of study, were randomly allocated to one of two groups. In the first arm of the trial, which focused on chest X-rays, one group received access to an adaptive tutorial, whereas the other received links to an existing peer-reviewed Web resource. The two groups crossed over in the second arm of the trial, which focused on computed tomography (CT) scans of the head, chest, and abdomen. At the conclusion of each arm, both groups completed an examination-style assessment comprising questions both related and unrelated to the topics covered by the relevant adaptive tutorial. Online questionnaires were used to evaluate student perceptions of both learning resources.
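The crossover allocation described above can be sketched in code. This is a minimal illustration of the design only; the function and variable names are ours, not from the study, and the actual allocation procedure used by the authors is not described in this excerpt.

```python
import random

def allocate_crossover(student_ids, seed=0):
    """Randomly split students into two groups for a two-arm crossover trial.

    Group A receives the adaptive tutorial in arm 1 (chest X-ray) and the
    web resource in arm 2 (CT); group B receives the reverse order, so every
    student is exposed to both resources across the trial.
    """
    rng = random.Random(seed)
    shuffled = list(student_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    group_a, group_b = shuffled[:half], shuffled[half:]
    return {
        "arm1": {"adaptive": group_a, "web": group_b},
        "arm2": {"adaptive": group_b, "web": group_a},  # groups cross over
    }
```

Because each group serves as the other's control in the opposite arm, between-group differences in baseline ability are balanced across the two assessments.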

Results

In both arms of the trial, the group using adaptive tutorials obtained significantly higher assessment scores than controls. This difference was driven by senior students in the adaptive tutorial group, who scored higher on questions related to the topics covered in those tutorials. Furthermore, students reported significantly better engagement with the adaptive tutorials than with the Web resource and rated the tutorials as a significantly more valuable learning tool.

Conclusions

Adaptive tutorials for diagnostic imaging are overwhelmingly accepted by medical students and significantly improve senior students' understanding of diagnostic imaging.

A basic understanding of diagnostic imaging is vital for undergraduate medical students. In fact, application of this knowledge is often expected of students in their first postgraduate year. Medical imaging both requires and informs knowledge of anatomy, physiology, and pathology, and it allows students to develop an understanding of the role of radiology in diagnostic medicine. Failure to develop basic radiology skills also has significant health care and economic implications. In the United States, up to 30% of imaging studies are reportedly ordered inappropriately. This places a huge burden on the health budget and often results in unnecessary harm to patients.

Despite its clear importance, radiology is covered only briefly in many medical curricula. Indeed, many medical students believe that there is insufficient teaching of diagnostic imaging. This situation is exacerbated by the lack of accepted standard guidelines and curricula in radiology for medical students at our institution, throughout Australia, and in many other countries, including the United States. As an image-centric specialty, radiology is well placed to use digital and online teaching solutions to increase student exposure to the core elements of the discipline.


Materials and Methods

Development of Adaptive Tutorials


Figure 1, (a) An interactive drag-and-drop question from the adaptive tutorial on computed tomography (CT) scans, requiring identification of salient features in an axial slice of the abdomen. (b) After three incorrect attempts despite feedback, the locations of the structures are highlighted for the student.


Participants


Study Design


Development of Online Assessments


Development of Online Questionnaires


Table 1

Subscales of the User Engagement Scale

  • Aesthetic appeal: the impression made by the visual appearance of the user interface
  • Focused attention: the ability to concentrate and absorb information
  • Perceived usability: the affective (eg, frustration) and cognitive (eg, effort) responses to the resource
  • Novelty: the level of interest and curiosity evoked
  • Endurability: the overall evaluation of the experience, including the resource’s perceived success and whether users would recommend it to others
  • Felt involvement: the feeling of being drawn in, being interested, and having fun


Statistical Analysis


Results

Prior Academic Performance


Outcomes of the Chest X-ray Knowledge Assessment


Figure 2, Outcomes of examination-style knowledge assessments. All figures shown are mean ± standard error of the mean. (a) Overall assessment outcomes; (b) assessment outcomes for senior students only; (c) assessment outcomes specifically for questions related to the topics covered by the adaptive tutorial and the Diagnostic Imaging Pathways Web site. CXR, chest X-ray; CT, computed tomography. * P < .05. (Color version of figure is available online.)


Outcomes of the CT Scan Knowledge Assessment


Outcomes of the Online Questionnaires


Figure 3, According to Likert scale questionnaire responses, the adaptive tutorials were more engaging across all subscales of the User Engagement Scale (UES). Data shown are median, interquartile range, and range. Strongly disagree was coded as 1, strongly agree as 5; negatively phrased questions were reverse coded. (Color version of figure is available online.)
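The Likert coding described in the caption (strongly disagree = 1, strongly agree = 5, with negatively phrased items reverse coded) can be sketched as follows. This is an illustrative sketch only; the function names and the item indexing are our own, not taken from the study's analysis scripts.

```python
def reverse_code(response, scale_min=1, scale_max=5):
    """Reverse-code a Likert response so that negatively phrased items
    score in the same direction as positively phrased ones
    (e.g., on a 1-5 scale, 1 becomes 5 and 5 becomes 1)."""
    return scale_max + scale_min - response

def subscale_score(responses, negative_items):
    """Mean item score for a UES subscale.

    responses: list of raw Likert responses (ints 1-5), in item order.
    negative_items: set of indices of negatively phrased questions.
    """
    coded = [reverse_code(r) if i in negative_items else r
             for i, r in enumerate(responses)]
    return sum(coded) / len(coded)
```

Reverse coding ensures that a high subscale score always indicates greater engagement, regardless of how individual questions are phrased.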


I really liked this tutorial—it was well structured, went through many basic concepts and was very clear … The best features were the context/information sections, and how they were followed up with questions, which were then followed by diagrams with answers.

Table 2

A Representative Selection of Open-Ended Questionnaire Responses on the Strengths of the Adaptive Tutorials as a Learning Resource

  • 1. “I loved that the adaptive tutorial started from the basics of X-rays right through to the diagnosis and presentation of various case studies. It was a good balance of theory and practical. I also liked that there were mini tests with real X-ray pictures to ensure that we were learning correctly. I enjoyed the variety of case studies and believe they will be quite useful for diagnosing X-rays in the future.” (Themes: EX, IN, FE)

  • 2. “The adaptive tutorial supplied basic information needed to understand diagnostic imaging and tested this when continuing with the tutorial. The cases at the end were especially helpful after learning about imaging by providing some practical use of the information in scenarios.” (Themes: EX, IN)

  • 3. “The adaptive tutorial was quite interactive at times which made it more engaging, eg, when we had to answer questions. This was the most stimulating and interesting part, particularly when we had to place pins on specific areas in the X-rays; this was particularly helpful as it really checked whether or not we could understand and apply our knowledge about chest X-rays. The instant feedback is helpful as it allows you to learn.” (Themes: IN, VI, FE)

  • 4. “The order of the tutorial was great as it was useful to have the basics taught before moving on to interpreting the chest X-rays. The visuals were great and complemented the text well.” (Themes: UI, VI)

EX, helpful explanations; FE, feedback; IN, interactivity; UI, effective user interface; VI, visual learning experience.


Discussion


Limitations


Conclusions


Acknowledgments


References

  • 1. Barzansky B., Jonas H., Etzel S.: Educational programs in US medical schools, 1998-1999. JAMA 1999; 282: pp. 840-846.

  • 2. Pascual T.N.B., Chem R., Wang S.C., et. al.: Undergraduate radiology education in the era of dynamism in medical curriculum: an educational perspective. Eur J Radiol 2011; 78: pp. 319-325.

  • 3. Eisen L.A., Berger J.S., Hegde A., et. al.: Competency in chest radiography. J Gen Intern Med 2006; 21: pp. 460-465.

  • 4. Mirsadraee S., Mankad K., McCoubrie P., et. al.: Radiology curriculum for undergraduate medical studies–a consensus survey. Clin Radiol 2012; 67: pp. 1155-1161.

  • 5. Naeger D.M., Webb E.M., Zimmerman L., et. al.: Strategies for incorporating radiology into early medical school curricula. J Am Coll Radiol 2014; 11: pp. 74-79.

  • 6. Gottlieb R.H.: Imaging for whom: patient or physician?. Am J Roentgenol 2005; 185: pp. 1399-1403.

  • 7. Hillman B.J., Goldsmith J.C.: The uncritical use of high-tech medical imaging. N Engl J Med 2010; 363: pp. 4-6.

  • 8. Medical Benefits Reviews Task Group: Review of funding for diagnostic imaging services: final report. Canberra: Australian Government Department of Health & Ageing; 2011. Available at: http://www.ranzcr.edu.au/component/docman/doc_download/1281-review-of-funding-for-diagnostic-imaging Accessed August 8, 2015.

  • 9. Bhogal P., Booth T.C., Phillips A.J., et. al.: Radiology in the undergraduate medical curriculum—who, how, what, when, and where?. Clin Radiol 2012; 67: pp. 1146-1152.

  • 10. Subramaniam R.M., Beckley V., Chan M., et. al.: Radiology curriculum topics for medical students: students’ perspectives. Acad Radiol 2006; 13: pp. 880-884.

  • 11. Howlett D., Vincent T., Watson G., et. al.: Blending online techniques with traditional face to face teaching methods to deliver final year undergraduate radiology learning content. Eur J Radiol 2011; 78: pp. 334-341.

  • 12. Pinto A., Brunese L., Pinto F., et. al.: E-learning and education in radiology. Eur J Radiol 2011; 78: pp. 368-371.

  • 13. Kirschner P.A., Sweller J., Clark R.E.: Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educ Psychol 2006; 41: pp. 75-86.

  • 14. Crowley R.S., Medvedeva O.: A general architecture for intelligent tutoring of diagnostic classification problem solving. AMIA Annu Symp Proc 2003; pp. 185-189. Washington, DC

  • 15. Suebnukarn S., Haddawy P.: COMET: a collaborative tutoring system for medical problem-based learning. IEEE Intell Syst 2007; 22: pp. 70-77.

  • 16. Velan G., Ben-Naim D., Kumar R., et. al.: Adaptive tutorials using virtual slides to enhance learning of microscopic morphology. In: Richards G. (ed). Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education; 2009 Oct 26. Chesapeake, VA: AACE; 2009: pp. 759-763.

  • 17. Alliance of Medical Student Educators in Radiology. AMSER national medical student curriculum in radiology. 2013. Available at: https://www.aur.org/uploadedFiles/Alliances/AMSER/AMSER-Learning-Objectives-Listed-by-Subject(1).pdf Accessed April 2, 2015.

  • 18. O’Brien H.L., Toms E.G.: The development and evaluation of a survey to measure user engagement in e-commerce environments. J Am Soc Inf Sci Tech 2010; 61: pp. 50-69.

  • 19. O’Brien H.L., Toms E.G.: Examining the generalizability of the User Engagement Scale (UES) in exploratory search. Inf Process Manage 2013; 49: pp. 1092-1107.

  • 20. Maleck M., Fischer M.R., Kammer B., et. al.: Do computers teach better? A media comparison study for case-based teaching in radiology. Radiographics 2001; 21: pp. 1025-1032.

  • 21. Hammett R.J., Harris R.D.: Halting the growth in diagnostic testing. Med J Aust 2002; 177: pp. 124-125.

  • 22. Baird S., Özler B.: Examining the reliability of self-reported data on school participation. J Dev Econ 2012; 98: pp. 89-93.
