
Linear Versus Web-Style Layout of Computer Tutorials for Medical Student Learning of Radiograph Interpretation

Rationale and Objective

We sought to determine which is more effective in increasing skill in radiograph interpretation: a linear (PowerPoint-style) computer tutorial that locks the student into a fixed path through the material or a branched (Web-style) version that allows random access.

Materials and Methods

We prepared a computer tutorial for learning how to interpret cervical spine radiographs. The tutorial has 66 screens, with radiographs or graphics on almost every page and five unknown radiographs for the student to interpret. One version (linear) presents the material in a fixed sequence, with each unknown radiograph heading a “chapter” detailing an important aspect of the task. In the second (branched) version, the same 66 screens were accessed through hyperlinks in a frame beside the unknown radiographs. One hundred thirty-nine medical students at two sites participated in a randomized, single-blinded, controlled experiment. They interpreted cervical spine images as a pretest, completed one of the two tutorial versions, and then took the same examination as a post-test.

Results

The tutorial was successful, in both layouts, in improving the subjects’ ability to interpret cervical spine radiographs (effect size 2.1; 95% confidence interval 1.7–2.5). However, the layout made no difference to the gain in ability. Students in the Web-style (branched) group completed the tutorial in 17% less time (P < .001) but were slightly less likely to rate the tutorial as “valuable.”

Conclusion

For these novice learners, computer tutorial layout does not affect knowledge gain. Students may be more satisfied with the linear layout, but in time-pressured situations, the Web-style layout may be preferable because it is more efficient.

The skill of radiograph interpretation includes recognition of visual features and the use of a procedure to ensure that all relevant information is considered. Students learn this skill through a combination of instruction and practice. Computer tutorials are an excellent medium for this learning: they facilitate the aggregation of representative radiograph examples, and they allow novel instructional designs that are more interactive than textbooks or radiology teaching files ( ). A large number of online resources and CD-ROM computer tutorials are available for this particular skill ( ). Unfortunately, rigorous evaluations of their educational impact are lacking.

These computer tutorials are generally laid out in one of two formats. The first is a linear, PowerPoint-style presentation, which is behaviorist in that the goals are prespecified and the student’s path through the material is largely determined ahead of time by the instructor. The behaviorist philosophy holds that learning takes place best when a desired change in behavior is prespecified ( ). The educator provides a program of instruction specific to named objectives. Typical examples of instructional designs based on this “top-down” philosophy are lectures and programmed instruction. This method has the advantage of being very efficient.

Methods

Participants

Computer Tutorials

Figure 1. Schematic representation of the layouts of the two tutorials. In the linear group, the content was presented as serial screens of information; the unknown images appear in sequence as one of the first screens of each section. For example, the “Alignment” section begins with an image whose key feature was malalignment of the cervical spine. In the Web-layout group, the content was organized around serial unknown images: the tutorial included the same number of unknown images, but they were grouped together to form the backbone of the tutorial, and the knowledge content could be accessed in any order from any of them. For each unknown, a hyperlink led the student to the appropriate section for learning about the given anomaly.
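
To make the structural difference concrete, the sketch below models the two navigation schemes in Python. It is an illustration only, not the authors’ software: the Screen class, chapter names, and link targets are hypothetical. The linear layout is a fixed sequence whose only affordance is “next screen”; the Web-style layout hangs the same content screens off each unknown image as hyperlinks that can be followed in any order.

```python
# Minimal sketch (not the authors' code) of the two layouts in Figure 1.
# All names (Screen, next_screen, the link targets) are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Screen:
    title: str
    is_unknown: bool = False                         # presents an unknown radiograph
    links: list[str] = field(default_factory=list)   # hyperlink targets (branched only)

# Linear layout: a fixed sequence in which the unknown case heads its chapter
# and the student's only move is "next screen".
linear = [
    Screen("Unknown 1: malalignment case", is_unknown=True),
    Screen("Alignment: the normal lines"),
    Screen("Alignment: the C1-C2 relationship"),
]

def next_screen(seq: list[Screen], i: int) -> Screen | None:
    """The linear tutorial's single navigation operation."""
    return seq[i + 1] if i + 1 < len(seq) else None

# Branched (Web-style) layout: the unknown images form the backbone, and each
# one links into the content screens, which can be read in any order.
branched = {
    "Unknown 1": Screen("Unknown 1: malalignment case", is_unknown=True,
                        links=["Alignment: the normal lines",
                               "Alignment: the C1-C2 relationship"]),
}

print(next_screen(linear, 0).title)   # forced path through the material
print(branched["Unknown 1"].links)    # student chooses the order
```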

Linear tutorial

Figure 2. Three screen captures from the linear cervical spine x-ray computer tutorial. (a) An unknown case is presented; the student is asked to click with the left mouse button on a feature suggesting that the radiograph may be abnormal. (b) After the student clicks on the unknown screen, the tutorial presents a textual explanation of the features. (c) Clicking the “Outline” button of the textual explanation reveals a visual explanation in which the features on the radiograph are segmented and interpreted.
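
The caption describes a click-to-identify interaction. Below is a minimal hit-testing sketch of that step, assuming a simple bounding-box model of the abnormal feature; the Region class, the coordinates, and the screen names are hypothetical, not taken from the tutorial’s code.

```python
# A sketch (assumed, not the authors' implementation) of the Figure 2
# interaction: check whether the student's click on the unknown radiograph
# falls inside the region of the abnormal feature.

from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned bounding box for an abnormal feature, in image pixels."""
    x0: int
    y0: int
    x1: int
    y1: int

    def contains(self, x: int, y: int) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def handle_click(x: int, y: int, abnormal: Region) -> str:
    """Decide which explanation screen to show after a click on the unknown."""
    if abnormal.contains(x, y):
        return "textual_explanation"   # the "Outline" button then reveals the visual one
    return "try_again_hint"

# Example: a click near the C1-C2 junction of a 512x512 radiograph.
print(handle_click(250, 120, abnormal=Region(220, 90, 300, 160)))
```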

Web-style tutorial

Figure 3. Screen captures from the Web-style layout computer tutorial. All screens can be accessed through hyperlinks arranged in a frame along the left-hand side of the screen. The right-arrow button takes the student to the next unknown case. The radiograph shows the same case of C1–C2 malalignment as in Figure 2.

Experimental Design

Outcome Measure

Procedure

Data Collection

Data Analysis

Item scoring

Item analysis

Reliability analysis

Outcome analysis

Student path through tutorial

Results

Figure 4. Study flow diagram.

Instrument

Item analysis

Reliability of instrument

Outcomes

Classifying the images by specific lesion

Table 1

Measured Outcomes for the Students Comparing the Two Intervention Groups

| Measure | Branched Group (n = 69) | Linear Group (n = 70) | Overall (n = 139) | Statistical Significance |
|---|---|---|---|---|
| **Scores: precisely identifies lesion (marks of seven abnormal findings)** | | | | |
| Pretest, mean (SD) | 0.9 (1.0) | 0.8 (0.9) | 0.9 (1.0) | |
| Post-test, mean (SD) | 3.9 (1.2) | 3.9 (1.2) | 3.9 (1.2) | |
| Difference, mean (SD) | 3.0 (1.4) | 3.1 (1.5) | 3.1 (1.4) | |
| Effect size⁎: pretest versus post-test (95% CI) | 2.1 (1.7 to 2.6) | 2.1 (1.7 to 2.5) | 2.1 (1.7 to 2.5) | F(1,137) = 615; P < .001 |
| Effect size⁎†: branched versus linear (95% CI) | −0.06 (−0.39 to +0.27) | | | F(1,137) = 0.3; P = .61 |
| Time on tutorial (min), mean (SD) | 24.3 (9.7) | 29.2 (8.2) | 26.8 (8.8) | 95% CI of difference: 2.3 to 8.0 |
| Screens viewed, mean (SD) | 112 (61) | 135 (33) | 124 (50) | 95% CI of difference: 7 to 40 |
| Time per screen viewed (s), mean (SD) | 14 (5) | 13 (4) | 14 (4) | P = NS |
| Learning efficiency: marks of 10 improved from pretest to post-test per minute spent on tutorial (SD) | 0.12 (0.13) | 0.06 (0.07) | 0.09 (0.11) | P = .004‡ |

Scores indicate whether the student was able to identify the exact lesion reported by the radiologist.
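
As a check on the table’s arithmetic, the snippet below recomputes two of the reported quantities from the rounded group means. It assumes the effect size was calculated as the mean gain divided by the standard deviation of the gain, which with the rounded inputs reproduces the reported 2.1 only approximately; it also confirms the abstract’s statement that the Web-style (branched) group spent about 17% less time on the tutorial.

```python
# Worked check (not the authors' analysis code) of two quantities in Table 1,
# computed from the rounded group means given in the table.

mean_gain, sd_gain = 3.1, 1.4     # overall pretest-to-post-test difference (SD)
effect_size = mean_gain / sd_gain  # assumed definition: gain / SD of gain
print(f"effect size ~ {effect_size:.1f}")  # ~2.2 from rounded inputs; 2.1 reported

# Time saving of the branched (Web-style) group relative to the linear group.
t_branched, t_linear = 24.3, 29.2  # mean minutes on tutorial
saving = (t_linear - t_branched) / t_linear
print(f"branched group took {saving:.0%} less time")  # ~17%, as in the abstract
```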

Table 2

Examination Scores Before and After the Computer Tutorial, Based on the Radiograph Finding

| Finding (# of questions) | View | Pretest Score, Mean (SEM) | Post-test Score, Mean (SEM) | Difference (95% CI) | Effect Size for Computer Tutorial |
|---|---|---|---|---|---|
| Normal ( ) | Lateral | 0.63 (0.02) | 0.31 (0.02) | −0.32 (−0.38, −0.26) | −0.64 |
| C1/C2 malalignment ( ) | Odontoid | 0.13 (0.03) | 0.77 (0.04) | +0.64 (0.55, 0.75) | 1.3 |
| Increased predental space ( ) | Lateral | 0.29 (0.03) | 0.83 (0.02) | +0.54 (0.47, 0.60) | 1.1 |
| Vertebral subluxation ( ) | Lateral | 0.02 (0.01) | 0.27 (0.03) | +0.25 (0.19, 0.30) | 0.70 |
| Vertebral body crush ( ) | Lateral | 0.03 (0.01) | 0.14 (0.03) | +0.11 (0.05, 0.17) | 0.39 |
| Inadequate film: C7 not visible ( ) | Lateral | 0.08 (0.02) | 0.81 (0.03) | +0.72 (0.64, 0.80) | 1.44 |

Scores represent the proportion of subjects who were able to correctly identify the level and nature of the abnormality.

Classifying the images as normal versus abnormal

Time and learning efficiency

Subjective rating

Qualitative analysis of paths

Figure 5. Visual representations of the paths taken by the students through the tutorial. Each row represents the log file of one student, and each cell in the grid represents a single screen viewed in the computer tutorial, color-coded by the chapter to which that screen belongs. Only the first 60 screens viewed are shown. (See Results for interpretation.)
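
A path grid of this kind is straightforward to render from the log files. The sketch below is an assumption rather than the authors’ plotting code: it draws one row per student log and one colored cell per screen viewed, using randomly generated placeholder data and hypothetical chapter names.

```python
# A sketch (assumed, not the authors' code) of the Figure 5 path grids:
# one row per student log, one cell per screen viewed, colored by chapter.

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap

chapters = ["Unknowns", "Alignment", "Bones", "Cartilage", "Soft tissue"]
cmap = ListedColormap(["black", "tab:blue", "tab:orange", "tab:green", "tab:red"])

# logs[s, k] = chapter index of the k-th screen viewed by student s.
# Fake data for illustration; real values would come from the tutorial logs.
rng = np.random.default_rng(0)
logs = rng.integers(0, len(chapters), size=(20, 60))  # 20 students, first 60 screens

fig, ax = plt.subplots(figsize=(8, 4))
ax.imshow(logs, cmap=cmap, aspect="auto", interpolation="nearest")
ax.set_xlabel("Screen viewed (first 60)")
ax.set_ylabel("Student")
ax.set_yticks([])
plt.show()
```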

Discussion

Conclusions

Acknowledgments

References

  • 1. Erkonen W.E., D’Alessandro M.P., Galvin J.R., Albanese M.A., Michaelsen V.E.: Longitudinal comparison of multimedia textbook instruction with a lecture in radiology education. Acad Radiol 1994; 1: pp. 287-292.

  • 2. Jaffe C.C., Lynch P.J.: Computer-aided instruction in radiology: Opportunities for more effective learning. AJR Am J Roentgenol 1995; 164: pp. 463-467.

  • 3. Skinner B.F.: Science and Human Behavior. New York: Macmillan, 1953.

  • 4. Bloom B.S.: Taxonomy of Educational Objectives. New York: Longmans, Green, 1956.

  • 5. Gagné R.M., Briggs L.J.: Principles of Instructional Design. New York: Holt, Rinehart and Winston, 1974.

  • 6. Jonassen D.H.: Objectivism versus constructivism: Do we need a new philosophical paradigm?. Educ Technol Res Dev 1991; 39: pp. 5-14.

  • 7. Viccellio P., Simon H., Pressman B.D., Shah M.N., Mower W.R., Hoffman J.R.: A prospective multicenter study of cervical spine injury in children. Pediatrics 2001; 108: pp. e20.

  • 8. Crocker L., Algina J.: Item analysis. In: Introduction to Classical and Modern Test Theory. Belmont, CA: Thomson Learning, 1986, pp. 311-338.

  • 9. Wright B.D.: Rack and stack: Time 1 versus time 2. Rasch Measure Trans 2003; 17: pp. 905-906.

  • 10. Crocker L., Algina J.: Procedures for estimating reliability. In: Introduction to Classical and Modern Test Theory. Belmont, CA: Thomson Learning, 1986, pp. 131-156.

  • 11. Cohen J.: Statistical Power Analysis for the Behavioral Sciences. Mahwah, NJ: Lawrence Erlbaum Associates, 1988.

  • 12. Cumming G., Finch S.: A primer on the understanding, use, and calculation of confidence intervals that are based on central and noncentral distributions. Educ Psychol Measure 2001; 61: pp. 532-574.

  • 13. Kalb B., Gay S.B.: Internet resources for education in radiology. Acad Radiol 2003; 10: pp. S81-S86.

  • 14. Scarsbrook A.F., Graham R.N., Perriss R.W.: The scope of educational resources for radiologists on the internet. Clin Radiol 2005; 60: pp. 524-530.

  • 15. ACR Campus. Available at: http://campus.acr.org/acr/Index.aspx. Accessed April 24, 2007.

  • 16. AuntMinnie.com. Radiology Learning Center. Available at: http://www.auntminnie.com/index.asp?Sec=edu. Accessed April 24, 2007.

  • 17. Grunewald M., Heckemann R.A., Wagner M., Bautz W.A., Greess H.: ELERA: A WWW application for evaluating and developing radiologic skills and knowledge. Acad Radiol 2004; 11: pp. 1381-1388.

  • 18. Grunewald M., Heckemann R.A., Gebhard H., Lell M., Bautz W.A.: COMPARE radiology: Creating an interactive Web-based training program for radiology with multimedia authoring software. Acad Radiol 2003; 10: pp. 543-553.

  • 19. Angle J.F., Gay S.B., Hagspiel K.D., Spinosa D.J., Leung D.A., Matsumoto A.H.: Relative value of analog and digital teaching files. Acad Radiol 2002; 9: pp. 205-210.

  • 20. Blunt D., O’Regan D.: Using PACS as a teaching resource. Br J Radiol 2005; 78: pp. 483-484.

  • 21. Scarsbrook A.F., Foley P.T., Perriss R.W., Graham R.N.: Radiological digital teaching file development: An overview. Clin Radiol 2005; 60: pp. 831-837.

  • 22. Weinberger E., Jakobovits R., Halsted M.: MyPACS.net: A Web-based teaching file authoring tool. AJR Am J Roentgenol 2002; 179: pp. 579-582.

  • 23. Bartlett E.S., Maley J.E., Fajardo L.L.: Radiology residency eCurriculum developed in-house: Evaluation of benefits and weaknesses. Acad Radiol 2003; 10: pp. 657-663.

  • 24. Roubidoux M.A., Chapman C.M., Piontek M.E.: Development and evaluation of an interactive Web-based breast imaging game for medical students. Acad Radiol 2002; 9: pp. 1169-1178.

  • 25. Chew F.S., Relyea-Chew A.: Distributed Web-supported radiology clerkship for the required clinical clerkship year of medical school: Development, implementation, and evaluation. Acad Radiol 2002; 9: pp. 713-720.

  • 26. D’Alessandro M.P., Galvin J.R., Erkonen W.E., Albanese M.A., Michaelsen V.E., Huntley J.S., et. al.: The instructional effectiveness of a radiology multimedia textbook (HyperLung) versus a standard lecture. Invest Radiol 1993; 28: pp. 643-648.

  • 27. Lieberman G., Abramson R., Volkan K., McArdle P.J.: Tutor versus computer: A prospective comparison of interactive tutorial and computer-assisted instruction in radiology education. Acad Radiol 2002; 9: pp. 40-49.

  • 28. Howerton W.B., Enrique P.R., Ludlow J.B., Tyndall D.A.: Interactive computer-assisted instruction vs. lecture format in dental education. J Dent Hyg 2004; 78: pp. 10.

  • 29. Maleck M., Fischer M.R., Kammer B., et. al.: Do computers teach better?. Radiographics 2001; 21: pp. 1025-1032.

  • 30. Collins J., Dotti S.L., Albanese M.A.: Teaching radiology to medical students: an integrated approach. Acad Radiol 2002; 9: pp. 1046-1053.

  • 31. Mileman P.A., van den Hout W.B., Sanderink G.C.: Randomized controlled trial of a computer-assisted learning program to improve caries detection from bitewing radiographs. Dentomaxillofac Radiol 2003; 32: pp. 116-123.

  • 32. Greenhalgh T.: Computer assisted learning in undergraduate medical education. BMJ 2001; 322: pp. 40-44.

  • 33. Chumley-Jones H.S., Dobbie A., Alford C.L.: Web-based learning: Sound educational method or hype?. Acad Med 2002; 77: pp. S86-S93.

  • 34. Letterie G.S.: Medical education as a science: The quality of evidence for computer-assisted instruction. Am J Obstet Gynecol 2003; 188: pp. 849-853.

  • 35. Friedman C.P.: The research we should be doing. Acad Med 1994; 69: pp. 455-457.

  • 36. Cook D.A.: The research we still are not doing: An agenda for the study of computer-based learning. Acad Med 2005; 80: pp. 541-548.

  • 37. Valcke M., De Wever B.: Information and communication technologies in higher education: Evidence-based practices in medical education. Med Teach 2006; 28: pp. 40-48.

  • 38. Mayer R.E.: Models for understanding. Rev Educ Res 1989; 59: pp. 43-64.

  • 39. Large A.: Hypertext instructional programs and learner control: A research review. Educ Info 1996; 14: pp. 95-106.

  • 40. Schmidt R.A., Bjork R.A.: New conceptualizations of practice: Common principles in three paradigms suggest new concepts for training. Psychol Sci 1992; 3: pp. 207-217.

  • 41. Eisen L.A., Berger J.S., Hegde A., Schneider R.F.: Competency in chest radiography. J Gen Intern Med 2006; 21: pp. 460-465.

  • 42. Leblanc V.R., Brooks L.R., Norman G.R.: Believing is seeing: The influence of a diagnostic hypothesis on the interpretation of clinical features. Acad Med 2002; 77: pp. S67-S69.

  • 43. Myles-Worsley M., Johnston W.A., Simons M.A.: The influence of expertise on x-ray image processing. J Exp Psychol Learn Mem Cogn 1988; 14: pp. 553-557.
