
Evaluation of Medical Student Experience Using Medical Student Created StudentPACS Flash Based PACS Simulator Tutorials for Learning Radiological Topics

Rationale and Objectives

With studies regularly containing hundreds of images, the authors believe that the ability to efficiently review numerous images and identify findings is an important skill to teach medical students. Using StudentPACS, an Adobe Flash extension developed within their department, the authors created modules that provide users with a virtual picture archiving and communication system environment in which findings can be selected with the mouse, triggering questions with referenced answers. The aim was to assess medical students’ impressions of how learning from these modules compared with their personal experiences learning radiology from textbooks or static images.

Materials and Methods

StudentPACS modules were created by medical students on a radiology elective under the supervision of resident and attending radiologists. MS I to MS IV students were then asked to complete StudentPACS modules that tied in with their current coursework, followed by an anonymous survey. A total of 293 students participated.

Results

The majority of students reported that StudentPACS modules were either equivalent to or better than learning from static images or textbooks (90 ± 3% [257 of 285], P < .00002), were not difficult to use (85 ± 4% [248 of 293], P < .00002), presented them with clinical content that tied in well with the depicted imaging (90 ± 3% [263 of 293], P < .00002), and taught them new information (69 ± 5% [202 of 293], P < .00002). Most respondents felt the StudentPACS modules presented information they would find useful in clinical practice (91 ± 3% [266 of 293], P < .00002), reported satisfactory experiences using StudentPACS modules as a source of self-directed learning material (79 ± 5% [232 of 293], P < .00002), and stated that they would use StudentPACS modules for learning different topics in the future (85.6 ± 4% [244 of 285], P < .00002).

Conclusion

Medical students found using StudentPACS modules at least equivalent to, if not better than, learning from textbooks or static annotated images.

With the growing use of cross-sectional imaging studies that regularly contain hundreds to thousands of images, we believe that the ability to efficiently review numerous images and identify important findings is an important skill that is not readily taught through the use of substantially smaller sets of static or textbook images. This lack of training can be further compounded for many medical students by limited experience directly reviewing studies using picture archiving and communication system (PACS) workstations.

To allow students the opportunity to experience searching for an abnormality on an image within a large set of images (ie, “looking for a needle in a haystack”), our department created an Adobe Flash (Adobe Systems Inc, Mountain View, CA) extension called StudentPACS that allowed us to design Flash-based radiology tutorials. These tutorials are unique in that they allow students to scroll through a series of images and, using the mouse, select abnormalities previously identified by our team. Selecting these abnormalities triggers a series of question-and-answer sets aimed at teaching students some aspect of radiology pertaining to that case. As part of a radiology elective, medical students created a number of these tutorials under the supervision of resident and attending radiologists. Once each tutorial was vetted by a faculty radiologist, it was made freely available on the elective Web site (http://studentpacs.com).
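To make the interaction model concrete, the sketch below shows one way a tutorial of this kind could be represented in code: an ordered stack of image frames plus author-defined clickable finding regions, each of which unlocks a question-and-answer set when hit. It is a minimal Python illustration, not the actual StudentPACS Flash/ActionScript implementation; the class names, fields, and sample case data are assumptions made for the example.

```python
from dataclasses import dataclass, field


@dataclass
class Finding:
    """A clickable abnormality on one frame (illustrative model, not the real StudentPACS schema)."""
    frame_index: int   # frame of the series on which the finding appears
    x: int             # bounding box of the finding, in image pixel coordinates
    y: int
    width: int
    height: int
    questions: list = field(default_factory=list)  # (question, referenced answer) pairs


@dataclass
class TutorialModule:
    """A case: an ordered image series plus the findings identified by the module authors."""
    title: str
    frames: list       # image file names, scrolled through one at a time
    findings: list     # Finding objects, vetted by a faculty radiologist

    def hit_test(self, frame_index, x, y):
        """Return the finding under a mouse click, if any; a hit would trigger its Q&A set."""
        for f in self.findings:
            if (f.frame_index == frame_index
                    and f.x <= x <= f.x + f.width
                    and f.y <= y <= f.y + f.height):
                return f
        return None


# Hypothetical example: one finding on frame 42 of a 300-image series.
module = TutorialModule(
    title="16-year-old with chest pain",
    frames=[f"frame_{i:03d}.png" for i in range(300)],
    findings=[Finding(frame_index=42, x=210, y=180, width=40, height=40,
                      questions=[("What is the finding?",
                                  "Referenced answer shown after the student clicks the region.")])],
)

hit = module.hit_test(42, 225, 200)  # simulates a click inside the finding's bounding box
print(hit.questions if hit else "No finding at this location")
```

In the actual tutorials, image scrolling and click handling are performed by the Flash runtime; the point of the sketch is only the pairing of author-identified clickable regions with referenced question sets.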


Materials and methods

Creating the Modules


Figure 1, StudentPACS (SPACs) module creation process during a typical elective. PACS, picture archiving and communication system.

Figure 2, StudentPACS Flash extension in use.


Figure 3, Sample online tutorial listing from the StudentPACS Web site (http://studentpacs.com).


Survey


Table 1

StudentPACS Modules Presented to Medical Students for Each Course

| StudentPACS Module | Difficulty∗ | Course | Level |
| --- | --- | --- | --- |
| 16-year-old with chest pain | Low | Gross anatomy | MS I course |
| | | Introduction to clinical medicine | MS II course |
| Abdominal mass | Medium | Pediatrics | MS III clerkship |
| Headache and mental status change | Medium | Psychology | MS III clerkship |
| Headache in 54-year-old smoker | Easy | Family medicine | MS III clerkship |
| Abdominal pain, nausea, and vomiting | Medium | Surgery | MS III clerkship |
| Flank pain, urinary tract infection, and fever | Medium | Medicine | MS III clerkship |
| Pregnancy | Medium | Obstetrics and gynecology | MS III clerkship |
| Bacteremia and abdominal pain | Medium | Intensive care unit | MS IV rotation |
| Headache in 35-year-old | Difficult | Neurology | MS IV rotation |
| Neck trauma | Medium | Emergency room | MS IV rotation |


Table 2

Summary of Collected Responses and Computed Statistics

Please rate the ease with which you were able to search through frames for abnormal radiographic findings. (293 answered, 0 NR)
- Easy: 119 (40.6% ± 6%); Neutral: 129 (44.0% ± 6%); Difficult: 45 (15.4% ± 4%)
- Not difficult (easy + neutral): 248 (85% ± 4%) vs difficult: 45 (15% ± 4%); z = ±11.86; P < .00002

How do you prefer to learn about radiographic abnormalities? (293 answered, 0 NR)
- Searching through multiple unlabeled image frames and looking for findings myself (e.g., similar to using a PACS): 142 (48.5% ± 6%); Viewing static images with findings already highlighted (e.g., JPEGs/textbook): 111 (37.9% ± 6%); No preference: 40 (13.7% ± 4%)
- Static images: 111 (38% ± 6%) vs not static images (unlabeled + no preference): 182 (62% ± 6%); z = ±4.15; P = .00002
- Unlabeled images: 142 (48% ± 6%) vs not unlabeled images (static + no preference): 151 (52% ± 6%); z = ±0.53; P = .6985

Compared to learning radiology from a textbook or static image, do you consider learning from StudentPACS modules: (285 answered, 8 NR)
- A better learning experience: 154 (54.0% ± 6%); An equivalent learning experience: 103 (36.1% ± 6%); A worse learning experience: 28 (9.8% ± 3%)
- Better: 154 (54% ± 6%) vs not better (equivalent + worse): 131 (46% ± 6%); z = ±1.36; P = .0869
- Not worse (better + equivalent): 257 (90% ± 3%) vs worse: 28 (10% ± 3%); z = ±13.56; P < .00002

Did the clinical correlations tie in well with the images reviewed in the module? (293 answered, 0 NR)
- Yes: 263 (89.8% ± 3%) vs no: 30 (10.2% ± 3%); z = ±13.63; P < .00002

Did you learn any new information from the StudentPACS module you just completed? (293 answered, 0 NR)
- Yes: 202 (68.9% ± 5%) vs no: 91 (31.1% ± 5%); z = ±6.47; P < .00002

Did the StudentPACS module present information you think may be useful in clinical practice? (293 answered, 0 NR)
- Yes: 266 (90.8% ± 3%) vs no: 27 (9.2% ± 3%); z = ±13.97; P < .00002

As a source of self-directed educational material, are you satisfied with your experience using StudentPACS modules? (293 answered, 0 NR)
- Yes: 232 (79.2% ± 5%) vs no: 61 (20.8% ± 5%); z = ±10.00; P < .00002

Please rate your overall satisfaction with using a StudentPACS module. (293 answered, 0 NR)
- Satisfied: 193 (66% ± 5%); Neutral: 64 (22% ± 5%); Dissatisfied: 36 (12% ± 4%)
- Not dissatisfied (satisfied + neutral): 257 (88% ± 4%) vs dissatisfied: 36 (12% ± 4%); z = ±12.91; P < .00002
- Satisfied: 193 (66% ± 5%) vs not satisfied (neutral + dissatisfied): 100 (34% ± 5%); z = ±5.43; P < .00002

Would you use a StudentPACS module for learning another topic in the future? (285 answered, 8 NR)
- Yes: 244 (85.6% ± 4%) vs no: 41 (14.4% ± 4%); z = ±12.02; P < .00002

CI, confidence interval; NR, nonresponders; PACS, picture archiving and communication system.


Survey Analysis


For each response category, a 95% confidence interval (CI) was calculated as

$$\mathrm{CI} = \pm 1.96 \sqrt{\frac{P(1 - P)}{n}},$$

where P is the proportion of respondents giving a specific response and n is the total number of responses for the question. For nonbinomial variables, student responses were grouped to yield binomial variables. To test the hypothesis that the observed proportion P was different (larger or smaller) from random guessing (for which P = .5), z scores were calculated for all binomial variables using the formula

$$z = \frac{P - .5}{\sqrt{\dfrac{.5 \times .5}{n}}},$$

and the significance values of the one-sided tests (P values) were determined by comparing the corresponding z score to the standard normal z-score distribution (Table 2).
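For readers who want to reproduce the reported statistics, the short Python sketch below applies the two formulas above to one of the reported results (248 of 293 "not difficult" responses). It is an illustrative reimplementation only and assumes SciPy for the normal cumulative distribution function; it is not the authors' original analysis code.

```python
from math import sqrt

from scipy.stats import norm  # standard normal CDF for the one-sided P value


def ci_halfwidth(p, n):
    """95% CI half-width: 1.96 * sqrt(P * (1 - P) / n)."""
    return 1.96 * sqrt(p * (1 - p) / n)


def z_score(p, n):
    """z score against random guessing (P = .5): (P - .5) / sqrt(.5 * .5 / n)."""
    return (p - 0.5) / sqrt(0.25 / n)


# Example from Table 2: 248 of 293 students found the modules "not difficult" to use.
count, n = 248, 293
p = count / n
z = z_score(p, n)
p_value = 1 - norm.cdf(z)  # one-sided P value for the proportion exceeding .5

print(f"proportion = {p:.3f} +/- {ci_halfwidth(p, n):.3f}")  # ~0.846 +/- 0.041, i.e., 85% +/- 4%
print(f"z = {z:.2f}, one-sided P = {p_value:.2g}")           # z ~ 11.86; P is effectively 0 (< .00002)
```

Running the same calculation with the counts for any other question in Table 2 reproduces the corresponding confidence intervals, z scores, and P values.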


Results


Table 3

Self-Reported Level of Computer Expertise by Medical School Year (n = 289∗)

| Level of Expertise | MS I | MS II | MS III | MS IV | Response Percentage | Response Count |
| --- | --- | --- | --- | --- | --- | --- |
| Limited | 0 (0%) | 2 (1%) | 1 (1%) | 0 (0%) | 1.0% | 3 |
| Basic | 0 (0%) | 13 (8%) | 2 (2%) | 1 (4%) | 5.5% | 16 |
| Average | 8 (42%) | 98 (64%) | 61 (68%) | 14 (54%) | 62.6% | 181 |
| Advanced | 11 (58%) | 41 (27%) | 26 (29%) | 11 (42%) | 30.8% | 89 |
| Total | 19 | 154 | 90 | 26 | | |


Discussion


Conclusions


Acknowledgments



This post is licensed under CC BY 4.0 by the author.