Flexible Image Evaluation

Rationale and Objectives

Studies have highlighted the potential of handheld viewing devices for rapid diagnosis, smartphone usage among physicians and radiologists is known to be increasing, and the clinical applicability of handheld devices has been demonstrated for computed tomography (CT) spinal injury cases. Magnetic resonance (MR) imaging, however, is the accepted gold standard for spinal imaging, providing visualization of both ligament and spinal cord pathology. This study investigated the diagnostic accuracy of the iPad, the most probable and financially viable alternative display device outside the radiology environment, for reviewing emergency spinal MR images, in comparison with a secondary-class LCD device of the type used for the interpretation of CT and MR imaging examinations.

Materials and Methods

In total, 31 MR cases were reviewed, comprising positives (n = 13), each containing one of four possible presentations: spinal cord compression, cauda equina syndrome, spinal cord hemorrhage, or spinal cord edema, and controls (n = 18). Ziltron iPad software facilitated the display of cases and the receiver operating characteristic (ROC) analysis. Thirteen American Board of Radiology board-certified radiologists reviewed all cases on both displays under standardized viewing conditions.

Results

Dorfman-Berbaum-Metz multireader-multicase (DBM MRMC) analysis was performed for random readers/random cases, fixed readers/random cases, and random readers/fixed cases. No statistically significant differences (P ≤ .05) were found between the iPad and the secondary-class display in area under the curve, sensitivity, or specificity.

Conclusion

On DBM MRMC analysis, the iPad performed with diagnostic accuracy equal to that of the secondary-class LCD device, demonstrating that the iPad is an option to aid initial review of emergency spinal MR cases.

There has been much debate in recent years surrounding the application of handheld devices such as personal digital assistants (PDAs), smartphones, and more recently the Apple iPad, in health care. The potential applications for such devices and their utility in medicine are clear, with studies in 2003 and 2005 suggesting that 46% of nonradiology attending physicians and trainees in one tertiary care academic medical center, and approximately 45% of randomly selected active and training radiologists who were members of the Radiological Society of North America, were using PDAs. Aside from scheduling and calendar applications, the nonradiology physicians used their devices for accessing drug information programs, medical references, and medical calculators. In the 2005 study, only 24.6% of surveyed radiologists had a radiology application installed on their devices, and many remained skeptical about the potential utility of PDAs for viewing entire imaging studies directly from a picture archiving and communication system (PACS). The radiologists identified memory capacity, software availability, and screen resolution as the important factors influencing any decision to purchase a PDA. There is much anecdotal evidence to suggest that usage rates among physicians and radiologists are much greater following ongoing developments in smartphone technology, along with the introduction of the iPhone and iPad. Since its launch in April 2010, the iPad itself has generated significant interest in terms of its role in medicine and its potential application for the display of radiological images.

Although these devices are used in modern medicine as outlined, the iPad, with its larger display size and superior contrast ratio compared to other handheld devices, warrants closer investigation in terms of its utility in radiology. Many of the other previously identified limitations of handheld devices, such as user interface, inherently low resolution, poor connectivity, slow data transfer, available software, processor speed, memory, data security, and Digital Imaging and Communications in Medicine (DICOM) compatibility, have now been overcome or have at least progressed. According to these studies, the consensus is that such handheld devices have the greatest potential for accessing radiological images remote to the radiology department, or indeed remote to the institution, for initial review purposes; they can be used to discern primary pathologies but should not be used to help prepare radiological reports. The application of such technology to primary diagnosis of emergency radiology examinations has been explored by several authors for a range of clinical scenarios and handheld technologies. Toomey et al explored the use of PDAs and the iPod Touch for the detection of orthopedic fractures on radiographs and intracranial hemorrhage on computed tomography (CT), Choudhri et al undertook preliminary work exploring the utility of the iPhone for the review of abdominal CT in the evaluation of acute appendicitis, and Rosenberg et al explored the impact of reviewing CT brain examinations for the neurosurgical triage of patients.

Materials and Methods

Overview

Equipment

Table 1

Comparison of the Two Display Devices Used for the Study

| Specification | ViewSonic (VP201m) | iPad |
|---|---|---|
| Maximum luminance∗ | 157 cd/m² | 300.1 cd/m² |
| Minimum luminance∗ | 0.4 cd/m² | 0.45 cd/m² |
| Contrast ratio∗ | 392:1 | 667:1 |
| Display resolution | 1200 × 1600 pixels | 1024 × 768 pixels |
| Screen type | LCD | LED backlit |
| Screen size (in) | 20.1 (51.0) | 9.7 (24.3) |
| Interaction method | Mouse | Multitouch touchscreen |

LCD, liquid crystal display; LED, light-emitting diode.

Values in parentheses are centimeters.
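
The contrast ratios in Table 1 follow directly from the luminance figures: contrast ratio is simply maximum luminance divided by minimum luminance. A minimal sketch of that arithmetic (the function name is illustrative, not from the study):

```python
def contrast_ratio(l_max: float, l_min: float) -> float:
    """Contrast ratio = maximum luminance / minimum luminance (both in cd/m^2)."""
    return l_max / l_min

# Luminance values from Table 1
viewsonic = contrast_ratio(157.0, 0.4)    # ~392:1, as reported
ipad = contrast_ratio(300.1, 0.45)        # ~667:1, as reported
```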

Images

Software

Figure 1. A user interacting with a T1-weighted sagittal image from a positive cervicodorsal spine magnetic resonance imaging examination, using the Ziltron iPad software employed in this study.

Figure 2. The Ziltron graphical user interface, showing a T2-weighted sagittal image from a positive lumbar spine magnetic resonance imaging examination.

Observers

Table 2

Binormal (Dorfman-Berbaum-Metz Multireader-Multicase) and Empirical/Trapezoidal (Ziltron) Area under the Receiver Operating Characteristic Curve, Sensitivity and Specificity Scores and Difference for all Readers

| Reader | iPad AUC (Binormal) | iPad AUC (Empirical) | iPad Sensitivity | iPad Specificity | VP201m AUC (Binormal) | VP201m AUC (Empirical) | VP201m Sensitivity | VP201m Specificity | AUC Difference (Binormal) |
|---|---|---|---|---|---|---|---|---|---|
| 1ₘ | 0.943 | 0.91 | 0.92 | 0.72 | 0.890 | 0.86 | 0.85 | 0.67 | −0.053 |
| 2ₘ | 0.843 | 0.82 | 0.85 | 0.72 | 0.907 | 0.89 | 0.92 | 0.89 | 0.064 |
| 3ᵢ | 0.844 | 0.82 | 0.62 | 0.83 | 0.886 | 0.86 | 0.85 | 0.83 | 0.042 |
| 4ₘ | 0.921 | 0.89 | 0.85 | 0.78 | 0.870 | 0.85 | 0.85 | 0.67 | −0.051 |
| 5ᵢ | 0.902 | 0.90 | 0.85 | 0.89 | 0.867 | 0.85 | 0.62 | 0.94 | −0.035 |
| 6ₘ | 0.870 | 0.85 | 0.77 | 0.67 | 0.809 | 0.78 | 0.85 | 0.61 | −0.061 |
| 7ᵢ | 0.895 | 0.83 | 0.92 | 0.67 | 0.926 | 0.87 | 0.92 | 0.78 | 0.031 |
| 8ₘ | 0.950 | 0.92 | 0.92 | 0.89 | 0.933 | 0.90 | 0.77 | 0.89 | −0.017 |
| 9ₘ | 0.893 | 0.75 | 0.85 | 0.56 | 0.863 | 0.83 | 0.92 | 0.56 | −0.030 |
| 10ₘ | 0.874 | 0.82 | 0.85 | 0.72 | 0.877 | 0.81 | 0.77 | 0.83 | 0.003 |
| 11ᵢ | 0.689 | 0.75 | 0.77 | 0.72 | 0.861 | 0.86 | 0.62 | 0.89 | 0.172 |
| 12ᵢ | 0.938 | 0.91 | 1.0 | 0.67 | 0.887 | 0.85 | 0.85 | 0.83 | −0.051 |
| 13ᵢ | 0.852 | 0.77 | 0.77 | 0.78 | 0.950 | 0.88 | 0.92 | 0.83 | 0.098 |
| Mean | 0.878 (0.0675) | 0.842 (0.061) | 0.842 (0.095) | 0.740 (0.094) | 0.887 (0.0367) | 0.853 (0.032) | 0.824 (0.104) | 0.786 (0.120) | 0.009 (0.070) |

AUC, area under the curve.

Subscript indicates which display device was used for first viewing (m = monitor; i = iPad).

Analysis of variance: empirical AUC, P = .55; sensitivity, P = .654; specificity, P = .285.

*Standard deviations shown in parentheses.

Statistical Analysis

Results

Table 3

Results of Dorfman-Berbaum-Metz Multireader-Multicase Analysis

| Analysis | iPad (Mean AUC = 0.878), 95% CI | ViewSonic VP201m (Mean AUC = 0.887), 95% CI | Difference (Mean AUC = −0.009), 95% CI | F | P |
|---|---|---|---|---|---|
| Random readers and cases | 0.785–0.972 | 0.791–0.982 | −0.051 to 0.034 | 0.19 | .6696 |
| Fixed readers, random cases | 0.783–0.973 | 0.786–0.988 | −0.041 to 0.023 | 0.29 | .5961 |
| Random readers, fixed cases | 0.837–0.919 | 0.865–0.909 | −0.051 to 0.034 | 0.19 | .6696 |

AUC, area under the curve.
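
The DBM MRMC analysis behind Table 3 converts each reader's AUC into case-level jackknife pseudovalues and then treats those pseudovalues as the response in a mixed-effects analysis of variance. A minimal sketch of the pseudovalue step only (data and names are illustrative; the full DBM ANOVA and the binormal curve fitting are not shown):

```python
def empirical_auc(pos, neg):
    """Trapezoidal AUC with ties counted as 0.5."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def jackknife_pseudovalues(pos, neg):
    """Pseudovalue for case k: n*AUC(all cases) - (n-1)*AUC(all but case k)."""
    n = len(pos) + len(neg)
    full = empirical_auc(pos, neg)
    pvals = [n * full - (n - 1) * empirical_auc(pos[:i] + pos[i + 1:], neg)
             for i in range(len(pos))]
    pvals += [n * full - (n - 1) * empirical_auc(pos, neg[:j] + neg[j + 1:])
              for j in range(len(neg))]
    return pvals

# Illustrative ratings for one reader on one display: one pseudovalue per case
pv = jackknife_pseudovalues([5, 4, 3, 5], [2, 1, 3, 1])
```

In the full DBM procedure these pseudovalues are computed per reader, per display, and per case, and the display (modality) effect is tested with an F statistic, yielding P values such as those in Table 3.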

Sensitivity and Specificity

Discussion

Conclusion

Acknowledgments

References

  • 1. McLeod T.G., Ebbert J.O., Lymp J.F.: Survey assessment of personal digital assistant use among trainees and attending physicians. J Am Med Inform Assoc 2003; 10: pp. 605-607.

  • 2. Boonn W.W., Flanders A.E.: Informatics in radiology (infoRAD): survey of personal digital assistant use in radiology. RadioGraphics 2005; 25: pp. 537-541.

  • 3. Avitzur O.: How neurologists are using the newest tablets—in and out of the clinic. Neurology Today 2010; 10: pp. 22-24.

  • 4. Berger E.: The iPad: gadget or medical Godsend? Ann Emerg Med 2010; 56: pp. 21-22A.

  • 5. Shah K., Siegel E., Patel A.: Radiographic applications of the Apple iPad: is portable, multitouch radiology in our future? Proc Radiol Soc N Am 2010; SSC08–03.

  • 6. Weiss F., Jeudy J., Sahin A., et al.: TB or not TB? Answering an age old question using a modern device: the iPad. Proc Radiol Soc N Am 2010; SSC08–07.

  • 7. Flanders A.E., Wiggins R.H., Gozum M.E.: Handheld computers in radiology. RadioGraphics 2003; 23: pp. 1035-1047.

  • 8. McEntee M.F., Toomey R.J.: Handheld devices for emergency radiologic consultation. Hosp Imaging Radiol Eur 2010; 5. Available at: http://www.hospitalradiologyeurope.com/default.asp?title=Handheld%5Fdevices%5Ffor%5Femergency%5Fradiological%5Fconsultation&page=article.display&article.id=25033

  • 9. Raman B., Raman R., Raman L., et al.: Radiology on handheld devices: image display, manipulation, and PACS integration issues. RadioGraphics 2004; 24: pp. 299-310.

  • 10. Toomey R.J., Ryan J.T., McEntee M.F., et al.: Diagnostic efficacy of handheld devices for emergency radiologic consultation. AJR Am J Roentgenol 2010; 194: pp. 469-474.

  • 11. Choudhri A.F., Radvany M.G.: Initial experience with a handheld device digital imaging and communications in medicine viewer: OsiriX mobile on the iPhone. J Digit Imaging 2011; 24: pp. 184-189.

  • 12. Choudhri A., Carr T., Ho C., et al.: Handheld device review of abdominal CT for the evaluation of acute appendicitis. Proc Radiol Soc N Am 2009; SSE09–03.

  • 13. Rosenberg M.S., Bullard T., Ladde J., et al.: Can digital photographs of CT images obtained and transferred by cell phone be used to predict the need to transfer to tertiary care center? Ann Emerg Med 2010; 56: pp. S57.

  • 14. American Association of Physicists in Medicine, Task Group 18: AAPM On-line Report No. 3. Assessment of display performance for medical imaging systems [online]. American Association of Physicists in Medicine Task Group 18 Imaging Informatics Subcommittee, 2005. Available at: http://www.aapm.org/pubs/reports/OR_03.pdf

  • 15. Food and Drug Administration: Press release: FDA clears first diagnostic radiology application for mobile devices. February 4, 2011. Available at: http://www.fda.gov/NewsEvents/Newsroom/PressAnnouncements/ucm242295.htm

  • 16. Brennan P.C., McEntee M., Evanoff M., et al.: Ambient lighting: effect of illumination on soft-copy viewing of radiographs of the wrist. AJR Am J Roentgenol 2007; 188: pp. W177-W180.

  • 17. Dorfman D.D., Berbaum K.S., Metz C.E.: Receiver operating characteristic rating analysis: generalization to the population of readers and patients with the jackknife method. Invest Radiol 1992; 27: pp. 723-731.

  • 18. Dorfman D.D., Berbaum K.S., Lenth R.V., et al.: Monte Carlo validation of a multireader method for receiver operating characteristic discrete rating data: factorial experimental design. Acad Radiol 1998; 5: pp. 591-602.

  • 19. Hillis S.L., Berbaum K.S.: Power estimation for the Dorfman-Berbaum-Metz method. Acad Radiol 2004; 11: pp. 1260-1273.

  • 20. Hillis S.L., Obuchowski N.A., Schartz K.M., et al.: A comparison of the Dorfman-Berbaum-Metz and Obuchowski-Rockette methods for receiver operating characteristic (ROC) data. Stat Med 2005; 24: pp. 1579-1607.

  • 21. Hillis S.L.: Monte Carlo validation of the Dorfman-Berbaum-Metz method using normalized pseudovalues and less data-based model simplification. Acad Radiol 2005; 12: pp. 1534-1541.

  • 22. Hillis S.L.: A comparison of denominator degrees of freedom for multiple observer ROC analysis. Stat Med 2007; 26: pp. 596-619.

  • 23. Hillis S.L., Berbaum K.S., Metz C.E.: Recent developments in the Dorfman-Berbaum-Metz procedure for multireader ROC study analysis. Acad Radiol 2008; 15: pp. 647-661.

  • 24. Johnston W.K., Patel B.N., Low R.K., et al.: Wireless teleradiology for renal colic and renal trauma. J Endourol 2005; 19: pp. 32-36.

  • 25. Modi J., Sharma P., Earl A., et al.: iPhone-based teleradiology for the diagnosis of acute cervico-dorsal spine trauma. Can J Neurol Sci 2010; 37: pp. 849-854.

  • 26. Lowe J., Brennan P., Evanoff M., et al.: Variations in performance of LCDs are still evident after DICOM gray-scale standard display calibration. AJR Am J Roentgenol 2010; 195: pp. 181-187.

  • 27. Krupinski E.A.: Medical grade vs off-the-shelf color displays: influence on observer performance and visual search. J Digit Imaging 2009; 22: pp. 363-368.

  • 28. Sun H., Nemecek A.N.: Optimal management of malignant spinal cord compression. Haematol Oncol Clin N Am 2010; 24: pp. 537-551.

  • 29. Samphao S., Eremin J.M., Eremin O.: Oncological emergencies: clinical importance and principles of management. Eur J Cancer Care 2010; 19: pp. 707-713.

  • 30. Demaerel P.: Magnetic resonance imaging of spinal cord trauma: a pictorial review. Neuroradiology 2006; 48: pp. 223-232.

  • 31. Bozzo A., Marcoux J., Radhakrishna M., et al.: The role of magnetic resonance imaging in the management of acute spinal cord injury. J Neurotrauma 2010; 28: pp. 1401-1411.

  • 32. Goldberg A.L., Kershah S.M.: Advances in imaging of vertebral and spinal cord injury. J Spinal Cord Med 2010; 33: pp. 105-116.

  • 33. Winters M.E., Kluetz P., Zilberstein J.: Back pain emergencies. Med Clin N Am 2006; 90: pp. 505-523.

  • 34. Sheerin F., Collison K., Quaghebeur G.: Magnetic resonance imaging of acute intramedullary myelopathy: radiological differential diagnosis for the on-call radiologist. Clin Radiol 2009; 64: pp. 84-94.

  • 35. Rankey D., Leach J.L., Leach S.D.: Emergency MRI utilization trends at a tertiary care academic medical center: baseline data. Acad Radiol 2008; 15: pp. 438-443.

  • 36. Metz C.E.: Some practical issues of experimental design and data analysis in radiological ROC studies. Invest Radiol 1989; 24: pp. 234-245.

This post is licensed under CC BY 4.0 by the author.