A Guide to Stereoscopic 3D Displays in Medicine

Stereoscopic displays can potentially improve many aspects of medicine. However, weighing the advantages and disadvantages of such displays remains difficult, and more insight is needed to evaluate whether they are worth adopting. In this article, we begin with a review of monocular and binocular depth cues. We then apply this knowledge to examine how stereoscopic displays can potentially benefit diagnostic imaging, medical training, and surgery. It is apparent that the binocular depth information afforded by stereo displays 1) aids the detection of diagnostically relevant shapes, orientations, and positions of anatomical features, especially when monocular cues are absent or unreliable; 2) helps novice surgeons orient themselves in the surgical landscape and perform complicated tasks; and 3) improves the three-dimensional anatomical understanding of students with low visual-spatial skills. The drawbacks of stereo displays are also discussed, including the extra eyewear, potential three-dimensional misperceptions, and the hurdle of overcoming familiarity with existing techniques. Finally, we offer guidelines for the optimal use of stereo displays. The result is a concise guide for medical practitioners who want to assess the potential benefits of stereo displays before adopting them.

Stereoscopic displays can be found in applications ranging from cinema to medical imaging to scientific visualization. Because these displays can convey more accurate depth information than nonstereo displays, they can potentially benefit several aspects of image-based medicine. In particular, stereo imaging could 1) make complicated shapes and structures easier to identify, 2) aid the user in assessing large data sets, and 3) through its integration into virtual-reality modules, decrease the cost of training and of health care in general. However, the technology has drawbacks, including equipment complexity and the current necessity for eyewear. Stereo displays must therefore demonstrate a clear advantage over existing techniques if they are to be widely adopted. We provide a resource for evaluating and implementing stereo displays for a wide range of medical applications. We begin by discussing the unique visual cues afforded by stereoscopic displays. We then discuss the specific advantages and disadvantages of existing medical applications of stereo displays, and possible reasons why they have not been adopted more widely. The general advantages and disadvantages are also summarized, followed by tips for the optimal use of stereo displays that help avoid misperceptions and visual fatigue.

Depth and displays

All stereo displays operate on the same basic mechanism: a unique image is presented to each eye. The differences between the images (known as a stereo pair) are interpreted by the visual system as depth information in a process known as stereopsis. The relative positions of an object’s projections onto the two retinas are used by the visual system to recover the distance to that object. Stereopsis is not the only source of depth information in typical images. In fact, the name three-dimensional (3D) display is a misnomer, because images have always conveyed 3D information. The key distinction of 3D displays is that they provide stereoscopic depth information in addition to the monocular depth cues attainable with any display. Therefore, for the remainder of this review we will use the term stereoscopic display or stereo display in place of the more common 3D display.
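
The geometry behind stereopsis can be made concrete with a little arithmetic. The sketch below is our own illustration (not from the original article), assuming an idealized parallel-viewpoint model in which depth is triangulated from horizontal disparity; the interocular distance and focal length are illustrative values.

```python
# Minimal sketch: recovering depth from horizontal disparity with an
# idealized parallel "two-eye" (stereo camera) model. All values are
# illustrative, not taken from the article.

def depth_from_disparity(disparity_px, baseline_m=0.063, focal_px=1200.0):
    """Distance (meters) to a point whose projections in the left and
    right images differ horizontally by `disparity_px` pixels.

    baseline_m: separation between the two viewpoints (~6.3 cm, a
                typical human interocular distance).
    focal_px:   focal length expressed in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return baseline_m * focal_px / disparity_px

# Halving the disparity doubles the estimated distance, so a fixed
# disparity step corresponds to a much larger depth step far away:
for d in (60.0, 30.0, 15.0):
    print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(d):.2f} m")
```

This inverse relationship is why stereopsis is most informative for nearby objects and degrades with distance.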

To differentiate between stereoscopic and conventional displays, we begin with a discussion of the visual depth cues afforded by each technology.

Monocular Depth Cues

Any conventional display can present monocular depth cues. These are depth cues that remain useful to the visual system even when acquired by only one eye. The depth cues most relevant to our discussion are perspective projection, occlusion, familiar size, shading, and the motion-based cues known as structure from motion and motion parallax.

“Perspective projection,” or how a 3D scene is projected onto a two-dimensional (2D) image plane, offers several depth cues. Objects that are farther away from the imaging device are projected to smaller sizes than objects that are close. Parallel lines that recede into the scene (such as the lines on a road) project to converging lines in an image. Portions of the parallel lines that are spaced farther apart on the image surface are closer to the observer than portions that are spaced closer together. This phenomenon is evident with the parallel lines on the wall in the right-hand portion of Figure 1. Perspective projection also produces texture gradients. Texture patches, like those that make up wallpaper, project to larger and more widely spaced elements in the image when they are closer to the imaging device. The orientation of texture elements can also reveal 3D shape.
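
These effects all follow from the pinhole projection equations x' = fX/Z and y' = fY/Z. The short sketch below is our own illustration with made-up coordinates; it demonstrates both the shrinking of distant objects and the convergence of parallel lines.

```python
# Minimal sketch of pinhole perspective projection: x' = f*X/Z, y' = f*Y/Z.
# Coordinates and the focal length are made up for illustration.

F = 1.0  # focal length (arbitrary units)

def project(x, y, z):
    """Project a 3D camera-space point onto the z = F image plane."""
    return (F * x / z, F * y / z)

# An object of fixed height 1.0 projects smaller as it recedes:
for z in (2.0, 4.0, 8.0):
    top, bottom = project(0.0, 1.0, z), project(0.0, 0.0, z)
    print(f"distance {z}: projected height {top[1] - bottom[1]:.3f}")

# Two parallel lines (x = -1 and x = +1) converge in the image as z grows:
for z in (2.0, 10.0, 100.0):
    left, right = project(-1.0, 0.0, z), project(1.0, 0.0, z)
    print(f"distance {z}: image separation {right[0] - left[0]:.3f}")
```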

Figure 1, Several examples of monocular depth cues. Perspective projection causes the more distant supports in the handrail on the right side of the image to project to smaller sizes in the image. Occlusions make it obvious that the tractor is closer to the camera than the board that it partially blocks from view. Shadows reveal the three-dimensional shape of the shovel on the front of the tractor. Familiar size also makes it possible to estimate the relative distances to the red car and the people in the scene.

Stereoscopic Depth

Figure 2, Stereogram of a field of dots with floating patches. To view the stereogram, hold the page at arm’s length and either (a) diverge your eyes so the left eye is directed at the left image and the right eye at the middle image (divergent fusing), or (b) cross your eyes so the left eye is directed at the right image and the right eye at the middle image (cross fusing). In both cases, once fusion is attained, four images will be visible. With divergent fusing, attend to the image second from the left. With cross fusing, attend to the image second from the right. Without fusing the images, it is impossible to tell which of the patches is closer. Once the images are fused, however, their depth ordering becomes apparent, as do the three-dimensional positions of the random dots.
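
Random-dot stereograms such as the one in Figure 2 are simple to construct: duplicate a field of random dots, shift a patch horizontally in one eye’s image, and fill the strip vacated by the shift with fresh dots. The sketch below is our own illustration (using NumPy); the image size, patch location, and shift are arbitrary choices.

```python
import numpy as np

# Minimal sketch: build a random-dot stereogram pair. A patch shifted
# horizontally between the two images appears to float in depth when
# the images are fused. Sizes and shift are arbitrary choices.

rng = np.random.default_rng(0)
H, W, SHIFT = 200, 200, 4

left = rng.integers(0, 2, size=(H, W)).astype(float)
right = left.copy()

# Shift a central patch leftward in the right image (crossed disparity,
# so the fused patch appears in front of the background).
r0, r1, c0, c1 = 70, 130, 70, 130
right[r0:r1, c0 - SHIFT:c1 - SHIFT] = left[r0:r1, c0:c1]

# Fill the strip uncovered by the shift with new, uncorrelated dots.
right[r0:r1, c1 - SHIFT:c1] = rng.integers(0, 2, size=(r1 - r0, SHIFT))

# `left` and `right` can now be displayed side by side for fusing, e.g.:
# import matplotlib.pyplot as plt
# plt.imshow(np.hstack([left, right]), cmap="gray"); plt.show()
```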

Stereo Imaging and Virtual Reality

Medical applications of stereo displays

Diagnostics

Ophthalmic imaging

Mammography

Figure 3, Example mammography stereogram. See viewing instructions in Figure 2. Viewed individually, the images in each panel do not provide depth information; viewed stereoscopically, however, the three-dimensional (3D) structure of the tissue becomes apparent, including the location of the mass on the left side. Reproduced from Getty D & Green P, “Clinical applications for stereoscopic 3D displays,” published in the Journal of the Society for Information Display, Vol. 15, No. 6, with permission from the authors and the Society for Information Display.

Vascular imaging

Figure 4, Example angiographic stereogram. The connectedness and depth ordering of the vessels are most easily understood under stereoscopic viewing. Images courtesy of Maksim Shapiro ( http://neuroangio.org ).

Orthopedic imaging

Medical Training

Figure 5, Example image from stereoscopic laparoscopy. The lack of strong perspective cues such as straight lines and right angles, together with the unfamiliar lighting, makes it difficult to recover three-dimensional shape information from a single image. When viewed stereoscopically, however, the tissues’ shapes become evident. In particular, note how the thin flap of tissue is perceived as more separated from the background during stereoscopic viewing. Images courtesy of Peter Mountney and colleagues (94).

Surgical Planning

Laparoscopy

Telesurgery

Augmented Reality Surgery

Figure 6, Stereogram of an augmented reality system in use on a surgical phantom. The overlaid computer-generated imagery provides a virtual view directly into the surgical site. Note that the use of stereo imagery makes the overlaid graphic appear more convincingly to be a hole, rather than a texture projected onto the surface of the phantom. Images are from Figure 5 of Fuchs H et al., “Augmented reality visualization for laparoscopic surgery,” in the Proceedings of the First International Conference on Medical Image Computing and Computer-Assisted Intervention, Vol. 1496. Copyright Springer-Verlag Berlin Heidelberg (1998); they appear with kind permission from Henry Fuchs and Springer Science+Business Media B.V.

Summary of benefits

Summary of drawbacks

Hardware

Resistance to New Technology

Lost Detail from Slice Data

Viewer Discomfort

Figure 7, Variations in vergence and accommodation with natural viewing and typical stereoscopic displays. (a) The eyes’ vergence and accommodative states are coupled in natural viewing. Here, vergence and accommodation are both set to the far corner of an open-hinge stimulus. The edges of the hinge are physically closer to the eyes than the far corner, so they appear out of focus. (b) On a typical stereo display, vergence and accommodation are uncoupled. Vergence can vary through the three-dimensional scene (here it is trained on the far corner of the hinge), but accommodation must remain fixed on the surface of the display to keep the image sharp. Note that the entire hinge is imaged sharply, because all of the light originates at the surface of the display. The mismatch between the vergence and accommodative states of the eyes has been shown to be a source of discomfort and fatigue with stereoscopic displays. (c) Partially blurred retinal image. (d) Completely sharp retinal image. Figure adapted with permission from David Hoffman and colleagues (79).
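
Both accommodative and vergence demand are conveniently expressed in diopters (inverse meters), so the conflict in (b) can be quantified directly. The sketch below is our own illustration with example distances, not a calculation from the original figure.

```python
# Minimal sketch: vergence-accommodation conflict in diopters.
# Accommodation is pinned to the physical screen; vergence follows the
# simulated depth of the content. Distances are illustrative examples.

def conflict_diopters(screen_m, simulated_m):
    """Difference between vergence demand (set by the simulated depth)
    and accommodative demand (set by the physical screen)."""
    return abs(1.0 / simulated_m - 1.0 / screen_m)

SCREEN = 0.6  # a desktop display at 60 cm

for sim in (0.6, 0.45, 0.3, 2.0):
    print(f"content at {sim:4.2f} m -> conflict "
          f"{conflict_diopters(SCREEN, sim):.2f} D")

# Content simulated at the screen distance (0.6 m) gives zero conflict;
# the farther the simulated depth departs from the screen, the larger
# the mismatch, and the greater the reported discomfort (79).
```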

Stereoscopic Misperceptions

Figure 8, Common distortions with stereoscopic displays. (a) A viewer at the correct viewing location, observing a stereoscopic image of a 3D cross-section of a human skull. Because the observer is in the correct location, he or she correctly perceives the shape of the skull. However, sitting too far from (b) or too close to (c) the display causes the perceived shape to expand or compress, respectively. Stereoscopic viewing software sometimes allows one to adjust the spacing between the left and right images; (d) shows how increasing the spacing can expand objects in depth and move them farther away. Finally, (e) depicts how 3D shapes shift and shear as the observer moves to the left and right of the optimal viewing location. See the section “Avoiding misperceptions with stereoscopic displays” for tips on avoiding such misperceptions. Skull model created by W.E. Lorensen (95), based on data from the Visible Human Project (96).
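
The expansion and compression in (b) and (c) follow from simple triangulation: the perceived location of a point is where the two eyes’ lines of sight through its left- and right-image positions intersect, and that intersection moves with viewing distance. The following sketch is our own illustration with made-up numbers.

```python
# Minimal sketch: perceived distance to a stereoscopically rendered point
# as a function of viewing distance. The point is drawn with a fixed
# horizontal on-screen parallax p (positive = uncrossed, behind screen).
# All values are illustrative.

IOD = 0.063  # interocular distance in meters

def perceived_distance(parallax_m, viewing_m, iod=IOD):
    """Distance at which the eyes' lines of sight through the two image
    points intersect (simple triangulation; requires parallax_m < iod)."""
    return iod * viewing_m / (iod - parallax_m)

P = 0.02  # 2 cm of uncrossed on-screen parallax

for d in (0.4, 0.6, 1.2):  # too close, intended, too far
    z = perceived_distance(P, d)
    print(f"viewer at {d:.1f} m -> point perceived at {z:.2f} m "
          f"({z - d:.2f} m behind the screen)")
```

The same on-screen parallax yields depth that scales with viewing distance, which is why sitting too far expands the perceived shape and sitting too close compresses it.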

Avoiding misperceptions with stereoscopic displays

Figure 9, The keystone effect. When the bodies of a pair of stereo cameras are set to converge (“toed in”), the vertical magnification of the two images’ projections varies with horizontal position. The mismatch causes regions of misalignment and can produce misperceptions.
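
The keystone effect can be reproduced by projecting a single point into two converged cameras and comparing the vertical image coordinates. The sketch below is our own illustration; the focal length, convergence angle, and point location are arbitrary.

```python
import math

# Minimal sketch of the keystone effect: two toed-in cameras, panned by
# +/- the convergence half-angle about the vertical axis, image the same
# point, and their vertical image coordinates disagree for off-center
# points. Geometry and values are illustrative.

F = 1.0                     # focal length (arbitrary units)
THETA = math.radians(4.0)   # convergence half-angle
B = 0.065                   # camera separation (m)

def project(point, cam_x, pan):
    """Project `point` into a pinhole camera at (cam_x, 0, 0), panned by
    `pan` radians about the vertical axis."""
    x, y, z = point[0] - cam_x, point[1], point[2]
    # Rotate the relative vector into the camera's frame.
    xc = x * math.cos(pan) - z * math.sin(pan)
    zc = x * math.sin(pan) + z * math.cos(pan)
    return (F * xc / zc, F * y / zc)

# A point up and to the side of the convergence point:
p = (0.3, 0.2, 1.0)
xl, yl = project(p, -B / 2, +THETA)  # left camera, toed inward
xr, yr = project(p, +B / 2, -THETA)  # right camera, toed inward
print(f"vertical disparity: {abs(yl - yr):.4f} (image units)")
```

The nonzero vertical disparity grows toward the image corners, producing the trapezoidal (“keystone”) misalignment shown in the figure; parallel camera bodies (Tip 1) avoid it.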

Tips and guidelines

Tip 1: Parallel Cameras

Tip 2: Do Not Flip Images

Tip 3: Keep Eyes Centered and Parallel Relative to Display

Tip 4: Minimize Vergence-accommodation Conflict

Tip 5: Appropriate Use of Pictorial Blur

Conclusions

Acknowledgment

References

  • 1. Lipton L.: Foundations of the stereoscopic cinema: a study in depth. 1982. Van Nostrand Reinhold, New York, NY.

  • 2. Chan H.P., Goodsitt M.M., Helvie M.A., et. al.: ROC study of the effect of stereoscopic imaging on assessment of breast lesions. Med Phys 2005; 32: pp. 1001-1009.

  • 3. Fröhlich B., Barrass S., Zehner B., et. al.: Exploring geo-scientific data in virtual environments. In: VIS ’99: Proceedings of the Conference on Visualization ’99. 1999. IEEE Computer Society Press, Los Alamitos, CA, pp. 169–173.

  • 4. Palmer S.: Vision science: photons to phenomenology. 1999. MIT Press, Cambridge, MA.

  • 5. Sun J., Perona P.: Where is the sun? Nat Neurosci 1998; 1: pp. 183-184.

  • 6. O’Shea J.P., Banks M.S., Agrawala M.: The assumed light direction for perceiving shape from shading. In: APGV ’08: Proceedings of the 5th Symposium on Applied Perception in Graphics and Visualization. New York, NY, 2008, pp. 135–142.

  • 7. Simons K.: Stereoacuity norms in young children. Arch Ophthalmol 1981; 99: pp. 439-445.

  • 8. Ahmed J., Ward T., Bursell S., et. al.: The sensitivity and specificity of nonmydriatic digital stereoscopic retinal imaging in detecting diabetic retinopathy. Diabetes Care 2006; 29: pp. 2205-2209.

  • 9. Abramoff M., Alward W., Greenlee, et. al.: Automated segmentation of the optic disc from stereo color photographs using physiologically plausible features. Invest Ophthalmol Vis Sci 2007; 48: pp. 1665-1673.

  • 10. Bergua A., Mardin C.Y., Horn F.K.: Tele-transmission of stereoscopic images of the optic nerve head in glaucoma via internet. Telemed e-Health 2009; 15: pp. 439-444.

  • 11. Getty D., D’Orsi C., Pickett R.: Stereoscopic digital mammography: Improved accuracy of lesion detection in breast cancer screening. Lecture Notes Comp Sci 2008; 5116: pp. 74-79.

  • 12. Hernandez A., Basset O., Bremond A., et. al.: Stereoscopic visualization of three-dimensional ultrasonic data applied to breast tumours. Eur J Ultrasound 1998; 8: pp. 51-65.

  • 13. Tanaka C., Fujii H., Ikeda T., et. al.: Stereoscopic scintigraphic imaging of breast cancer sentinel lymph nodes. Breast Cancer 2007; 14: pp. 92-99.

  • 14. Nelson T., Ji E., Lee J., et. al.: Stereoscopic evaluation of fetal bony structures. J Ultrasound Med 2008; 27: pp. 15-24.

  • 15. Sollenberger R., Milgram P.: Effects of stereoscopic and rotational displays in a three-dimensional path-tracing task. Human Factors J Human Factors Ergon Soc 1993; 3: pp. 483-499.

  • 16. Serra L, Hern N, Choon C, et al. Interactive vessel tracing in volume data. SI3D ’97: Proceedings of the 1997 Symposium on Interactive 3D graphics 1997.

  • 17. Sun Z., Squelch A., Bartlett A., et. al.: 3D stereoscopic visualization of fenestrated stent grafts. Cardiovasc Intervent Radiol 2009; 32: pp. 1053-1058.

  • 18. Zhou L., Wang Y., Goh L., et. al.: Stereoscopic visualization and editing of automatic abdominal aortic aneurysms (AAA) measurements for stent graft planning. Proc SPIE 2006; 6055: pp. 57-65.

  • 19. Sun Z., Lawrence-Brown M.: CT virtual endoscopy and 3D stereoscopic visualisation in the evaluation of coronary stenting. Biomed Imaging Intervention J 2009; 5: pp. 1-6.

  • 20. Moll T., Douek P., Finet G., et. al.: Clinical assessment of a new stereoscopic digital angiography system. Cardiovasc Intervent Radiol 1998; 21: pp. 11-16.

  • 21. Sekiguchi R., Satake M., Oyama H., et. al.: Stereoscopic visualization system for clinical angiography. Studies Health Technol Informatics 1996; 29: pp. 690-693.

  • 22. Kickuth R., Hartung G., Laufer U., et. al.: Stereoscopic 3D CT vs standard 3D CT in the classification of acetabular fractures: an experimental study. Br J Radiol 2002; 75: pp. 422.

  • 23. Luursema J., Verwey W., Kommers P., et. al.: Optimizing conditions for computer-assisted anatomical learning. Interacting Computers 2006; 18: pp. 1123-1138.

  • 24. Nicholson D., Chalk C., Funnell W., et. al.: Can virtual reality improve anatomy education? A randomised controlled study of a computer-generated three-dimensional anatomical ear model. Med Educ 2006; 40: pp. 1081.

  • 25. Luursema J., Verwey W., Kommers P., et. al.: The role of stereopsis in virtual anatomical learning. Interacting Computers 2008; 20: pp. 455-460.

  • 26. Tendick F., Downes M., Cavusoglu M., et. al.: Development of virtual environments for training skills and reducing errors in laparoscopic surgery. Proc SPIE Int Symp Biol Optics (BIOS’98) 1998; pp. 36-44.

  • 27. Wong G., Zhu C., Ahuja A., et. al.: Craniotomy and clipping of intracranial aneurysm in a stereoscopic virtual reality environment. Neurosurgery 2007; 61: pp. 564.

  • 28. Pieper S., Delp S., Rosen J., et. al.: Virtual environment system for simulation of leg surgery. Proc SPIE 1991; 1457: pp. 188.

  • 29. Tuggy M.L.: Virtual reality flexible sigmoidoscopy simulator training: impact on resident performance. J Am Board Fam Pract 1998; 11: pp. 426-433.

  • 30. Taffinder N., Smith S.G., Huber J., et. al.: The effect of a second-generation 3D endoscope on the laparoscopic precision of novices and experienced surgeons. Surg Endosc 1999; 13: pp. 1087-1092.

  • 31. Patel H., Ribal M., Arya M., et. al.: Is it worth revisiting laparoscopic three-dimensional visualization? A validated assessment. Urology 2007; 70: pp. 47-49.

  • 32. Tevaearai H.T., Mueller X.M., von Segesser L.K.: 3-D vision improves performance in a pelvic trainer. Endoscopy 2000; 32: pp. 464-468.

  • 33. Ilgner J., Park J., Labbé D., et. al.: Using a high-definition stereoscopic video system to teach microscopic surgery. Proc SPIE-IS&T Electron Imaging 2007; 6490: pp. 81-87.

  • 34. Prystowsky J., Regehr G., Rogers D., et. al.: A virtual reality module for intravenous catheter placement. Am J Surg 1999; 177: pp. 171-175.

  • 35. Johnson W., Rickel J., Stiles R., et. al.: Integrating pedagogical agents into virtual environments. Presence 1998; 7: pp. 523-546.

  • 36. Hu Y. The role of three-dimensional visualization in surgical planning of treating lung cancer. Engineering in Medicine and Biology Society, 2005 IEEE-EMBS 2005 27th Annual International Conference of the 2005; 646–649.

  • 37. Fishman E., Kuszyk B., Heath D., et. al.: Surgical planning for liver resection. Computer 1996; 29: pp. 64-72.

  • 38. Wigmore S., Redhead D., Yan X., et. al.: Virtual hepatic resection using three-dimensional reconstruction of helical computed tomography angioportograms. Ann Surg 2001; 23: pp. 221-226.

  • 39. Hemminger B.M., Molina P.L., Egan T.M., et. al.: Assessment of real-time 3D visualization for cardiothoracic diagnostic evaluation and surgery planning. J Digital Imaging 2005; 18: pp. 145-153.

  • 40. Lee S., Shinohara H., Matsuki M., et. al.: Preoperative simulation of vascular anatomy by three-dimensional computed tomography imaging in laparoscopic gastric cancer surgery. J Amer Coll Surg 2003; 197: pp. 927-936.

  • 41. Xia J., Ip H.H., Samman N., et. al.: Computer-assisted three-dimensional surgical planning and simulation: 3D virtual osteotomy. Int J Oral Maxillofacial Surg 2000; 29: pp. 11-17.

  • 42. Kikinis R., Gleason P., Moriarty T., et. al.: Computer-assisted interactive three-dimensional planning for neurosurgical procedures. Neurosurgery 1996; 38: pp. 640-651.

  • 43. Gering D., Nabavi A., Kikinis R., et. al.: An integrated visualization system for surgical planning and guidance using image fusion and interventional imaging. J Magnetic Reson Imaging 2001; 13: pp. 967-975.

  • 44. Kockro R., Serra L., Tseng-Tsai Y., et. al.: Planning and simulation of neurosurgery in a virtual reality environment. Neurosurgery 2000; 46: pp. 118-135.

  • 45. Hernes T., Ommedal S., Lie T.: Stereoscopic navigation-controlled display of preoperative MRI and intraoperative 3D ultrasound in planning and guidance of neurosurgery: new technology for minimally invasive image-guided surgery approaches. Minim Invasive Neurosurg 2003; 46: pp. 129-137.

  • 46. Ng I., Hwang P.Y.K., Kumar D., et. al.: Surgical planning for microsurgical excision of cerebral arterio-venous malformations using virtual reality technology. Acta Neurochir 2009; 151: pp. 453-463.

  • 47. Rosahl S., Gharabaghi A., Hubbe U., et. al.: Virtual reality augmentation in skull base surgery. Skull Base 2006; 16: pp. 59-66.

  • 48. Burt D.: Virtual reality in anaesthesia. Br J Anaesth 1995; 75: pp. 472-480.

  • 49. Tendick F., Downes M., Goktekin T., et. al.: A virtual environment testbed for training laparoscopic surgical skills. Presence: Teleoperators & Virtual Environments 2000; 9: pp. 236-255.

  • 50. Hofmeister J., Frank T., Cuschieri A., et. al.: Perceptual aspects of two-dimensional and stereoscopic display techniques in endoscopic surgery: review and current problems. Surg Innovation 2001; 8: pp. 12-24.

  • 51. Crosthwaite G., Chung T., Dunkley P., et. al.: Comparison of direct vision and electronic two-and three-dimensional display systems on surgical task efficiency in endoscopic surgery. Brit J Surg 1995; 82: pp. 849-851.

  • 52. Hanna G., Shimi S., Cuschieri A.: Randomised study of influence of two-dimensional versus three-dimensional imaging on performance of laparoscopic cholecystectomy. Lancet 1998; 351: pp. 248-251.

  • 53. McDougall E.M., Soble J.J., Wolf J.S., et. al.: Comparison of three-dimensional and two-dimensional laparoscopic video systems. J Endourol 1996; 10: pp. 371-374.

  • 54. Blavier A., Nyssen A.S.: Influence of 2D and 3D view on performance and time estimation in minimal invasive surgery. Ergonomics 2009; 52: pp. 1342-1349.

  • 55. Mueller-Richter U., Limberger A., Weber P.: Comparison between three-dimensional presentation of endoscopic procedures with polarization glasses and an autostereoscopic display. Surg Endosc 2003; 17: pp. 1432-2218.

  • 56. Jourdan I., Dutson E., Garcia A., et. al.: Stereoscopic vision provides a significant advantage for precision robotic laparoscopy. Br J Surg 2004; 91: pp. 879-885.

  • 57. Pietrabissa A., Scarcello E., Carobbi A., et. al.: Three-dimensional versus two-dimensional video system for the trained endoscopic surgeon and the beginner. Endoscopic Surg Allied Technol 1994; 2: pp. 315-317.

  • 58. Blavier A., Gaudissart Q., Cadiere G., et. al.: Impact of 2D and 3D vision on performance of novice subjects using da Vinci robotic system. Acta Chir Belgica 2006; 106: pp. 662-664.

  • 59. Byrn J., Schluender S., Divino C., et. al.: Three-dimensional imaging improves surgical performance for both novice and experienced operators using the da Vinci robot system. Am J Surg 2007; 193: pp. 519-522.

  • 60. Maurer C.J., Sauer F., Hu B., et. al.: Augmented reality visualization of brain structures with stereo and kinetic depth cues: system description and initial evaluation with head phantom. Proc SPIE: Medical Imaging 2001; 6: pp. 445-456.

  • 61. Nikou C., Digioia A., Blackwell M., et. al.: Augmented reality imaging technology for orthopaedic surgery. Operative Techniques Orthopaed 2000; 10: pp. 82-86.

  • 62. Wendt M., Sauer F., Khamene A., et. al.: A head-mounted display system for augmented reality: Initial evaluation for interventional MRI. RöFo Fortschr Geb Rontgenstr Neuen Bildgeb Verfahr 2003; 3: pp. 418-421.

  • 63. Rosenthal M., State A., Lee J., et. al.: Augmented reality guidance for needle biopsies: an initial randomized, controlled trial in phantoms. Med Image Anal 2002; 6: pp. 313-320.

  • 64. Wacker F., Vogt S., Khamene A., et. al.: An augmented reality system for MR image-guided needle biopsy: initial results in a swine model. Radiology 2006; 238: pp. 497-504.

  • 65. Edwards P., Johnson L., Hawkes D., et. al.: Clinical experience and perception in stereo augmented reality surgical navigation. Proc Intl Workshop on Med Imaging and Augmented Reality 2004; 3150: pp. 369-376.

  • 66. Bajura M, Fuchs H, Ohbuchi R. Merging virtual objects with the real world: seeing ultrasound imagery within the patient. Proc 19th Annu Conference on Computer Graphics and Interactive Techniques 1992; 203–210.

  • 67. Fuchs H, Livingston M, Raskar R, et al. Augmented reality visualization for laparoscopic surgery. Proc First Intl Conf on Medical Image Computing and Computer-Assisted Intervention 1998; 934–943.

  • 68. Zhai S., Buxton W., Milgram P.: The partial-occlusion effect: Utilizing semitransparency in 3D human-computer interaction. ACM Transactions on Computer-Human Interaction (TOCHI) 1996; 3: pp. 254-284.

  • 69. Chan H., Goodsitt M., Hadjiiski L., et. al.: Effects of magnification and zooming on depth perception in digital stereomammography: an observer performance study. Phys Med Biol 2003; 48: pp. 3721-3734.

  • 70. Goodsitt M., Chan H., Hadjiiski L.: Stereomammography: evaluation of depth perception using a virtual 3d cursor. Med Phys 2000; 27: pp. 1305.

  • 71. Novotny P, Kettler D, Jordan P, et al. Stereo display of 3D ultrasound images for surgical robot guidance. IEEE Intl Conf of the Engineering in Medicine and Biology Society, New York, NY 2006.

  • 72. van Bergen P., Kunert W., Bessell J., et. al.: Comparative study of two-dimensional and three-dimensional vision systems for minimally invasive surgery. Surg Endosc 1998; 12: pp. 948-954.

  • 73. Hu T., Allen P., Nadkarni T., et. al.: Insertable stereoscopic 3D surgical imaging device with pan and tilt. Intl J Robotics Res 2009; 28: pp. 1373-1386.

  • 74. Matusik W., Pfister H.: 3D TV: a scalable system for real-time acquisition, transmission, and autostereoscopic display of dynamic scenes. ACM Transactions on Graphics 2004; 23: pp. 814-824.

  • 75. Maidment A.D.A., Bakic P.R., Albert M.: Effects of quantum noise and binocular summation on dose requirements in stereoradiography. Med Phys 2003; 30: pp. 3061-3071.

  • 76. Ilgner J.F.R., Kawai T., Shibata T., et. al.: Evaluation of stereoscopic medical video content on an autostereoscopic display for undergraduate medical education. Proc SPIE 2006; 6055: pp. 605506.

  • 77. Lambooij M., IJsselsteijn W., Fortuin M., et. al.: Visual discomfort and visual fatigue of stereoscopic displays: a review. J Imaging Sci Technol 2009; 53: 030201 (14 pages).

  • 78. Martens T.G., Ogle K.N.: Observations on accommodative convergence; especially its nonlinear relationships. Amer J Ophthalmol 1959; 47: pp. 455-463.

  • 79. Hoffman D.M., Girshick A.R., Akeley K., et. al.: Vergence-accommodation conflicts hinder visual performance and cause visual fatigue. J Vision 2008; 8: pp. 33.

  • 80. Vishwanath D., Girshick A.R., Banks M.S.: Why pictures look right when viewed from the wrong place. Nature Neurosci 2005; 8: pp. 1401-1410.

  • 81. Woods A.J., Docherty T., Koch R.: Image distortions in stereoscopic video systems. SPIE: Stereoscopic Displays and Applications IV 1993; 1915: pp. 36-48.

  • 82. Held RT, Banks MS. Misperceptions in stereoscopic displays: a vision-science perspective. In: APGV ’08: Proceedings of the 5th Symposium on Applied Perception in Graphics and Visualization. New York, NY, 2008, p. 23–32.

  • 83. Getty D., Green P.: Clinical applications for stereoscopic 3-D displays. J SID 2007; 15: pp. 377-384.

  • 84. Bernstein J.: The five senses of man: 91 woodcuts. 1876. Henry S. King & Co., London.

  • 85. Balassy C., Prokop M., Weber M., et. al.: Flat-panel display (LCD) versus high-resolution gray-scale display (CRT) for chest radiography: an observer preference study. AJR Amer J Roentgenol 2005; 184: pp. 752-756.

  • 86. Pisano E.D., Seibert J.A., Andriole K.P., et. al.: Practice guideline for determinants of image quality in digital mammography. 2007. American College of Radiology, Reston, VA.

  • 87. Watt S.J., Akeley K., Ernst M.O., et. al.: Focus cues affect perceived depth. J Vis 2005; 5: pp. 834-862.

  • 88. Love G., Hoffman D., Hands P., et. al.: High-speed switchable lens enables the development of a volumetric stereoscopic display. Optics Express 2009; 17: pp. 15716-15725.

  • 89. Cole F., DeCarlo D., Finkelstein A., et. al.: Directing gaze in 3D models with stylized focus. Eurographics Symp Rendering 2006; pp. 377-387.

  • 90. DiPaola S., Riebe C., Enns J.: Rembrandt’s textural agency: a shared perspective in visual art and science. Leonardo 2010; 4: pp. 145-151.

  • 91. Hillaire S, Lécuyer A, Cozot R, et al. Depth-of-field blur effects for first-person navigation in virtual environments. In: VRST ’07: Proceedings of the 2007 ACM symposium on Virtual reality software and technology. 2007; 203–206.

  • 92. Hillaire S., Lecuyer A., Cozot R., et. al.: Using an eye-tracking system to improve camera motions and depth-of-field blur effects in virtual environments. Virtual Reality Conference, 2008 VR ’08 IEEE 2008; pp. 47-50.

  • 93. Held R.T., Cooper E.A., O’Brien J.F., et. al.: Using blur to affect perceived distance and size. ACM Transactions on Graphics 2010; 29: pp. 1-16.

  • 94. Mountney P., Stoyanov D., Yang G.: Three-dimensional tissue deformation recovery and tracking. Signal Proc Magazine IEEE 2010; 27: pp. 14-24.

  • 95. Lorensen W.E.: Marching through the visible man. In: Proceedings of the 6th Conference on Visualization ’95 (VIS ’95). Washington, DC, 1995.

  • 96. Ackerman M.: The visible human project. Proc IEEE 1998; 86: pp. 504-511.
