Imaging Informatics

The health care environment is changing rapidly. Radiologists face new challenges but also new opportunities. The purpose of this report is to review how new informatics tools and developments can help the radiologist respond to the drive for safety, quality, and efficiency. These tools also assist in research and education. They not only bring greater efficiency to traditional operations but also open pathways for delivering new services and imaging technologies. Our future as a specialty depends on integrating these informatics solutions into our daily practice.

The health care environment is undergoing rapid change, whether secondary to health care reform, natural organic changes, or accelerated technological advances. The economics of health care, changes in the demographics of our population, and the rapidly evolving socioeconomic environment all contribute to a world that presents the radiologist with new challenges. New models of health care, including accountable care organizations, are emerging. Our profession must adapt; the traditional approach to delivering imaging services may not remain viable. Despite the challenges, new opportunities are presenting themselves in parallel. There are new and exciting information technologies (ITs) to offer our patients that can contribute to improving their health and that can position our profession to better tackle the challenges that lie ahead.

We will argue that new informatics tools and developments can help the radiology profession respond to the drive for safety, quality, and efficiency. New research realms, both clinical and molecular, require sophisticated informatics tools. The health of the individual and an emerging focus on population health require IT solutions. We will start with a description of some fundamental informatics building blocks and progress to explore new and rapidly evolving applications of interest to radiologists.

A brief look backward

Radiology information systems (RIS) and picture archiving and communications systems (PACS), commonplace tools today, are relatively recent developments. In 1983, the first American College of Radiology (ACR)–National Electrical Manufacturers Association (NEMA) Committee met to develop the ACR-NEMA standard, first published in 1985. In 1993, the rapid rise in the number of digital modalities and the parallel development of robust networking technology prompted the development of digital imaging and communications in medicine (DICOM) 3.0.

Before RIS and PACS, consider how one viewed images, including cross-sectional exams of several hundred images. How were they displayed, archived, and moved about a department? We had film, dark rooms, light boxes, multichangers, and film libraries requiring numerous personnel. How were copies provided for consultation? How did clinicians see the exams they ordered? Historical exams were often stored off site and not available for days. Exams were often “borrowed” and out of circulation, or outright lost. How did one manage an office or a department, schedule exams, and bill for one’s services? These steps took place at a much slower pace than today.

Our new technologies have been “disruptive”. Certain jobs have disappeared (eg, file room clerks). The number of “schedulers” has usually diminished. The number of radiologists required to read a defined volume of exams has diminished, as PACS has resulted in increased productivity.

Into the Future!

Radiology Practice: Current State and into the Next Decade

Table 1

Workflow and Information Technology (IT) Tools

Order and schedule
  • Electronic medical record: radiology order entry clinical decision support (the right exam for the right reason)
  • RadLex Playbook (standard exam dictionary)

Interpretation
  • Postprocessing: thin client, integrated into picture archiving and communications systems
  • Cloud-based postprocessing (high-end shared services)
  • Computer-assisted diagnosis (radiologist decision support)
  • Online tools: point of service

Reporting
  • Structured reporting (common, reproducible ways of ensuring certain pieces of information are always present)
  • Natural language processing (NLP): data mine free text
  • Annotation and image markup (discrete information within the image rather than the report)

Archive
  • Local, enterprise, or cloud (economies of scale; disaster recovery)
  • Vendor-neutral archive (multiple sources)

Image/report exchange (images/reports securely anywhere, anytime)
  • Health information exchange
  • Personal health record
  • Smartphone/tablets

Quality
  • Peer review
  • Radiation dosimetry
  • Regulatory reporting/certification

Research
  • Comparative effectiveness
  • Data mining: metadata, NLP

Education
  • Interactive: audience participation
  • Shareable Content Object Reference Model (repurposed, tailored to the individual)
  • Real-time: during the interpretation
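The NLP entry in Table 1, mining free text, can be illustrated with a deliberately naive keyword flag for reports that may require non-routine communication of critical results. The terms and example sentences below are illustrative only; real NLP systems handle negation, uncertainty, and context far more robustly.

```python
import re

# Naive keyword-based flag for potentially critical findings in report text.
# The vocabulary here is a tiny illustrative sample, not a validated list.
CRITICAL = re.compile(
    r"\b(pneumothorax|pulmonary embol|intracranial hemorrhage)", re.IGNORECASE
)

def flag_report(report_text):
    """Return True if the report text contains a critical-finding keyword."""
    return bool(CRITICAL.search(report_text))

flag_report("Large right pneumothorax with mediastinal shift.")  # True
flag_report("No acute cardiopulmonary abnormality.")             # False
```

Note that the second example would be wrongly flagged if it read "No evidence of pneumothorax"; handling such negations is precisely what makes production NLP systems nontrivial.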

Informatics tools: the fundamental building blocks

Standards

Standardized Terminology

Image Metadata

IT infrastructure: the underpinnings of radiology operations

Ordering, Scheduling, Exam Protocols, and Billing

Radiology Order Entry Clinical Decision Support

Making CDS Operational

Figure 1, The workflow (a,b) starts when a clinician enters an order for an imaging exam into the electronic medical record (EMR). In the past, the order would have been sent directly into a radiology information system (RIS) and scheduled. In the new workflow, the EMR first sends the order to another module or system, the radiology clinical decision support (CDS). Here, the order is evaluated to determine if it is appropriate, using a reference source such as the American College of Radiology Appropriateness Criteria. If the evaluation results in a high score, the order is sent directly to the RIS. If the order receives an intermediate or a low score, a message is returned to the EMR (c* or d*), indicating that this might not be the best choice. Alternative examinations may be suggested, and in some systems references may be provided. (c,d) Different styles of returning this information. The clinician may continue with the original order or choose one of the suggestions. MR, magnetic resonance; CT, computed tomography; MRA, MR angiography; CTA, CT angiography; IV, intravenous. (Figures 1c and 1d courtesy of the National Decision Support Company [ACR Select]). (Color version of figure is available online).
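The routing logic in Figure 1 can be sketched as a toy Python fragment. The indication, exam names, and numeric scores below are hypothetical stand-ins, not values from the ACR Appropriateness Criteria, and a real CDS module exchanges these messages through EMR and RIS interfaces rather than in-process calls.

```python
# Hypothetical appropriateness scores for one indication (illustrative only).
APPROPRIATENESS = {
    ("thunderclap headache", "CT head without IV contrast"): 9,
    ("thunderclap headache", "MRI head without IV contrast"): 5,
    ("thunderclap headache", "skull radiograph"): 1,
}

def route_order(indication, exam):
    """Route a high-scoring order to the RIS; otherwise return
    better-scoring alternatives to the EMR for the clinician to review."""
    score = APPROPRIATENESS.get((indication, exam), 0)
    if score >= 7:  # high score: send directly to the RIS for scheduling
        return ("RIS", None)
    # intermediate/low score: suggest alternatives for the same indication
    alternatives = sorted(
        (e for (ind, e), s in APPROPRIATENESS.items()
         if ind == indication and s > score),
        key=lambda e: -APPROPRIATENESS[(indication, e)],
    )
    return ("EMR", alternatives)
```

For example, an order for a skull radiograph with this indication would return to the EMR with CT and MRI listed as higher-scoring choices, mirroring the message flow labeled c* and d* in the figure.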

Interpreting the image

Decision Support for the Radiologist

Figure 2, There are a variety of tools that provide the radiologist with decision support. These include online search tools and point of service tools, integrated into the radiology reporting process. (a) myRSNA is a radiology portal hosted by the Radiological Society of North America (RSNA). It offers a variety of services including a robust search function. One can bookmark references and even read some for continuing medical education credit online. (b) ARRS Goldminer offers a unique approach in searching. It has indexed the text of figure captions. It can search for terms included in the captions and brings back the figures, captions, and articles in which they are included. (c) This figure is taken from the interface of a voice recognition dictation product. It embeds a “wizard” to search terms on the fly. The user interface provides a list of the internet sites it has available to search. Some of these may require the user to have an additional license. (Figure 2c is courtesy of Nuance, taken from their Powerscribe 360 product). (Color version of figure is available online).

Computer-Assisted Diagnosis

Figure 3, Computer-assisted diagnosis (CAD): a sample set of images is provided from a computed tomography (CT) lung nodule CAD. A volumetric representation is provided indicating where the potential nodules are located (a) . Each individual axial section that includes a nodule is also presented (b) . The candidate nodule is circled, and volumetric and density measurements are provided. If an historical exam is present, this system can perform temporal comparisons (c) . Each of these images is sent as part of a series to picture archiving and communications systems (PACS) (d) . If there are multiple axial images, they are included as a single series. A table (report) listing all the nodules is also sent as a series to PACS. (Color version of figure is available online).
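One quantitative output such a system can derive from the temporal comparison in (c) is volume doubling time. The sketch below applies the standard exponential-growth formula to made-up volume measurements; the suggested 400-day threshold is a commonly cited rule of thumb, not a value from any specific CAD product.

```python
import math

def doubling_time_days(vol_prior_mm3, vol_current_mm3, interval_days):
    """Volume doubling time from two measurements, assuming exponential growth:
    DT = t * ln(2) / ln(V2 / V1)."""
    if vol_current_mm3 == vol_prior_mm3:
        return math.inf  # no change: infinite doubling time
    return interval_days * math.log(2) / math.log(vol_current_mm3 / vol_prior_mm3)

# Illustrative numbers: a nodule growing from 120 to 180 mm^3 over 90 days.
dt = doubling_time_days(vol_prior_mm3=120.0, vol_current_mm3=180.0, interval_days=90)
# roughly 154 days; doubling times under ~400 days are often viewed with suspicion
```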

A New Level of Decision Support

New paradigms in reporting

The New Narrative Report

Reporting the Metadata: AIM

Figure 4, AIM (annotation and image markup) is a new tool to expose image metadata and make it accessible for a variety of applications. (a) Image metadata. This shows an example image and excerpts from the radiology report. The graphical symbols drawn by the radiologist on the image (“markups” indicating measurements of a lesion—quantitative data) and the statements in the report about the patient, type of exam, technique, date, imaging observations, and anatomic localization (semantic data) collectively comprise the image metadata (“annotation”). These image metadata, if stored in a standardized, machine-accessible format, greatly enable many computer applications to help radiologists in their daily work. (b) ePAD rich Web client. The ePAD application provides a platform-independent and thin client implementation of an AIM-compliant image viewing workstation. Information about lesions that are marked up and reported by radiologists is captured and stored in AIM XML (or DICOM-SR [digital imaging and communications in medicine–structured report]). User-definable templates capture semantic information about lesions, such as shown in this case for oncology reporting, the type of lesion (target), anatomic location (liver), and type of imaging exam (baseline evaluation). (c) Radiology image information summarization application leveraging the utility of AIM-encoded image metadata. A cancer lesion–tracking application has queried AIM annotations created on different imaging studies (in this case, from three studies on April 3, June 6, and August 6, 2008). The application automatically calculates the sum of each target lesion measured on each imaging study date and summarizes the results in a table (left) and graph (right). Using metadata from the AIM annotations, the application also displays alternative response measures such as maximum length (red line) or cross-sectional area (black line) of the measured lesions. (Color version of figure is available online).
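The lesion-tracking behavior in (c) can be sketched against a greatly simplified, hypothetical XML layout standing in for AIM; the real AIM schema is far richer (and the annotations may instead be carried as DICOM-SR), but the principle, machine-readable measurements queried and summed per study date, is the same.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical, simplified stand-in for AIM XML (illustrative only).
AIM_XML = """
<annotations>
  <annotation studyDate="2008-04-03" lesion="target1"><length mm="32"/></annotation>
  <annotation studyDate="2008-04-03" lesion="target2"><length mm="21"/></annotation>
  <annotation studyDate="2008-06-06" lesion="target1"><length mm="25"/></annotation>
  <annotation studyDate="2008-06-06" lesion="target2"><length mm="18"/></annotation>
</annotations>
"""

def sum_target_lengths(xml_text):
    """Sum the longest diameters of all target lesions for each study date,
    as an oncologic response application might tabulate them."""
    totals = defaultdict(float)
    for ann in ET.fromstring(xml_text).iter("annotation"):
        totals[ann.get("studyDate")] += float(ann.find("length").get("mm"))
    return dict(totals)

sum_target_lengths(AIM_XML)
# {'2008-04-03': 53.0, '2008-06-06': 43.0}
```

Because the measurements live in structured metadata rather than free narrative text, this summation needs no text parsing at all; that is precisely the benefit AIM provides.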

Radiology in the cloud

Image and Report Exchange

Figure 5, Integrating the Healthcare Enterprise (IHE) describes a series of profiles known as XDS or Cross Enterprise Document Sharing. There is a variant to accommodate the large files that comprise images, known as XDS-I. IHE describes sets of transactions based on common standards so that multiple parties can design systems that can easily interact—true interoperability. The XDS profiles describe a “document source,” where a piece of patient data is created, and a “document consumer,” which is the destination for the data when exchanging with a remote site. There is a set of intermediaries that handles the exchange.
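At a very high level, the XDS actors can be modeled as in the toy Python sketch below. This is purely illustrative: real XDS transactions are web-service messages defined in the IHE technical framework, and the class and method names here are inventions for exposition, loosely echoing the profile's "provide and register", "registry query", and "retrieve" steps.

```python
class Registry:
    """Toy stand-in for the XDS document registry: indexes metadata only."""
    def __init__(self):
        self._index = {}

    def register(self, doc_id, metadata):
        self._index[doc_id] = metadata

    def query(self, patient_id):
        """Return the IDs of all documents registered for a patient."""
        return [d for d, m in self._index.items() if m["patient"] == patient_id]

class Repository:
    """Toy stand-in for the XDS document repository: stores the documents."""
    def __init__(self):
        self._docs = {}

    def provide_and_register(self, doc_id, content, registry, metadata):
        self._docs[doc_id] = content          # store the document itself
        registry.register(doc_id, metadata)   # register its metadata

    def retrieve(self, doc_id):
        return self._docs[doc_id]

# A document source publishes; a document consumer queries, then retrieves.
registry, repository = Registry(), Repository()
repository.provide_and_register(
    "doc1", b"<CT study>", registry,
    {"patient": "P123", "type": "imaging report"},
)
found = registry.query("P123")          # consumer asks: what exists for P123?
report = repository.retrieve(found[0])  # consumer fetches the document
```

The key design point the sketch preserves is the separation of concerns: the registry holds only searchable metadata, while the (possibly many) repositories hold the bulky documents and images themselves.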

CAD Everywhere

Miscellaneous functions in a radiology practice

Quality

Figure 6, (a) Integrating the Healthcare Enterprise (IHE) includes a Radiation Exposure Monitoring (REM) profile. It describes how to collect dose information from a modality and store it locally. It also describes a set of transactions to share it with an external registry. (b) A graphical representation demonstrates the flow of dosimetry information from modalities to a local archive, an analytics application, and ultimately a national registry. (Color version of figure is available online).
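The flow in (b) can be sketched as a minimal aggregation step. The record fields and the analytics shown (a mean CTDIvol per exam type) are illustrative assumptions; real REM implementations exchange DICOM Radiation Dose Structured Reports, and a registry submission carries far more detail.

```python
# Toy model of the dose-data flow: modality -> local archive -> analytics.
local_archive = []

def collect_from_modality(record):
    """A modality pushes a dose record; the local archive stores it."""
    local_archive.append(record)

def summarize_for_registry(archive):
    """Analytics step: mean CTDIvol per exam type, as one simple metric a site
    might review before submitting to an external registry."""
    by_exam = {}
    for r in archive:
        by_exam.setdefault(r["exam"], []).append(r["ctdi_vol_mgy"])
    return {exam: sum(vals) / len(vals) for exam, vals in by_exam.items()}

collect_from_modality({"exam": "CT head", "ctdi_vol_mgy": 55.0})
collect_from_modality({"exam": "CT head", "ctdi_vol_mgy": 45.0})
collect_from_modality({"exam": "CT abdomen", "ctdi_vol_mgy": 12.0})
summarize_for_registry(local_archive)
# {'CT head': 50.0, 'CT abdomen': 12.0}
```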

Research

Comparative Effectiveness Research

Research Recruitment

Big Data

Education

Conclusions

Figure 7, (a) This cycle of imaging demonstrates how the science of radiology supports the best practices of patient care and provides new knowledge and feedback to continually advance medical science. The informatics tools described throughout this review are the enablers of this cycle. (b) A practical example demonstrates how clinical decision support (CDS) leads the clinician to the best exam and how decision support tools assist the radiologist in making a specific diagnosis. New structured reporting and NLP tools quickly help to direct the patient to ongoing clinical trials. AIM, annotation and image markup; CDS, clinical decision support; CT, computed tomography; C-CT, CT without contrast; EMR, electronic medical record; MR, magnetic resonance; NLP, natural language processing; RIS, radiology information systems.
