
Automated Radiology-Pathology Module Correlation Using a Novel Report Matching Algorithm by Organ System

Objectives and Rationale

Radiology-pathology correlation is time-consuming and is not feasible in most clinical settings, with the notable exception of breast imaging. The purpose of this study was to determine if an automated radiology-pathology report pairing system could accurately match radiology and pathology reports, thus creating a feedback loop allowing for more frequent and timely radiology-pathology correlation.

Methods

An experienced radiologist created a matching matrix of radiology and pathology reports. These matching rules were then exported to a novel comprehensive radiology-pathology module. All distinct radiology-pathology pairings at our institution from January 1, 2016 to July 1, 2016 were included (n = 8999). The appropriateness of each radiology-pathology report pairing was scored as either “correlative” or “non-correlative.” Pathology reports relating to anatomy imaged in the specific imaging study were deemed correlative, whereas pathology reports describing anatomy not imaged with the particular study were denoted non-correlative.
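To illustrate the matching approach described above, the sketch below shows one way an organ-system matching matrix and a correlative/non-correlative score could be represented. The exam-type keys, specimen-site keywords, and function names are assumptions for demonstration only, not the module's actual rule set.

```python
# Illustrative sketch only: pairing radiology and pathology reports by organ system.
# The exam types, specimen sites, and names below are hypothetical examples.

# Matching matrix: each radiology exam type maps to the pathology specimen sites
# considered "correlative" (anatomy covered by that imaging study).
MATCHING_MATRIX = {
    "CT abdomen/pelvis": {"liver", "pancreas", "kidney", "colon", "ovary"},
    "MRI brain": {"brain", "pituitary", "meninges"},
    "US breast": {"breast"},
    "Mammogram": {"breast"},
}

def classify_pairing(exam_type: str, specimen_site: str) -> str:
    """Score a radiology-pathology report pairing as "correlative" when the
    pathology specimen site falls within the anatomy imaged by the exam type,
    otherwise "non-correlative"."""
    allowed_sites = MATCHING_MATRIX.get(exam_type, set())
    return "correlative" if specimen_site.lower() in allowed_sites else "non-correlative"

# A breast ultrasound paired with a breast biopsy report is correlative,
# while the same exam paired with a thyroid specimen is not.
print(classify_pairing("US breast", "Breast"))   # -> correlative
print(classify_pairing("US breast", "Thyroid"))  # -> non-correlative
```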

Results

Overall, there was 88.3% correlation (accuracy) of the radiology and pathology reports (n = 8999). Subset analysis demonstrated that computed tomography (CT) abdomen/pelvis, CT head/neck/face, CT chest, musculoskeletal CT (excluding spine), mammography, magnetic resonance imaging (MRI) abdomen/pelvis, MRI brain, musculoskeletal MRI (excluding spine), breast MRI, positron emission tomography (PET), breast ultrasound, and head/neck ultrasound all demonstrated greater than 91% correlation. When further stratified by imaging modality, CT, MRI, mammography, and PET demonstrated excellent correlation (greater than 96.3%). Ultrasound and non-PET nuclear medicine studies demonstrated poorer correlation (80%).

Conclusion

There is excellent correlation between radiology imaging reports and the appropriate pathology reports when matched by organ system. Rapid, appropriate radiology-pathology report pairings provide an excellent opportunity to close the feedback loop to the interpreting radiologist.

Introduction

Radiology-pathology correlation is an essential component of learning radiology. Accurate and timely feedback, such as that provided by radiology-pathology correlation, is a crucial element in developing expertise and accuracy in diagnosis. However, aside from breast imaging, rigorous radiology-pathology correlation is performed haphazardly, with the majority of the correlation requiring the radiologist to actively seek pathologic results via the medical record or discussion with clinicians. Unfortunately, this process results in inadequate radiology-pathology correlation and missed opportunities for valuable feedback to radiologists, and it may also propagate inaccurate information.

We created an automated radiology-pathology module to convey radiology and pathology reports to the interpreting radiologist and trainee, allowing radiologists to receive appropriate feedback for all pathology results available following the interpretation of imaging studies. The module alerts the reporting radiologist to matching pathologic pairings via both a computerized radiology-pathology module integrated with the Picture Archiving and Communication System (PACS) and a secured e-mail. The radiologist then has the opportunity to review the imaging study through the radiology-pathology module in our PACS while simultaneously making a decision about concordance of the radiology interpretation with the pathologic diagnosis. Prior studies have acknowledged the value of radiology-pathology correlation by testing modules that use natural language processing, with one study reporting moderate accuracy (71%) in pairing appropriate reports. However, despite their potential benefits, these systems have not been routinely adopted by the radiology community, because seamless integration into a radiologist's daily workflow requires high accuracy in matching reports.
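As a minimal sketch of the notification-and-review loop described here, the code below models a pathology follow-up item, an alert to the interpreting radiologist, and the recording of a concordance decision. The class and function names and the worklist/e-mail stand-ins are hypothetical illustrations, not the actual PACS or secure e-mail integration.

```python
# Minimal sketch (hypothetical names): a pathology follow-up is queued to the
# radiologist's worklist and announced by a secured e-mail stand-in; the
# radiologist later records a concordance decision.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PathologyFollowUp:
    accession_number: str           # identifier of the imaging study
    radiology_exam_type: str        # e.g., "US breast"
    pathology_report_id: str        # identifier of the matched pathology report
    interpreting_radiologist: str
    concordance: Optional[str] = None  # filled in after the radiologist's review

def notify_radiologist(follow_up: PathologyFollowUp, worklist: list) -> None:
    """Add the pairing to the radiologist's follow-up queue and send an alert
    (represented here by a print statement instead of a secured e-mail)."""
    worklist.append(follow_up)
    print(f"Alert to {follow_up.interpreting_radiologist}: new pathology "
          f"follow-up for study {follow_up.accession_number}")

def record_concordance(follow_up: PathologyFollowUp, decision: str) -> None:
    """Store the radiologist's decision ("concordant" or "discordant")."""
    follow_up.concordance = decision

# Example usage
worklist = []
item = PathologyFollowUp("ACC-0001", "US breast", "PATH-0001", "Dr. Example")
notify_radiologist(item, worklist)
record_concordance(item, "concordant")
```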


Methods

Radiology-Pathology Report Matching


Figure 1, Primordial communicator demonstrating 10 New Pathology Follow-ups in the radiologist's queue, displayed conveniently on the radiologist's dashboard. Clicking “>” launches the radiology-pathology module.

Figure 2, Radiology-pathology module demonstrating ease of access to imaging and the associated pathology report. Clicking the x-ray (denoted by the arrow) brings the appropriate imaging study up for the radiologist's review.


Subjects


Radiology-Pathology Correlation Assessment


Radiologist Survey


Figure 3, Radiologist survey assessing the efficacy of the radiology-pathology e-mail notification system, radiologist satisfaction, and pathology follow-up.


Statistics


TABLE 1

Correlative Results of the Radiology-Pathology Module Separated by Specific Examination Type

| Exam Type | % Correlative (n) | % Non-correlative (n) | Total (n) |
| --- | --- | --- | --- |
| MRI breast | 100 (64) | 0 (0) | 64 |
| US breast | 100 (587) | 0 (0) | 587 |
| Mammogram | 99.9 (805) | 0.1 (1) | 806 |
| PET | 99.2 (236) | 0.8 (2) | 238 |
| MRI A/P | 98.7 (1032) | 1.3 (14) | 1046 |
| CT chest | 98.5 (258) | 1.5 (4) | 262 |
| CT abdomen/pelvis | 97.4 (1174) | 2.6 (31) | 1205 |
| CT head/neck/face | 95.0 (264) | 5.0 (14) | 278 |
| MRI MSK | 94.6 (53) | 5.4 (3) | 56 |
| CT MSK | 92.0 (23) | 8.0 (2) | 25 |
| MRI brain | 91.9 (249) | 8.1 (22) | 271 |
| US head/neck | 91.1 (1117) | 8.9 (109) | 1226 |
| US pelvis | 89.3 (941) | 10.7 (113) | 1054 |
| IR | 81.7 (398) | 18.3 (89) | 487 |
| MRI cardiac/chest | 80.0 (8) | 20.0 (2) | 10 |
| NM (non-PET) | 80.0 (228) | 20.0 (57) | 285 |
| US MSK | 80.0 (12) | 20.0 (3) | 15 |
| CT spine | 76.5 (13) | 23.5 (4) | 17 |
| MRI spine | 74.6 (44) | 25.4 (15) | 59 |
| US abdomen (peds only) | 71.4 (45) | 28.6 (18) | 63 |
| XR fluoroscopy | 68.8 (154) | 31.2 (70) | 224 |
| US abdomen (all) | 38.8 (284) | 61.2 (448) | 732 |
| US abdomen (adult only) | 35.7 (239) | 64.3 (430) | 669 |
| US chest | 33.3 (1) | 66.7 (2) | 3 |
| US vascular | 6.7 (2) | 93.3 (28) | 30 |
| XR swallow | 0 (0) | 100.0 (19) | 19 |
| Total | 88.3 | 11.7 (1052) | 8999 |

A/P, abdomen/pelvis; CT, computed tomography; IR, interventional radiology; MRI, magnetic resonance imaging; MSK, musculoskeletal; NM, nuclear medicine; peds, pediatrics; PET, positron emission tomography; US, ultrasound; XR, x-ray.

TABLE 2

Correlative Results of the Radiology-Pathology Module Separated by Specific Modality

| Modality | % Correlative (n) | % Non-correlative (n) | Total (n) |
| --- | --- | --- | --- |
| Mammogram | 99.9 (805) | 0.1 (1) | 806 |
| PET | 99.2 (236) | 0.8 (2) | 238 |
| All CT | 96.9 (1732) | 3.1 (55) | 1787 |
| All MRI | 96.3 (1450) | 3.7 (56) | 1506 |
| IR | 81.7 (398) | 18.3 (89) | 487 |
| All US | 80.7 (2944) | 19.3 (703) | 3647 |
| NM (non-PET) | 80.0 (228) | 20.0 (57) | 285 |
| All XR | 63.4 (154) | 36.6 (89) | 243 |

CT, computed tomography; IR, interventional radiology; MRI, magnetic resonance imaging; NM, nuclear medicine; PET, positron emission tomography; US, ultrasound; XR, x-ray.
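The modality-level rates in Table 2 are straightforward roll-ups of the exam-type counts in Table 1. The sketch below reproduces the “All CT” row from the five CT exam types; the variable and function names are illustrative only.

```python
# Roll up Table 1 exam-type counts into a Table 2 modality rate (CT shown).
table1_ct_counts = {
    # exam type: (correlative n, non-correlative n)
    "CT abdomen/pelvis": (1174, 31),
    "CT chest": (258, 4),
    "CT head/neck/face": (264, 14),
    "CT MSK": (23, 2),
    "CT spine": (13, 4),
}

def modality_rate(counts: dict) -> tuple:
    """Return (% correlative, correlative n, total n) for a group of exam types."""
    correlative = sum(c for c, _ in counts.values())
    total = sum(c + nc for c, nc in counts.values())
    return round(100 * correlative / total, 1), correlative, total

print(modality_rate(table1_ct_counts))  # -> (96.9, 1732, 1787), the "All CT" row
```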


Results

Radiology-Pathology Correlation Assessment


Radiologist Survey


TABLE 3

Radiologist Survey Results Assessing the Radiology-Pathology E-mail Notification System

| Survey Question | Response Options | Responses (n) |
| --- | --- | --- |
| How easy is it to use the new Rad-Path e-mail application? | 1 Hard / 2 Somewhat hard / 3 Not hard / 4 Easy / 5 Very easy | 0 / 1 / 1 / 10 / 6 |
| How many Rad-Path e-mail notifications do you receive per week? | 0–5 / 5–10 / >10 | 5 / 6 / 7 |
| On average, how long did it take to look up findings (eg, surgical note, pathology note) for a case you read prior to the start of Rad-Path e-mail notifications? | <5 min / 5–10 min / >10 min | 6 / 8 / 4 |
| How satisfied are you in terms of how the imaging information is presented on the Rad-Path e-mails? | 1 Very unsatisfied / 2 Unsatisfied / 3 Indifferent / 4 Satisfied / 5 Very satisfied | 1 / 1 / 2 / 10 / 4 |
| How satisfied are you in terms of how the clinical information is presented on the Rad-Path e-mails? | 1 Very unsatisfied / 2 Unsatisfied / 3 Indifferent / 4 Satisfied / 5 Very satisfied | 1 / 2 / 1 / 11 / 3 |
| How have your imaging interpretations been influenced by the implementation of Rad-Path e-mails? | No change / Change to wording / Highlighting specific pathologies | 9 / 4 / 5 |
| What is/will be the biggest benefit of the Rad-Path e-mail notification system? | Improving imaging interpretation / Performing research studies / Department quality assurance/quality improvement / Collecting teaching cases | 12 / 1 / 4 / 1 |
| What changes would you like to see (if any) to the Rad-Path notification process? | Open-ended | Speed up link to update; perhaps include history/physical, op/procedure note; clicking on the e-mail should not require going to website; improve specificity; turn off op notes |

OP, operative; Rad-Path, radiology-pathology.


TABLE 4

Radiologist Survey Results Assessing Impact of the Radiology-Pathology E-mail Notification System on Radiologist Satisfaction and Pathology Follow-up

| Survey Question | Response Options | Responses (n) |
| --- | --- | --- |
| How many cases per week did you look up findings (eg, surgical note, pathology note) prior to the start of Rad-Path e-mails? | 0–5 / 5–10 / >10 | 9 / 5 / 4 |
| How many cases per week did you look up findings after the start of Rad-Path e-mail notifications? | 0–5 / 5–10 / >10 | 6 / 9 / 3 |
| Which type of cases did you typically follow up prior to the start of Rad-Path e-mails? | “Interesting” cases / “Unusual” cases / All / Tumors | 13 / 4 / 1 |
| Which type of cases did you typically follow up after the start of Rad-Path e-mails? | “Interesting” cases / “Unusual” cases / All / Tumors | 4 / 1 / 12 / 1 |
| How satisfied were you in terms of following up cases prior to Rad-Path e-mails? | 1 Very unsatisfied / 2 Unsatisfied / 3 Indifferent / 4 Satisfied / 5 Very satisfied | 2 / 7 / 3 / 5 / 1 |
| How satisfied were you in terms of following up cases after the start of the Rad-Path e-mails? | 1 Very unsatisfied / 2 Unsatisfied / 3 Indifferent / 4 Satisfied / 5 Very satisfied | 1 / 0 / 1 / 6 / 10 |

Rad-Path, radiology-pathology.


Discussion


Conclusion


Take-home Points


References

  • 1. Murphey M.D., Madewell J.E., Olmsted W.W., et al.: A history of radiologic pathology correlation at the Armed Forces Institute of Pathology and its evolution into the American Institute for Radiologic Pathology. Radiology 2012; 262: pp. 623-634.

  • 2. Delic J.A., Fuhrman C.R., Trejo Bittar H.E.: Pulmonary alveolar microlithiasis: AIRP best cases in radiologic-pathologic correlation. Radiographics 2016; 36: pp. 1334-1338.

  • 3. Kantartzis S.N., Dacic S., Strollo D.C.: AIRP best cases in radiologic-pathologic correlation: pulmonary sarcoidosis complicated by aspergilloma formation. Radiographics 2012; 32: pp. 469-473.

  • 4. Walton W.J., Flores R.R.: Desmoplastic small round cell tumor of the kidney: AIRP best cases in radiologic-pathologic correlation. Radiographics 2016; 36: pp. 1533-1538.

  • 5. Hartman M., Silverman J., Spruill L., et al.: Radiologic-pathologic correlation—an advanced fourth-year elective: how we do it. Acad Radiol 2016; 23: pp. 889-893.

  • 6. Syed M.: Black box thinking: why most people never learn from their mistakes—but some do. New York (NY): Penguin; 2015.

  • 7. Sorace J., Aberle D.R., Elimam D., et al.: Integrating pathology and radiology disciplines: an emerging opportunity? BMC Med 2012.

  • 8. Rubin D.L., Desser T.S.: A data warehouse for integrating radiologic and pathologic data. J Am Coll Radiol 2008; 5: pp. 210-217.

  • 9. Greenes R., Bauman R., Robboy S., et al.: Immediate pathologic confirmation of radiologic interpretations by computer feedback. Radiology 1978; 127: pp. 381-383.

  • 10. Arnold C.W., Wallace W.D., Chen S., et al.: RadPath: a web-based system for integrating and correlating radiology and pathology findings during cancer diagnosis. Acad Radiol 2016; 23: pp. 90-100. PubMed PMID 26521686.

  • 11. Kelehan L., Kalaria A., Filice R.: Keeping abreast of breast imagers: radiology pathology correlation for the rest of us. SIIM Scientific Session, 2016.

  • 12. American Institute for Radiologic Pathology: Four Week Radiologic Pathology Correlation Course. American College of Radiology. Available at: http://www.airp.org/resident-courses/four-week-course
