Rationale and Objectives
The aims of this study were to review the role of a quality information program (QUIP) as a semiautomated educational feedback mechanism and to characterize common errors in cross-sectional abdominal and pelvic studies as an initiative for continuing medical education and improved patient care.
Materials and Methods
Abdominal and pelvic errors identified by QUIP and cases collected from morbidity and mortality conferences were reviewed. Errors were classified and graded by level of morbidity.
Results
There were 222 errors in 218 patients over the 4 years of this study. One hundred thirteen (51%) were identified after the introduction of QUIP (January to December 2009). One hundred thirty-eight studies (61%) were read independently, while 84 (39%) were double-read. Sixty-five percent of errors (145 of 222) were false-negatives, of which 45 (31%) were “satisfaction-of-search” errors. There were 62 cognitive errors (28%), nine technical errors (4%), eight communication errors (4%), six ordering errors (3%), and five false-positives (2%). Seventy-six percent of errors were identified on computed tomography (n = 168); fewer cases involved ultrasound (n = 20 [9%]) and magnetic resonance imaging (n = 34 [15%]). Forty-one percent of errors resulted in no change to patient outcomes, 40% caused minor patient morbidity, and 19% caused major patient morbidity, including three cases (1%) that likely contributed to patient death.
Conclusions
Most abdominopelvic errors in this study were classified as false-negatives. Many can be attributed to satisfaction-of-search errors. Implementing a simple, semiautomated QUIP allows timely feedback regarding errors to radiologists. This may improve the quality of health care while allowing radiologists the opportunity to learn from each case they are involved in.
With the increasing use of diagnostic imaging for patient diagnosis, management, and follow-up, the role and scope of radiologists in image synthesis are expanding. Identifying relevant findings pertaining to patient symptoms as well as reporting significant incidental or unexpected findings are among the key responsibilities of our specialty.
At our teaching institution, we have developed a quality information program (QUIP), a confidential, semiautomated method of documenting and following up on subsequently identified errors. QUIP includes an e-mail template accessible from a desktop Outlook (Microsoft Corporation, Redmond, WA) public e-mail folder or anonymously through the agency of a transcriptionist (Fig 1). The Outlook folder is part of a virtual PC that is accessible on the third screen of our picture archiving and communication system (PACS; McKesson Corporation, San Francisco, CA) but is a separate system. This shortcut to the virtual machine servers opens a virtual session as if on a regular PC and does not use any of the PACS's computing resources. The Outlook e-mail template is preaddressed to the appropriate recipients, including the section chief (eg, of abdominopelvic radiology) and the administrative assistant. It has fields that can be quickly filled in with case identifiers, the date of the study in question, and a brief account of the event requiring attention (such as inappropriate protocolling, interpretation issues, or transcription errors). Finally, each QUIP case is also addressed to the radiologist (or radiologists) who created the original report and would therefore benefit most from this information (Fig 1). It is sent as an e-mail titled “QUIP” to allow easy identification. The recipient radiologist is required to reply to the section chief by checking off one of four options at the bottom of the standardized QUIP form: (1) reviewing the case, (2) making an addendum to the original report, (3) contacting the referring clinician (if the finding is of sufficient clinical importance), or (4) other, in rare cases involving extenuating circumstances.
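The QUIP report described above is essentially a small structured record: case identifiers, a study date, an event summary, fixed recipients, a fixed subject line, and one of four required reply options. A minimal sketch of that record follows; the class, field, and option names are illustrative assumptions, not the actual form used at the institution.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum
from typing import List, Optional

class Action(Enum):
    """The four reply options the recipient radiologist chooses from (names assumed)."""
    CASE_REVIEWED = 1        # reviewing the case
    ADDENDUM_ISSUED = 2      # making an addendum to the original report
    CLINICIAN_CONTACTED = 3  # contacting the referring clinician
    OTHER = 4                # rare extenuating circumstances

@dataclass
class QuipReport:
    case_id: str                           # case identifier
    study_date: date                       # date of the study in question
    event_summary: str                     # brief account of the event requiring attention
    recipients: List[str] = field(default_factory=list)  # section chief, assistant, reporting radiologist(s)
    subject: str = "QUIP"                  # fixed title for easy identification
    action_taken: Optional[Action] = None  # filled in by the recipient's required reply

# Hypothetical example of a report and its reply
report = QuipReport("ABD-0001", date(2009, 3, 15),
                    "Interpretation issue on CT abdomen; please review.")
report.action_taken = Action.ADDENDUM_ISSUED
```

Keeping the subject line constant and the reply constrained to an enumerated set is what makes the mechanism semiautomated: reports are easy to filter, and every one can be tracked to a closed-out action.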
Figure 1
Screen-captured image of a standardized quality information program (QUIP) e-mail demonstrating the information usually included, with standardized wording and editable variables.
Materials and Methods
Results
Table 1
Rates of QUIP Reporting by Quarter During the First Year of Implementation (2009)
Quarter                Number of QUIP Reports
January to March       24
April to June          23
July to September      29
October to December    37
QUIP, quality information program.
Table 2
Classification and Rates of Error Types
Error Type                    n     %
Perceptual: false-negative    145   65
Satisfaction of search        45    20 (31% of false-negatives)
Cognitive                     62    28
Technical or other            9     4
Communication                 8     4
Ordering issue                6     3
Perceptual: false-positive    5     2
The sum of the errors by type exceeds the number of error cases identified, because several cases involved more than one type of error.
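The footnote above can be illustrated with a toy tally (invented labels, not the study's data): when each case carries a set of error-type labels, summing the per-type counts double-counts multi-type cases, so the total exceeds the number of cases.

```python
from collections import Counter

# Hypothetical cases, each tagged with one or more error types
cases = [
    {"false-negative"},
    {"false-negative", "communication"},  # one case carrying two error types
    {"cognitive"},
]

# Per-type counts across all cases
type_counts = Counter(label for labels in cases for label in labels)

total_by_type = sum(type_counts.values())  # 4 labels across only 3 cases
```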
Table 3
Rates of Reported Errors by Organ or Anatomic Structure in the Abdomen and Pelvis
Organ                        n    %
Liver                        37   17
Kidneys                      29   13
Lymph nodes                  26   12
Large bowel/appendix         26   12
Bones                        23   10
Omentum/peritoneum           16   7
Vascular                     14   6
Gynecologic organs           14   6
Pancreas                     13   6
Subcutaneous tissue          9    4
Gallbladder/biliary system   7    3
Small bowel                  6    3
Adrenal                      6    3
Lung bases                   5    2
Stomach                      4    2
Spleen                       2    1
Bladder                      1    <1
Esophagus                    1    <1
Scrotum                      1    <1
Prostate                     1    <1
Table 4
Evaluation of Errors by Patient Morbidity and Mortality
Degree of Morbidity and Mortality   n    %
No change to patient outcome        91   41
Minor morbidity                     89   40
Moderate morbidity                  39   18
Mortality                           3    1
Discussion
Conclusions