Rationale and Objectives
It is common practice in academic hospitals for radiology residents to provide preliminary interpretations of radiologic examinations performed in the emergency department (ED) during off-hours. In this study, we used a software program called Minerva to identify and track discrepancies between resident and faculty interpretations of ED studies. The objective was to determine whether missed case conferences could reduce the number of resident discrepancies related to the types of cases reviewed.
Materials and Methods
We used Minerva to identify faculty-modified resident preliminary reports and to grade each discrepancy as minor or major, depending on whether it had the potential to affect patient management or outcome. Minor and major discrepancy rates were calculated for all residents to evaluate call performance, establish benchmarks, and develop interventions to reduce the number of discrepant cases.
Results
The total discrepancy rate for all residents (n = 22) was 2.6% with a standard deviation (SD) of 0.7%. The average major discrepancy rate for all residents was 1.1% with an SD of 0.4%. Trend analysis of missed cases was used to generate topic-specific resident missed case conferences on acromioclavicular joint separation injuries, elbow joint effusions, and osteochondral fractures, which resulted in an overall 64% decrease in the number of missed cases related to these injuries.
Conclusions
The systematic evaluation of resident discrepancies using a simple software application provides a competency-based metric to assess call performance, establish benchmarks, and develop missed case conferences. This process is expected to further reduce resident discrepancy rates and missed cases.
Improving quality and safety in radiology is becoming increasingly important given the dramatic conclusions of the Institute of Medicine’s report “To Err is Human,” which states that an estimated 100,000 lives per year are lost to medical errors. Studies subsequent to the Institute of Medicine’s report identify suboptimal radiology processes and the lack of useful outcome data in the radiology literature as contributors to the overwhelming number of medical errors and the associated economic costs, estimated at more than $38 billion annually. A critical component of improving radiology quality and safety is defining radiology quality metrics and developing the information technology systems to quantify and track them. Peer review is a methodology used to evaluate radiologist performance with the ultimate goal of reducing errors and improving patient care. Although many departments have established peer review systems in place for evaluating and reporting radiologist performance, there are few equivalent programs for evaluating and tracking radiology resident performance.
As part of an ongoing quality assurance project, we have developed a database application called Minerva that accesses the Radiology Information System (RIS) database, identifies all preliminary interpretations provided by residents during independent call, and allows grading and tracking of minor and major discrepancies between resident and faculty interpretations. Discrepancy rates were calculated and used as a competency-based metric to evaluate call performance, establish benchmarks, and track trends in discrepant cases. Trend analysis of discrepant cases was used to generate topic-specific resident missed case conferences. The purpose of this study was to determine whether resident missed case conferences focusing on specific types of missed cases could reduce the number of missed cases, thereby improving discrepancy rates.
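The article does not reproduce Minerva’s implementation, but the workflow it describes (query the RIS for resident preliminary reports, record the faculty grade assigned to each report, and roll the grades up into per-resident minor and major discrepancy rates) can be sketched in a few lines of Python. This is a minimal sketch under assumed names: the table `preliminary_reports`, its columns, and the grade labels are illustrative assumptions, not Minerva’s actual schema.

```python
# Minimal sketch (not Minerva's actual implementation) of tallying per-resident
# discrepancy rates from graded preliminary reports stored in a RIS-style database.
# Table name, column names, and grade labels are assumptions for illustration.
import sqlite3
from collections import defaultdict

def discrepancy_rates(db_path):
    """Return {resident_id: (minor_rate_pct, major_rate_pct, total_reports)}."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        """
        SELECT resident_id, grade      -- grade: 'none', 'minor', or 'major'
        FROM preliminary_reports
        WHERE report_type = 'resident_preliminary'
        """
    ).fetchall()
    conn.close()

    # Count total reports and graded discrepancies for each resident.
    counts = defaultdict(lambda: {"total": 0, "minor": 0, "major": 0})
    for resident_id, grade in rows:
        counts[resident_id]["total"] += 1
        if grade in ("minor", "major"):
            counts[resident_id][grade] += 1

    # Convert counts to percentage rates per resident.
    rates = {}
    for resident_id, c in counts.items():
        rates[resident_id] = (
            100.0 * c["minor"] / c["total"],
            100.0 * c["major"] / c["total"],
            c["total"],
        )
    return rates
```

Benchmarks such as the 2.6% ± 0.7% total discrepancy rate reported in the abstract would then follow from averaging these per-resident rates across the residency.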
Materials and Methods
Results
Table 1
Emergency Department Studies Interpreted by Radiology Residents with Percentage of Minor and Major Discrepancies by Modality
Modality                                      Total Number   % Minor   % Major
Conventional radiography                      8996           1.03      0.92
Neuro computed tomography                     1045           2.20      1.82
Cardiothoracic and body computed tomography   1900           1.89      1.26
Ultrasound                                    1196           4.10      1.17
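As a rough cross-check, the per-modality percentages in Table 1 can be volume-weighted to give pooled discrepancy rates across all studies. The arithmetic below is our own derivation from Table 1, not a calculation reported by the authors, and a study-weighted pooled rate need not equal the per-resident averages quoted in the abstract.

```python
# Volume-weighted pooled discrepancy rates derived from Table 1.
# Values: (number of studies, % minor discrepancies, % major discrepancies).
table1 = {
    "Conventional radiography":                  (8996, 1.03, 0.92),
    "Neuro computed tomography":                 (1045, 2.20, 1.82),
    "Cardiothoracic and body computed tomography": (1900, 1.89, 1.26),
    "Ultrasound":                                (1196, 4.10, 1.17),
}

total = sum(n for n, _, _ in table1.values())
minor = sum(n * pct_minor for n, pct_minor, _ in table1.values()) / total
major = sum(n * pct_major for n, _, pct_major in table1.values()) / total

print(f"Studies: {total}")                         # 13137
print(f"Pooled minor rate: {minor:.2f}%")          # ~1.53%
print(f"Pooled major rate: {major:.2f}%")          # ~1.06%
print(f"Pooled total rate: {minor + major:.2f}%")  # ~2.59%
```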
Discussion
References
1. Stefl M.E.: To err is human: building a safer health system in 1999. Front Health Serv Manage 2001; 18: pp. 1-2.
2. Johnson C.D., Krecke K.N., Miranda R., et al.: Quality initiatives: developing a radiology quality and safety program: a primer. Radiographics 2009; 29: pp. 951-959.
3. Kruskal J.B., Anderson S., Yam C.S., et al.: Strategies for establishing a comprehensive quality and performance improvement program in a radiology department. Radiographics 2009; 29: pp. 315-329.
4. Beam C.A., Layde P.M., Sullivan D.C.: Variability in the interpretation of screening mammograms by US radiologists. Findings from a national sample. Arch Intern Med 1996; 156: pp. 209-213.
5. Johnson C.D., Harmsen W.S., Wilson L.A., et al.: Prospective blinded evaluation of computed tomographic colonography for screen detection of colorectal polyps. Gastroenterology 2003; 125: pp. 311-319.
6. Aakre K.T., Johnson C.D.: Plain-radiographic image labeling: a process to improve clinical outcomes. J Am Coll Radiol 2006; 3: pp. 949-953.
7. Halsted M.J.: Radiology peer review as an opportunity to reduce errors and improve patient care. J Am Coll Radiol 2004; 1: pp. 984-987.
8. Tieng N., Grinberg D., Li S.F.: Discrepancies in interpretation of ED body computed tomographic scans by radiology residents. Am J Emerg Med 2007; 25: pp. 45-48.
9. Meyer R.E., Nickerson J.P., Burbank H.N., et al.: Discrepancy rates of on-call radiology residents’ interpretations of CT angiography studies of the neck and circle of Willis. AJR Am J Roentgenol 2009; 193: pp. 527-532.
10. Wechsler R.J., Spettell C.M., Kurtz A.B., et al.: Effects of training and experience in interpretation of emergency body CT scans. Radiology 1996; 199: pp. 717-720.
11. Wysoki M.G., Nassar C.J., Koenigsberg R.A., et al.: Head trauma: CT scan interpretation by radiology residents versus staff radiologists. Radiology 1998; 208: pp. 125-128.
12. Carney E., Kempf J., DeCarvalho V., et al.: Preliminary interpretations of after-hours CT and sonography by radiology residents versus final interpretations by body imaging radiologists at a level 1 trauma center. AJR Am J Roentgenol 2003; 181: pp. 367-373.
13. Ruchman R.B., Jaeger J., Wiggins E.F., et al.: Preliminary radiology resident interpretations versus final attending radiologist interpretations and the impact on patient care in a community hospital. AJR Am J Roentgenol 2007; 189: pp. 523-526.
14. Erly W.K., Berger W.G., Krupinski E., et al.: Radiology resident evaluation of head CT scan orders in the emergency department. AJNR Am J Neuroradiol 2002; 23: pp. 103-107.
15. Cooper V.F., Goodhartz L.A., Nemcek A.A., et al.: Radiology resident interpretations of on-call imaging studies: the incidence of major discrepancies. Acad Radiol 2008; 15: pp. 1198-1204.
16. Walls J., Hunter N., Brasher P.M., et al.: The DePICTORS Study: discrepancies in preliminary interpretation of CT scans between on-call residents and staff. Emerg Radiol 2009; 16: pp. 303-308.