Outcomes Knowledge May Bias Radiological Decision-making

Rationale and Objectives

This study investigates whether an expectation of abnormality and prior outcome knowledge influence the decision-making of radiologists, and discusses the implications for radiological expert witness testimony.

Materials and Methods

This study was a web-based perception experiment.

A total of 12 board-certified radiologists were asked to interpret 40 adult chest images (20 abnormal) twice and to decide whether pulmonary lesions were present. Before the first viewing, the same general clinical history was given for all images: cough for 3+ weeks. This was called the “defendant’s read.”

Two weeks later, the radiologists were asked to view the same dataset, unaware that it was unchanged. For this reading, the radiologists were given the following information for all images: “These images were reported normal but all of these patients have a lung tumour diagnosed on a subsequent radiograph 6 months later.” They were also given the lobar location of the newly diagnosed tumor. This was called the “expert witness read.”

Results

There was a significant difference in location-based sensitivity between the two conditions (W = −45, P = 0.02), with nodule detection increasing under the second condition. In the “expert witness” read, specificity increased significantly outside the lobe of interest (W = 727, P < 0.0001) and decreased significantly within it (W = −237, P = 0.03). Case-based sensitivity and case-based specificity were unaffected.

Conclusions

This study provides evidence that increased clinical information affects the performance of radiologists. This effect may bias expert witnesses in radiological malpractice litigation.

Introduction

“The initial chest x-ray should have been reported as abnormal. If it had been, he would have been diagnosed at the time and sought treatment. The defence argued that the tumour on the original x-ray was only obvious to see once the later films had been considered, as any reviewing radiologist expert would know exactly where to look.”

Materials and Methods

Subjects

Table 1

Radiologist Demographics

| Board Certified (Yrs) | Country Certified |
| --- | --- |
| 15 | Australia |
| 9 | UK |
| 12 | USA |
| 5 | UK |
| 15 | USA |
| 12 | USA |
| 11 | Bangladesh |
| 4 | France |
| 8 | Romania |
| 25 | Australia |
| 3 | Egypt |
| 8 | Czech Republic |
| Median: 10 | |
| Mean: 10.58 | |

Image Bank

Table 2

Location and Size of Nodules

| Case | Conspicuity | Size (mm) | Size (Pixels) | Location |
| --- | --- | --- | --- | --- |
| 1 | 3 | 25 | 89.25 | RUL |
| 2 | 3 | 8 | 28.56 | LUL |
| 3 | 3 | 15 | 53.55 | LLL |
| 4 | 3 | 10 | 35.70 | LLL |
| 5 | 3 | 15 | 53.55 | LUL |
| 6 | 3 | 6 | 22.42 | RUL |
| 7 | 3 | 10 | 35.70 | RUL |
| 8 | 2 | 15 | 53.55 | RUL |
| 9 | 2 | 21 | 74.97 | RUL |
| 10 | 2 | 25 | 89.25 | RLL |
| 11 | 2 | 20 | 71.40 | LLL |
| 12 | 2 | 15 | 53.55 | RUL |
| 13 | 2 | 15 | 53.55 | RUL |
| 14 | 2 | 20 | 71.40 | LLL |
| 15 | 2 | 20 | 71.40 | RLL |
| 16 | 1 | 22 | 78.54 | RUL |
| 17 | 1 | 10 | 35.70 | RUL |
| 18 | 1 | 10 | 35.70 | RUL |
| 19 | 1 | 15 | 53.55 | LLL |
| 20 | 1 | 8 | 28.56 | RUL |

LUL, left upper lobe; RUL, right upper lobe; LLL, left lower lobe; RLL, right lower lobe.
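The pixel sizes in Table 2 follow from the millimetre sizes at a fixed scale. A minimal sketch of that conversion (in Python; the pixel spacing of roughly 0.28 mm/pixel is inferred from the table’s ratio, not stated in the paper):

```python
# Ratio inferred from Table 2, case 1 (89.25 px / 25 mm); this is an
# inference from the table, not a figure quoted in the paper.
PX_PER_MM = 89.25 / 25  # = 3.57 px/mm, i.e. ~0.28 mm per pixel

def mm_to_pixels(size_mm: float) -> float:
    """Convert a nodule diameter in millimetres to image pixels."""
    return size_mm * PX_PER_MM

for mm in (8, 15, 25):
    print(f"{mm} mm -> {mm_to_pixels(mm):.2f} px")
# 8 mm -> 28.56 px, 15 mm -> 53.55 px, 25 mm -> 89.25 px, matching Table 2.
# Case 6 (6 mm listed as 22.42 px) departs slightly from this ratio.
```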

Viewing

Figure 1. Image for undirected search (condition 1). (Color version of figure available online.)

Subject Instructions

Figure 2. Choice of confidence levels given after mouse click. (Color version of figure available online.)

Figure 3. Directed search with tumor location indication (condition 2). (Color version of figure available online.)

Analysis

Results

Table 3

Location Sensitivity, Case Sensitivity, and Specificity Data Are Shown with P Values for Intercondition Comparison Using the Wilcoxon Signed-rank Test

| Reader | Loc Sensitivity, Cond 1 (Cough) | Loc Sensitivity, Cond 2 (Tumor) | Case Sensitivity, Cond 1 (Cough) | Case Sensitivity, Cond 2 (Tumor) | Specificity, Cond 1 (Cough) | Specificity, Cond 2 (Tumor) |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | 0.50 | 0.60 | 0.85 | 0.95 | 0.20 | 0.05 |
| 2 | 0.30 | 0.20 | 0.60 | 0.20 | 0.80 | 0.90 |
| 3 | 0.40 | 0.30 | 0.80 | 0.90 | 0.20 | 0.35 |
| 4 | 0.10 | 0.10 | 0.70 | 0.50 | 0.45 | 0.35 |
| 5 | 0.30 | 0.50 | 0.55 | 0.65 | 0.70 | 0.60 |
| 6 | 0.35 | 0.55 | 0.65 | 0.80 | 0.80 | 0.60 |
| 7 | 0.45 | 0.45 | 0.65 | 0.60 | 0.65 | 0.90 |
| 8 | 0.25 | 0.50 | 0.80 | 0.65 | 0.75 | 0.50 |
| 9 | 0.45 | 0.55 | 0.65 | 0.65 | 0.70 | 0.85 |
| 10 | 0.45 | 0.50 | 1.00 | 0.90 | 0.20 | 0.30 |
| 11 | 0.35 | 0.50 | 0.80 | 0.95 | 0.50 | 0.30 |
| 12 | 0.30 | 0.65 | 0.60 | 0.70 | 0.70 | 0.45 |
| Median | 0.35 | 0.50 | 0.67 | 0.67 | 0.67 | 0.47 |
| Mean | 0.35 | 0.45 | 0.72 | 0.67 | 0.55 | 0.51 |
| P value | 0.02** | | 0.79 | | 0.45 | |
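The location-sensitivity comparison in Table 3 can be approximately reproduced from the per-reader values. A minimal sketch, assuming SciPy is available; exact W and P values depend on how a given implementation handles zero differences and ties, so small deviations from the reported W = −45, P = 0.02 are expected:

```python
# Per-reader location sensitivity from Table 3.
from scipy.stats import wilcoxon

cond1_cough = [0.50, 0.30, 0.40, 0.10, 0.30, 0.35, 0.45, 0.25, 0.45, 0.45, 0.35, 0.30]
cond2_tumor = [0.60, 0.20, 0.30, 0.10, 0.50, 0.55, 0.45, 0.50, 0.55, 0.50, 0.50, 0.65]

# SciPy reports the smaller of the positive/negative rank sums rather
# than the signed rank sum quoted in the paper.
stat, p = wilcoxon(cond1_cough, cond2_tumor)
print(f"rank sum = {stat}, P = {p:.3f}")
```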

Table 4

Incorrect Mouse Clicks outside Lobe of Interest

| Wilcoxon Matched-pairs Signed Rank Test | First Condition | Second Condition |
| --- | --- | --- |
| Number of cases | 40 | 40 |
| Number of incorrect mouse clicks | 219 | 68 |
| Median | 5.0 | 1.0 |
| Mean | 5.47 | 1.62 |
| P value | P < 0.0001 | |
| Sum of signed ranks (W) | 727.0 | |

Table 5

Incorrect Mouse Clicks inside Lobe of Interest

| Wilcoxon Matched-pairs Signed Rank Test | First Condition | Second Condition |
| --- | --- | --- |
| Number of cases | 40 | 40 |
| Number of incorrect mouse clicks | 115 | 153 |
| Median | 2.0 | 3.0 |
| Mean | 2.85 | 3.77 |
| P value | P = 0.03 | |
| Sum of signed ranks (W) | −237.0 | |

Table 6

JAFROC Analysis between Each Condition Using the Student t Test

| Reader | JAFROC, Condition 1 (Cough) | JAFROC, Condition 2 (Tumor) |
| --- | --- | --- |
| 1 | 0.64 | 0.68 |
| 2 | 0.53 | 0.57 |
| 3 | 0.44 | 0.53 |
| 4 | 0.58 | 0.65 |
| 5 | 0.58 | 0.68 |
| 6 | 0.53 | 0.51 |
| 7 | 0.35 | 0.30 |
| 8 | 0.30 | 0.25 |
| 9 | 0.31 | 0.37 |
| 10 | 0.65 | 0.69 |
| 11 | 0.41 | 0.48 |
| 12 | 0.46 | 0.51 |
| Median | 0.49 | 0.52 |
| Mean | 0.48 | 0.52 |
| P value | 0.03* | |
| Test statistic | F(1,11) = 0.16 | |

JAFROC, jackknife free-response receiver operating characteristic.
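As a rough cross-check of Table 6, a paired Student t test can be run on the per-reader figures of merit. This is a simplification: published JAFROC analyses jackknife the figure of merit and report an F statistic, so the sketch below (assuming SciPy) will not reproduce the reported values exactly:

```python
# Per-reader JAFROC figures of merit from Table 6.
from scipy.stats import ttest_rel

fom_cough = [0.64, 0.53, 0.44, 0.58, 0.58, 0.53, 0.35, 0.30, 0.31, 0.65, 0.41, 0.46]
fom_tumor = [0.68, 0.57, 0.53, 0.65, 0.68, 0.51, 0.30, 0.25, 0.37, 0.69, 0.48, 0.51]

# Paired t test across the 12 readers (a stand-in for the full
# jackknife analysis used by JAFROC software).
t, p = ttest_rel(fom_tumor, fom_cough)
print(f"t = {t:.2f}, P = {p:.3f}")
```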

Table 7

Nodule Location Sensitivity by Conspicuity of the Nodule, Where 3 Is the Most Obvious and 1 Is the Most Subtle

| Reader | Conspicuity 3, Read 1 | Conspicuity 3, Read 2 | Conspicuity 2, Read 1 | Conspicuity 2, Read 2 | Conspicuity 1, Read 1 | Conspicuity 1, Read 2 |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | 6 | 6 | 2 | 2 | 2 | 4 |
| 2 | 5 | 6 | 1 | 2 | 0 | 2 |
| 3 | 4 | 6 | 0 | 2 | 1 | 2 |
| 4 | 6 | 6 | 1 | 1 | 2 | 2 |
| 5 | 5 | 6 | 2 | 2 | 0 | 3 |
| 6 | 5 | 5 | 0 | 2 | 1 | 3 |
| 7 | 5 | 4 | 1 | 0 | 0 | 0 |
| 8 | 2 | 2 | 0 | 0 | 0 | 0 |
| 9 | 6 | 3 | 2 | 1 | 0 | 2 |
| 10 | 6 | 4 | 2 | 2 | 1 | 3 |
| 11 | 6 | 7 | 1 | 1 | 2 | 2 |
| 12 | 5 | 7 | 0 | 3 | 2 | 3 |
| Median | 5 | 6 | 1 | 2 | 1 | 2 |
| Mean | 5.1 | 5.2 | 1 | 1.5 | 0.9 | 2.1 |
| P value | 0.9 | | 0.2 | | 0.01** | |
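The three per-conspicuity comparisons in Table 7 can be sketched the same way, with one Wilcoxon signed-rank test per conspicuity level on the per-reader detection counts (assuming SciPy; zero-difference and tie handling may shift the P values slightly from the reported 0.9, 0.2, and 0.01):

```python
# Per-reader nodule detection counts from Table 7, keyed by conspicuity.
from scipy.stats import wilcoxon

counts = {
    3: ([6, 5, 4, 6, 5, 5, 5, 2, 6, 6, 6, 5], [6, 6, 6, 6, 6, 5, 4, 2, 3, 4, 7, 7]),
    2: ([2, 1, 0, 1, 2, 0, 1, 0, 2, 2, 1, 0], [2, 2, 2, 1, 2, 2, 0, 0, 1, 2, 1, 3]),
    1: ([2, 0, 1, 2, 0, 1, 0, 0, 0, 1, 2, 2], [4, 2, 2, 2, 3, 3, 0, 0, 2, 3, 2, 3]),
}
for level, (read1, read2) in counts.items():
    stat, p = wilcoxon(read1, read2)
    print(f"conspicuity {level}: P = {p:.2f}")
```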

Table 8

False Positive Decisions on the Lobe of Interest on Normal Images Only

| Reader | False Positives, Read 1 | False Positives, Read 2 |
| --- | --- | --- |
| 1 | 4 | 16 |
| 2 | 3 | 8 |
| 3 | 0 | 9 |
| 4 | 6 | 2 |
| 5 | 2 | 6 |
| 6 | 2 | 9 |
| 7 | 3 | 2 |
| 8 | 7 | 7 |
| 9 | 9 | 7 |
| 10 | 2 | 3 |
| 11 | 9 | 12 |
| 12 | 4 | 13 |
| Mean | 4.2 | 7.8 |
| Median | 3.5 | 7.5 |
| P value | 0.04** | |
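Table 8’s summary rows can be checked directly against the per-reader counts; a minimal sketch, again assuming SciPy:

```python
# Per-reader false-positive counts on normal images from Table 8.
from statistics import mean, median
from scipy.stats import wilcoxon

fp_read1 = [4, 3, 0, 6, 2, 2, 3, 7, 9, 2, 9, 4]
fp_read2 = [16, 8, 9, 2, 6, 9, 2, 7, 7, 3, 12, 13]

print(mean(fp_read1), median(fp_read1))  # 4.25 and 3.5 (Table 8: 4.2, 3.5)
print(mean(fp_read2), median(fp_read2))  # 7.83... and 7.5 (Table 8: 7.8, 7.5)

stat, p = wilcoxon(fp_read1, fp_read2)
print(f"P = {p:.3f}")  # near the reported 0.04
```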

Discussion

References

  • 1. Fitzgerald R.: Error in radiology. Clin Radiol 2001; 56: pp. 938-946.

  • 2. Berlin L.: Radiologic errors and malpractice: a blurry distinction. AJR Am J Roentgenol 2007; 189: pp. 517-521.

  • 3. Dickinson J.: The essentials of expert witnessing for the radiology nurse. J Radiol Nurs 2015; 34: pp. 8-12.

  • 4. Bal B.S.: The expert witness in medical malpractice litigation. Clin Orthop Relat Res 2009; 467: pp. 383-391.

  • 5. Campbell P.: I knew it all along. MDU J 2012; 28: pp. 7-9.

  • 6. Berlin L.: Malpractice issues in radiology: hindsight bias. AJR Am J Roentgenol 2000; 175: pp. 597-601.

  • 7. Baker S.R., Whang J.S., Luk L., et al.: The demography of medical malpractice suits against radiologists. Radiology 2013; 266: pp. 539-547.

  • 8. American Medical Association: Physician characteristics and distribution in the US. 2010 ed. Chicago, IL: American Medical Association; 2010: pp. 30-31, 97-149.

  • 9. Physician Insurers Association of America: Claim trend analysis study. Rockville, MD: Physician Insurers Association of America; 2004.

  • 10. Fileni A., Magnavita N.: A 12-year follow-up study of malpractice claims against radiologists in Italy. Radiol Med 2006; 111: pp. 1009-1022.

  • 11. Saber-Tehrani A.S., Lee H.W., Matthews S.C., et al.: 20-year summary of US malpractice claims for diagnostic errors from 1985–2005 (abstract). Proceedings of the Fourth Annual Diagnostic Error in Medicine Conference. Chicago, IL: Johns Hopkins University School of Medicine; 2011.

  • 12. Kundel H.L.: Perception errors in chest radiography. Semin Respir Med 1989; 10: pp. 203-210.

  • 13. Stavem K., Foss T., Botnmark O., et al.: Inter-observer agreement in audit of quality of radiology requests and reports. Clin Radiol 2004; 59: pp. 1018-1024.

  • 14. Baker S.R., Patel R.H., Yang L.: Malpractice suits in chest radiology: an evaluation of the histories of 8265 radiologists. J Thorac Imaging 2013; 28: pp. 388-391.

  • 15. Berlin L., Hendrix R.W.: Perceptual errors and negligence. AJR Am J Roentgenol 1998; 170: pp. 863-867.

  • 16. Reed W.M., Chow S.C., Chew L.E., et al.: Can prevalence expectations drive radiologists' behaviour? Acad Radiol 2013; 21: pp. 450-453.

  • 17. Eisenberg R.: Expert witness testimony: issues of ethics, equality, qualifications etc. for being an expert witness. Radiological Society of North America Scientific Assembly and Annual Meeting, Chicago, IL; 2014.

  • 18. Brent R.L.: The irresponsible expert witness: a failure of biomedical graduate education and professional accountability. Pediatrics 1982; 70: pp. 754-762.

  • 19. Wright M.S., Kluth S., Dobbins D.: Mock jurors' assessments of blind experts in criminal trials. Available at: http://www.lawschool.cornell.edu/SELS/upload/MockJurorsAssessmentsofBlindExpertsinCriminalTrials_paper.pdf. Accessed November 22, 2015.

  • 20. Shiraishi J., Katsuragawa S., Ikezoe J., et al.: Development of a digital image database for chest radiographs with and without a lung nodule: receiver operating characteristic analysis of radiologists' detection of pulmonary nodules. AJR Am J Roentgenol 2000; 174: pp. 71-74.

  • 21. Haygood T.M., Ryan J., Brennan P.C., et al.: On the choice of acceptance radius in free-response observer performance studies. Br J Radiol 2012; 86.

  • 22. University of Auckland: The Wilcoxon rank-sum test. Available at: http://www.stat.auckland.ac.nz/~wild/ChanceEnc/Ch10.wilcoxon.pdf. Accessed November 22, 2015.

  • 23. Muhm J.R., Miller W.E., Fontana R.S., et al.: Lung cancer detected during a screening program using four-month chest radiographs. Radiology 1983; 148: pp. 609-615.

  • 24. Parker T.W., Kelsey C.A., Moseley R.D., et al.: Directed versus free search for nodules in chest radiographs. Invest Radiol 1982; 17: pp. 152-155.

  • 25. Summerfield C., Koechlin E.: A neural representation of prior information during perceptual inference. Neuron 2008; 59: pp. 336-347.

  • 26. Durand D.J., Robertson C.T., Agarwal G., et al.: Expert witness blinding strategies to mitigate bias in radiology malpractice cases: a comprehensive review of the literature. J Am Coll Radiol 2014; 11: pp. 868-873.

  • 27. Brady A., Laoide R.Ó., McCarthy P., et al.: Discrepancy and error in radiology: concepts, causes and consequences. Ulster Med J 2012; 81: pp. 3-9.

  • 28. Berlin L.: Hindsight bias. AJR Am J Roentgenol 2000; 175: pp. 597-601.

  • 29. Caldwell C., Seamone E.R.: Excusable neglect in malpractice suits against radiologists: a proposed jury instruction to recognize the human condition. Ann Health Law 2007; 16 (Winter): pp. 43-77.

  • 30. Croskerry P.: Achieving quality in clinical decision-making: cognitive strategies and detection of bias. Acad Emerg Med 2002; 9: pp. 1184-1204.

  • 31. Gunderman R.B.: Biases in radiologic reasoning. AJR Am J Roentgenol 2009; 192: pp. 561-564.

  • 32. Rachlinski J.J.: A positive psychological theory of judging in hindsight. Univ Chic Law Rev 1998; 1 (Spring).

  • 33. Harley E.M., Carlsen K.A., Loftus G.R.: The “saw-it-all-along” effect: demonstrations of visual hindsight bias. J Exp Psychol Learn Mem Cogn 2004; 30: pp. 960-968.

  • 34. Fischhoff B.: Hindsight ≠ foresight: the effect of outcome knowledge on judgment under uncertainty. J Exp Psychol Hum Percept Perform 1975; 1: pp. 288-299.

  • 35. Harley E.M.: Hindsight bias in legal decision making. Soc Cogn 2007; 25 (Special Issue: The Hindsight Bias): pp. 48-63.

  • 36. Arkes H.R., Wortmann R.L., Saville P.D., et al.: Hindsight bias among physicians weighing the likelihood of diagnoses. J Appl Psychol 1981; 66: pp. 252-254.

  • 37. Erly W.K., Tran M., Dillon R.C., et al.: Impact of hindsight bias on interpretation of nonenhanced computed tomographic head scans for acute stroke. J Comput Assist Tomogr 2010; 34: pp. 229-232.

  • 38. Haygood T.M., Qing Liu M.A., Galvan E.M., et al.: Memory for previously viewed radiographs and the effect of prior knowledge of memory task. Acad Radiol 2013; 20: pp. 1598-1603.

  • 39. Soh B.P., Lee W., McEntee M.F., et al.: Screening mammography: test set data can reasonably describe actual clinical reporting. Radiology 2013; 268: pp. 46-53.

This post is licensed under CC BY 4.0 by the author.