Chest X-ray Interpretation by Radiographers Is Not Inferior to Radiologists

Highlights

  • Chest X-ray interpretation by reporting radiographers is noninferior to that of consultant radiologists.

  • Chest X-ray reporting by reporting radiographers can increase diagnostic capacity.

  • Maximizing the use of reporting radiographers could streamline patient pathways.

Rationale and Objectives

Chest X-rays (CXRs) are among the most frequently requested imaging examinations and are fundamental to many patient pathways. The aim of this study was to investigate the diagnostic accuracy of CXR interpretation by reporting radiographers (technologists).

Methods

Results

Conclusions

Introduction

Methods

Study Design

Setting

Case Selection

Participants

Reference Standard Diagnosis

Test Methods

TABLE 1

Reporting Guidance Adapted From Robinson (1999)

| Findings To Be Considered Normal | Findings To Be Considered Abnormal |
|---|---|

Analysis

Results

TABLE 2

Participant Experience and Volume of Chest X-rays Reported Annually

| Experience (y) | CRs: <5,000 | CRs: 5,001–9,999 | CRs: ≥10,000 | RRs: <5,000 | RRs: 5,001–9,999 | RRs: ≥10,000 |
|---|---|---|---|---|---|---|
| 0–5 | 2 | N/A | N/A | 1 | 4 | 1 |
| 6–9 | 4 | N/A | N/A | 1 | 2 | 1 |
| ≥10 | 2 | 2 | N/A | N/A | N/A | N/A |

CR, consultant radiologist; RR, reporting radiographer; volume bands are chest X-rays reported annually. N/A, not applicable (data from one reporting radiographer missing).

Figure 1, Sensitivity and specificity of consultant radiologists and reporting radiographers (with 95% confidence intervals).
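
The sensitivity and specificity in Figure 1 are proportions, so their 95% confidence intervals can be computed with standard binomial methods. The Python sketch below uses the Wilson score interval; the interval method, the function name `wilson_ci`, and the reader counts are illustrative assumptions, not details taken from the study.

```python
import math

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a proportion k/n (z = 1.96 gives ~95%)."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# Hypothetical reader counts, for illustration only (not study data):
tp, fn = 480, 70   # abnormal CXRs correctly reported / missed
tn, fp = 450, 55   # normal CXRs correctly reported / overcalled

sens, spec = tp / (tp + fn), tn / (tn + fp)
print(f"sensitivity {sens:.3f}, 95% CI {wilson_ci(tp, tp + fn)}")
print(f"specificity {spec:.3f}, 95% CI {wilson_ci(tn, tn + fp)}")
```

The Wilson interval behaves better than the simple normal approximation when a proportion sits near 0 or 1, which is common for specificity estimates.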

TABLE 3

Diagnostic Accuracy Figure of Merit Values of Consultant Radiologists and Reporting Radiographers

| Reporting Practitioner | Number of Cases | Unweighted FoM (95% CI) | Weighted FoM (95% CI) |
|---|---|---|---|
| Consultant radiologist | 1055 | 0.788 (0.766–0.811) | 0.786 (0.764–0.808) |
| Reporting radiographer | 1158 | 0.828 (0.808–0.847) | 0.830 (0.811–0.849) |

CI, confidence interval; FoM, figure of merit.

Figure 2, Unweighted JAFROC (jack-knife alternate free-response receiver operating characteristic) curves for consultant radiologists and reporting radiographers. CR, consultant radiologist; RR, reporting radiographer.
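
The figure of merit (FoM) underlying Figure 2 and Table 3 is, informally, the probability that a correctly localized lesion receives a higher confidence rating than the highest-rated false positive on a normal case (17). Below is a minimal sketch of the unweighted empirical FoM; the function name and toy ratings are assumptions for illustration, and the study itself used Chakraborty's JAFROC software (31) rather than hand-rolled code.

```python
from itertools import product

def jafroc_fom(normal_max_fp, lesion_ratings):
    """Unweighted empirical JAFROC figure of merit.

    normal_max_fp:  highest false-positive rating on each normal case
                    (-inf when a normal case carries no marks).
    lesion_ratings: rating of each lesion across all abnormal cases
                    (-inf for lesions that were never marked).
    """
    def psi(fp, ll):
        # Wilcoxon kernel: full credit for out-rating the noise, half for ties.
        return 1.0 if ll > fp else (0.5 if ll == fp else 0.0)

    pairs = list(product(normal_max_fp, lesion_ratings))
    return sum(psi(fp, ll) for fp, ll in pairs) / len(pairs)

NEG = float("-inf")
# Toy ratings on a 1-4 confidence scale (illustration only):
print(jafroc_fom([NEG, 2, 1, NEG], [4, 3, NEG, 2, 4]))  # 0.825
```

A FoM of 1.0 means every localized lesion out-rates every false positive on the normal cases; values near 0.5 indicate little separation between lesion and noise ratings.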

Figure 3, Weighted JAFROC (jack-knife alternate free-response receiver operating characteristic) curves for consultant radiologists and reporting radiographers. CR, consultant radiologist; RR, reporting radiographer.
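
Table 3 and Figure 3 also report a weighted FoM, in which each abnormal case contributes equally regardless of how many lesions it contains. A sketch under the common equal-within-case weighting follows; the study does not spell out its weighting scheme, so the weights, the function name, and the toy data here are assumptions.

```python
def weighted_jafroc_fom(normal_max_fp, abnormal_cases):
    """Weighted empirical JAFROC figure of merit.

    abnormal_cases: one list of lesion ratings per abnormal case
                    (-inf for unmarked lesions). Each case receives total
                    weight 1, split equally across its lesions.
    """
    def psi(fp, ll):
        return 1.0 if ll > fp else (0.5 if ll == fp else 0.0)

    total = 0.0
    for fp in normal_max_fp:
        for lesions in abnormal_cases:
            w = 1.0 / len(lesions)  # equal lesion weights within the case
            total += sum(w * psi(fp, ll) for ll in lesions)
    return total / (len(normal_max_fp) * len(abnormal_cases))

NEG = float("-inf")
# Same toy ratings as above, now grouped by abnormal case:
print(weighted_jafroc_fom([NEG, 2, 1, NEG], [[4, 3], [NEG, 2], [4]]))  # ~0.854
```

Weighting matters when lesion counts vary across cases: without it, a single multi-lesion case can dominate the FoM.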

TABLE 4

Diagnostic Accuracy of Practitioners According to Experience and Volume of Chest X-rays Interpreted

| Experience (y) | CRs: <5,000 | CRs: 5,001–9,999 | CRs: ≥10,000 | RRs: <5,000 | RRs: 5,001–9,999 | RRs: ≥10,000 |
|---|---|---|---|---|---|---|
| 0–5 | 0.809 | N/A | N/A | 0.839 | 0.839 | 0.803 |
| 6–9 | 0.760 | N/A | N/A | 0.824 | 0.844 | 0.822 |
| ≥10 | 0.787 | 0.813 | N/A | N/A | N/A | N/A |

CR, consultant radiologist; RR, reporting radiographer. N/A, not applicable.

Discussion

Strengths and Limitations

Comparison with Literature

TABLE 5

Summary of Studies That Have Used Alternate Free-response Receiver Operating Characteristic (AFROC) or Jack-knife Alternate Free-response Receiver Operating Characteristic (JAFROC) Methodology for Assessment of Chest X-ray Diagnostic Accuracy

| Study | Participants | Practitioner Characteristics | Chest X-rays | Normal:Abnormal Ratio | Simulated or Natural Nodules | Intervention/Comparison | Observer Performance: AUC (AFROC) and FoM (JAFROC) |
|---|---|---|---|---|---|---|---|
| Current study | 21 | 10 CRs; 11 RRs | 106 | 1:1 | Natural (range of pathologies) | Direct comparison of CRs and RRs | CR mean FoM = 0.786; RR mean FoM = 0.830 |
| Manning et al. | 21 | 8 CRs; 5 RRs before/after 6 mo training; 8 UG radiographers (naïve) | 120 | 1:2 | ?Natural | Eye tracking | AFROC (expert): RR after training AUC = 0.82; CR AUC = 0.80 |
| Donovan and Litchfield | 40 | Naïve (nonmedical); UG radiographers; experts (CR and RR) | 30 | 1:1 | 24 natural, 4 simulated | Eye tracking, comparison by observer experience | Naïve mean FoM = 0.41; first UG mean FoM = 0.60; third UG mean FoM = 0.71; experts mean FoM = 0.72 |

AUC, area under the curve; CR, consultant radiologist; FoM, figure of merit; RR, reporting radiographer; UG, undergraduate.
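
The "jack-knife" in JAFROC refers to leave-one-case-out resampling, which attaches uncertainty to the FoM estimates compared in Table 5. The sketch below shows a plain case-level jackknife standard error, reusing the hypothetical `weighted_jafroc_fom` from the earlier sketch; the published method goes further, feeding the jackknife pseudovalues into Dorfman-Berbaum-Metz style significance testing across readers (17, 24), which this sketch omits.

```python
import math

def jackknife_se(normal_max_fp, abnormal_cases, fom):
    """Leave-one-case-out jackknife standard error of a figure of merit.

    fom: callable (normal_max_fp, abnormal_cases) -> float, e.g. the
         weighted_jafroc_fom sketch shown earlier.
    """
    n = len(normal_max_fp) + len(abnormal_cases)
    theta = fom(normal_max_fp, abnormal_cases)
    pseudo = []
    for i in range(len(normal_max_fp)):           # drop one normal case
        t = fom(normal_max_fp[:i] + normal_max_fp[i + 1:], abnormal_cases)
        pseudo.append(n * theta - (n - 1) * t)
    for j in range(len(abnormal_cases)):          # drop one abnormal case
        t = fom(normal_max_fp, abnormal_cases[:j] + abnormal_cases[j + 1:])
        pseudo.append(n * theta - (n - 1) * t)
    mean = sum(pseudo) / n
    return math.sqrt(sum((p - mean) ** 2 for p in pseudo) / (n * (n - 1)))

# Usage with the toy data from the weighted-FoM sketch:
# jackknife_se([NEG, 2, 1, NEG], [[4, 3], [NEG, 2], [4]], weighted_jafroc_fom)
```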

Conclusions

Supplementary Data

Appendix S1

Appendix S2

References

  • 1. Royal College of Radiologists: Clinical radiology UK workforce report 2011. 2012. Royal College of Radiologists, London.

  • 2. Royal College of Radiologists: Patients waiting too long for test results. 2014. Royal College of Radiologists, London.

  • 3. Royal College of Radiologists: Clinical radiology UK workforce census 2015 report. 2016. Royal College of Radiologists, London.

  • 4. Canadian Institute for Health Information: Health care in Canada: a focus on wait times. 2012. Canadian Institute for Health Information, Ottawa, Canada.

  • 5. Queensland Department of Health: Inquiry into Gold Coast X-ray reporting. 2014. Queensland, Australia.

  • 6. Woznitza N., Piper K., Rowe S., et al.: Optimizing patient care in radiology through team-working: a case study from the United Kingdom. Radiography 2014; 20: pp. 258-263.

  • 7. Snaith B., Milner R.C., Harris M.A.: Beyond image interpretation: capturing the impact of radiographer advanced practice through activity diaries. Radiography 2016; 22: pp. e233-e238.

  • 8. Beardmore C., Woznitza N., Goodman S.: The radiography workforce: current challenges and changing needs. 2016. College of Radiographers, London.

  • 9. Snaith B., Hardy M., Lewis E.F.: Radiographer reporting in the UK: a longitudinal analysis. Radiography 2015; 21: pp. 119-123.

  • 10. Buskov L., Abild A., Christensen A., et al.: Radiographers and trainee radiologists reporting accident radiographs: a comparative plain film-reading performance study. Clin Radiol 2013; 68: pp. 55-58.

  • 11. Sheft D.J., Jones M.D., Brown R.F., et al.: Screening of chest roentgenograms by advanced roentgen technologists. Radiology 1970; 94: pp. 427-429.

  • 12. Flehinger B.J., Melamed M.R., Heelan R.T., et al.: Accuracy of chest film screening by technologists in the New York early lung cancer detection program. AJR Am J Roentgenol 1978; 131: pp. 593-597.

  • 13. Piper K., Cox S., Paterson A., et al.: Chest reporting by radiographers: findings of an accredited postgraduate programme. Radiography 2014; 20: pp. 94-99.

  • 14. Woznitza N., Piper K., Burke S., et al.: Adult chest radiograph reporting by radiographers: preliminary data from an in-house audit programme. Radiography 2014; 20: pp. 223-229.

  • 15. Ben Shimol S., Dagan R., Givon-Lavi N., et al.: Evaluation of the World Health Organization criteria for chest radiographs for pneumonia diagnosis in children. Eur J Pediatr 2012; 171: pp. 369-374.

  • 16. Ostensen H.: Diagnostic imaging: what is it? When and how to use it where resources are limited? 2001. World Health Organization, Geneva.

  • 17. Chakraborty D.P.: Recent advances in observer performance methodology: jackknife free-response ROC (JAFROC). Radiat Prot Dosimetry 2005; 114: pp. 26-31.

  • 18. College of Radiographers: Preliminary clinical evaluation and clinical reporting by radiographers: policy and practice guidance. 2013. College of Radiographers, London.

  • 19. Royal College of Radiologists, Society and College of Radiographers: Team working in clinical imaging. 2012. Royal College of Radiologists and the Society and College of Radiographers, London.

  • 20. Woznitza N., Piper K., Burke S., et al.: Agreement between expert thoracic radiologists and the chest radiograph reports provided by consultant radiologists and reporting radiographers in clinical practice: review of a single clinical site. Radiography 2018; in press.

  • 21. Obuchowski N.A.: Sample size tables for receiver operating characteristic studies. AJR Am J Roentgenol 2000; 175: pp. 603-608.

  • 22. The Health and Social Care Information Centre: Hospital episode statistics for England: inpatient statistics, 2011-12. 2012.

  • 23. Chakraborty D.: The FROC, AFROC and DROC variants of the ROC analysis. In: Van Metter R., Beutel J., Kundel H. (eds): Handbook of medical imaging. 2000. SPIE Press, Bellingham, WA: pp. 771-796.

  • 24. Chakraborty D.: Recent developments in FROC methodology. In: Samei E., Krupinski E.A. (eds): The handbook of medical image perception and techniques. 2010. Cambridge University Press, New York: pp. 216-239.

  • 25. Sadler G.R., Lee H.C., Lim R.S., et al.: Recruitment of hard-to-reach population subgroups via adaptations of the snowball sampling strategy. Nurs Health Sci 2010; 12: pp. 369-374.

  • 26. de Lacey G., Morley S., Berman L.: Chest X-ray: a survival guide. 2008. Elsevier, Spain.

  • 27. Haygood T.M., Ryan J., Brennan P.C., et al.: On the choice of acceptance radius in free-response observer performance studies. Br J Radiol 2013; 86: 42313554.

  • 28. Littlefair S., Mello-Thoms C., Reed W., et al.: Increasing prevalence expectation in thoracic radiology leads to overcall. Acad Radiol 2016; 23: pp. 284-289.

  • 29. Chakraborty D.P.: How to conduct a free-response study. 2005. Available at: http://www.devchakraborty.com/HowToConduct.html. Accessed June 26, 2012.

  • 30. Chakraborty D.P.: A status report on free-response analysis. Radiat Prot Dosimetry 2010; 139: pp. 20-25.

  • 31. Chakraborty D.P.: JAFROC [computer program], version 4.2. 2014. Pennsylvania, USA.

  • 32. Robinson P., Wilson D., Coral A., et al.: Variation between experienced observers in the interpretation of accident and emergency radiographs. Br J Radiol 1999; 72: pp. 323-330.

  • 33. Tourassi G.: Receiver operating characteristic analysis: basic concepts and practical applications. In: Samei E., Krupinski E.A. (eds): The handbook of medical image perception and techniques. 2010. Cambridge University Press, New York: pp. 187-203.

  • 34. Baldwin D.: Pulmonary nodules again? The 2015 British Thoracic Society guidelines on the investigation and management of pulmonary nodules. Clin Radiol 2016; 71: pp. 18-22.

  • 35. Donovan T., Litchfield D.: Looking for cancer: expertise related differences in searching and decision making. Appl Cogn Psychol 2013; 27: pp. 43-49.

  • 36. Litchfield D., Ball L.J., Donovan T., et al.: Viewing another person's eye movements improves identification of pulmonary nodules in chest x-ray inspection. J Exp Psychol Appl 2010; 16: pp. 251-262.

  • 37. Manning D., Barker-Mill S.C., Donovan T., et al.: Time-dependent observer errors in pulmonary nodule detection. Br J Radiol 2006; 79: pp. 342-346.

  • 38. Manning D., Ethell S., Donovan T., et al.: How do radiologists do it? The influence of experience and training on searching for chest nodules. Radiography 2006; 12: pp. 134-142.

  • 39. Sonnex E.P., Tasker A.D., Coulden R.A.: The role of preliminary interpretation of chest radiographs by radiographers in the management of acute medical problems within a cardiothoracic centre. Br J Radiol 2001; 74: pp. 230-233.

  • 40. Buissink C., Thompson J.D., Voet M., et al.: The influence of experience and training in a group of novice observers: a jackknife alternative free-response receiver operating characteristic analysis. Radiography 2014; 20: pp. 300-305.

  • 41. Ekpo E.U., Egbe N.O., Akpan B.E.: Radiographers' performance in chest X-ray interpretation: the Nigerian experience. Br J Radiol 2015; 88: 20150023.

  • 42. Irwig L., Bossuyt P., Glasziou P., et al.: Designing studies to ensure that estimates of test accuracy will travel. In: Knottnerus J.A., Buntinx F. (eds): The evidence base of clinical diagnosis: theory and methods of diagnostic research. 2009. Blackwell Publishing, Singapore: pp. 96-117.

  • 43. Hardy M., Flintham K., Snaith B., et al.: The impact of image test bank construction on radiographic interpretation outcomes: a comparison study. Radiography 2016; 22: pp. 166-170.

  • 44. Brealey S., Scally A.J.: Methodological approaches to evaluating the practice of radiographers' interpretation of images: a review. Radiography 2008; 14: pp. e46-e54.

  • 45. Kashani H., Varon C.A., Paul N.S., et al.: Diagnostic performance of a prototype dual-energy chest imaging system: ROC analysis. Acad Radiol 2010; 17: pp. 298-308.

  • 46. Schalekamp S., Karssemeijer N., Cats A.M., et al.: The effect of supplementary bone-suppressed chest radiographs on the assessment of a variety of common pulmonary abnormalities: results of an observer study. J Thorac Imaging 2016; 31: pp. 119-125.

  • 47. Yamada Y., Jinzaki M., Hasegawa I., et al.: Fast scanning tomosynthesis for the detection of pulmonary nodules: diagnostic performance compared with chest radiography, using multidetector-row computed tomography as the reference. Invest Radiol 2011; 46: pp. 471-477.

  • 48. Zachrisson S., Vikgren J., Svalkvist A., et al.: Effect of clinical experience of chest tomosynthesis on detection of pulmonary nodules. Acta Radiol 2009; 50: pp. 884-891.

  • 49. de Hoop B., De Boo D.W., Gietema H.A., et al.: Computer-aided detection of lung cancer on chest radiographs: effect on observer performance. Radiology 2010; 257: pp. 532-540.

  • 50. Kasai S., Li F., Shiraishi J., et al.: Usefulness of computer-aided diagnosis schemes for vertebral fractures and lung nodules on chest radiographs. AJR Am J Roentgenol 2008; 191: pp. 260-265.

  • 51. Kohli A., Robinson J.W., Ryan J., et al.: Reader characteristics linked to detection of pulmonary nodules on radiographs: ROC vs. JAFROC analyses of performance. Proc SPIE 2011; 7966: 79660K.

  • 52. Schalekamp S., van Ginneken B., Heggelman B., et al.: New methods for using computer-aided detection information for the detection of lung nodules on chest radiographs. Br J Radiol 2014; 87: 20140015.

  • 53. Yano Y., Yabuuchi H., Tanaka N., et al.: Detectability of simulated pulmonary nodules on chest radiographs: comparison between irradiation side sampling indirect flat-panel detector and computed radiography. Eur J Radiol 2013; 82: pp. 2050-2054.

  • 54. Brennan P.C., Ryan J., Evanoff M., et al.: The impact of acoustic noise found within clinical departments on radiology performance. Acad Radiol 2008; 15: pp. 472-476.

This post is licensed under CC BY 4.0 by the author.