
Radiologists' Attitudes and Use of Mammography Audit Reports

Rationale and Objectives

The US Mammography Quality Standards Act mandates medical audits to track breast cancer outcomes data associated with interpretive performance. The objectives of our study were to assess the content and style of audit reports and to examine radiologists' use of, attitudes toward, and perceptions of the value of these mandated medical audits.

Materials and Methods

Radiologists (n = 364) at mammography registries in seven US states contributing data to the Breast Cancer Surveillance Consortium (BCSC) were invited to participate. We examined radiologists' demographic characteristics, clinical experience, and their use, attitudes toward, and perceived value of audit reports, using results of a self-administered survey. Information on the content and style of the BCSC audits provided to radiologists and facilities was obtained from site investigators. Radiologists' characteristics were analyzed according to whether they self-reported receiving regular mammography audit reports. Latent class analysis was used to classify radiologists' individual perceptions of audit reports into overall probabilities of having "favorable," "less favorable," "neutral," or "unfavorable" attitudes toward audit reports.

Results

Seventy-one percent (257 of 364) of radiologists completed the survey; two radiologists did not complete the audit survey question, leaving 255 in the final study cohort. Most survey respondents received regular audits (91%), paid close attention to their audit numbers (83%), found the reports valuable (87%), and felt that audit reports prompted them to improve interpretive performance (75%). Variability was noted in the style, target audience, and frequency of the reports provided by the BCSC registries. One in four radiologists reported that if Congress mandates more intensive auditing requirements but does not provide funding to support this regulation, they may stop interpreting mammograms.

Conclusion

Radiologists working in breast imaging generally had favorable opinions of audit reports, which were mandated by Congress; however, almost 1 in 10 radiologists reported that they did not receive audits.

In 1992, the US Congress enacted the Mammography Quality Standards Act (MQSA), which established the first national quality standards for mammography facilities in the United States. MQSA was initiated in response to concerns from the public and medical community about the extensive variability of mammography among facilities. The goal of MQSA is to provide all women living in the United States with equal access to quality mammography, regardless of their geographic location. Under MQSA, mammography facilities are required to have a US Food and Drug Administration–approved accreditation body review their radiological equipment, personnel qualifications, and quality assurance processes every 3 years to ensure that baseline quality standards are practiced.

Mammography outcome audits are one of the quality assurance regulations for mammography facilities that fall under MQSA. The basic elements of MQSA's medical audit include: 1) a method to collect follow-up data for positive mammograms (defined as mammograms with final Breast Imaging Reporting and Data System (BI-RADS) assessment categories of "suspicious" or "highly suggestive of malignancy"); 2) a system to collect pathology results (benign vs. malignant) for all biopsies performed among mammograms interpreted as "suspicious" or "highly suggestive of malignancy"; 3) methods to correlate pathology and mammography results; and 4) review of known false negatives (examinations assessed as "negative," "benign," or "probably benign" that became known to the facility as positive for cancer within 12 months of mammography examination). In addition, at least once every 12 months, facilities are required to designate an interpreting physician to review the medical outcomes data and notify other interpreting physicians that their individual results are available for review. Food and Drug Administration regulations do not specifically require that individual radiologists review their outcomes. Approaches to implement the elements described previously are left to the facility's discretion.
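Although facilities may implement these elements however they choose, the required audit reduces to simple bookkeeping once each examination's final BI-RADS assessment is linked to its biopsy and cancer follow-up. The Python sketch below illustrates the conventional audit arithmetic (recall rate, PPV1 through PPV3, cancer detection rate, and the false-negative count); it is a minimal illustration under assumed record fields, not the software used by the BCSC or any facility.

```python
# Minimal sketch of a facility-level medical audit. The record fields
# ("birads", "biopsy_done", "cancer") are assumptions for illustration.
# Definitions follow the conventional ACR audit measures: PPV1 over all
# abnormal interpretations (BI-RADS 0, 4, 5), PPV2 over exams with a
# biopsy recommendation (4, 5), PPV3 over biopsies actually performed.

def audit_summary(exams):
    """`exams`: list of dicts with keys birads (0-5), biopsy_done (bool),
    and cancer (bool; cancer confirmed within 12 months of the exam)."""
    def rate(num, den):
        return num / den if den else float("nan")

    n = len(exams)
    positive = [e for e in exams if e["birads"] in (0, 4, 5)]
    biopsy_rec = [e for e in exams if e["birads"] in (4, 5)]
    biopsied = [e for e in exams if e["biopsy_done"]]
    negative = [e for e in exams if e["birads"] in (1, 2, 3)]

    true_pos = sum(e["cancer"] for e in positive)
    false_neg = sum(e["cancer"] for e in negative)  # exams flagged for MQSA review

    return {
        "recall_rate": rate(len(positive), n),
        "ppv1": rate(true_pos, len(positive)),
        "ppv2": rate(sum(e["cancer"] for e in biopsy_rec), len(biopsy_rec)),
        "ppv3": rate(sum(e["cancer"] for e in biopsied), len(biopsied)),
        "cancer_detection_rate_per_1000": 1000 * rate(true_pos, n),
        "false_negative_count": false_neg,
    }
```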


Materials and Methods


Radiologist Survey


BCSC Audit Performance Reports


Table 1

Summary of Breast Cancer Surveillance Consortium Performance Reports, by Site ∗

| | Site 1 | Site 2 | Site 3 | Site 4 | Site 5 | Site 6 | Site 7 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| **Target audience for distribution** | | | | | | | |
| Individual radiologists | Yes | No | No | Yes | Yes | Yes | Yes |
| Mammography facility | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
| **Format** | | | | | | | |
| Number of pages | 1 | 3 † | 12 | 6 † | 3 | 6 | 5 |
| Report uses figures | No | No | No | Yes | No | Yes | Yes |
| **Frequency of reports** | Annual | Annual | Annual | Quarterly and annual | Annual ‡ | Annual | Annual |
| **BI-RADS data reported** | | | | | | | |
| Percent or number of screening mammograms with a recommendation for further imaging (BI-RADS category 0) | Reported at all sites | | | | | | |
| Percent or number of screening mammograms with a recommendation for biopsy/surgical consult (BI-RADS categories 4/5) | Reported at all sites | | | | | | |
| **Other performance data reported § ‖** | | | | | | | |
| False-negative exams | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
| True-positive exams | Yes | Yes | Yes | Yes | Yes | Yes | ∗∗∗ |
| False-positive exams | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
| PPV1 | Yes | ∗∗∗ | Yes | ∗∗∗ | Yes | Yes | Yes |
| PPV2 | Yes | No | Yes | ∗∗∗ | Yes | Yes | Yes |
| PPV3 | No | No | Yes | ∗∗∗ | ∗∗∗ | No | Yes |
| Abnormal interpretation (recall) rate | ∗∗∗ | ∗∗∗ | Yes | Yes | Yes | Yes | Yes |
| **Data on breast biopsy** | | | | | | | |
| Biopsy results for individual women | No | Yes | No | Yes | Yes | No | Yes |
| **Cancer outcome data** | | | | | | | |
| Cancer detection rate per 1000 screening exams | ∗∗∗ | ∗∗∗ | Yes | ∗∗∗ | Yes | ∗∗∗ | Yes |
| Cancer staging | No | Yes | Yes | No | No | No | No |
| Percent of cancers found that are minimal disease (invasive <10 mm or ductal carcinoma in situ) | No | ∗∗∗ | Yes | No | Yes | Yes | No |
| Percent of cancers found that are node negative | No | ∗∗∗ | No | No | Yes | No | Yes |

PPV, Positive predictive value.

‡ Facilities at this BCSC site can receive monthly reports on request; these reports list only positive mammograms, plus biopsy results for both positive and negative mammograms.


Statistical Analysis


Results


Figure 1, Radiologists' attitudes toward mammography audit reports among radiologists who self-reported receiving audit reports (n = 233).


Table 2

Among Radiologists who Receive Audit Reports (N = 233), the Percent who Agree or Strongly Agree with Survey Questions Regarding Attitudes about Mammography Audit Reports, by Radiologist Characteristic ∗ †

Entries are the percentage of radiologists in each category who agree or strongly agree with each survey question; the five rightmost columns concern attitudes toward audit reports.

| Radiologist Characteristic | N = 233 (100%) | May Leave Mammography if Intensive Requirements Mandated without Funding (n = 61) | Trust Accuracy (n = 187) | Pay Attention (n = 193) | Reports are Valuable (n = 200) | Prompts them to Review Cancers (n = 184) | Improves their Performance (n = 172) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| **Demographics** | | | | | | | |
| Age group: 30–34 | 4 (1.7) | 75.0 | 100.0 | 75.0 | 75.0 | 50.0 | 75.0 |
| Age group: 35–44 | 58 (24.9) | 39.7 | 81.0 | 91.4 | 89.7 | 75.9 | 80.7 |
| Age group: 45–54 | 78 (33.5) | 19.5 | 75.6 | 81.8 | 83.1 | 80.5 | 75.6 |
| Age group: 55+ | 93 (39.9) | 26.4 | 82.6 | 79.6 | 88.2 | 82.8 | 29.4 |
| Gender: Male | 168 (72.1) | 31.3 | 81.4 | 85.6 | 88.0 | 79.0 | 74.3 |
| Gender: Female | 65 (27.9) | 20.3 | 76.9 | 76.9 | 83.1 | 81.5 | 76.6 |
| **Clinical experience** | | | | | | | |
| Years interpreting mammograms: <10 | 47 (20.3) | 42.6 | 85.1 | 87.2 | 80.9 | 72.3 | 70.2 |
| Years interpreting mammograms: 10–19 | 84 (36.2) | 22.6 | 78.6 | 86.9 | 89.3 | 79.8 | 81.9 |
| Years interpreting mammograms: 20+ | 101 (43.5) | 26.5 | 79.0 | 78.0 | 88.0 | 83.0 | 72.0 |
| Fellowship in breast imaging: No | 214 (91.6) | 27.1 | 81.2 | 93.6 | 87.3 | 80.3 | 75.0 |
| Fellowship in breast imaging: Yes | 19 (8.2) | 10.5 | 68.4 | 79.0 | 79.0 | 73.7 | 73.7 |
| Academic affiliation: No affiliation | 189 (82.2) | 30.7 | 80.3 | 81.9 | 85.1 | 79.8 | 72.9 |
| Academic affiliation: Adjunct or primary | 41 (17.8) | 19.5 | 80.5 | 90.2 | 95.1 | 80.5 | 85.0 |
| Hours spent in breast imaging: ≤20 | 134 (60.1) | 31.8 | 80.6 | 84.3 | 87.2 | 78.2 | 75.9 |
| Hours spent in breast imaging: 21–40 | 49 (22.0) | 14.3 | 77.1 | 81.3 | 83.7 | 81.6 | 72.9 |
| Hours spent in breast imaging: 40+ | 10 (17.9) | 33.3 | 82.5 | 82.5 | 87.5 | 80.0 | 75.0 |
| **Current clinical practice** | | | | | | | |
| % of workload that is screening mammography: 1–10 | 81 (38.4) | 31.7 | 80.3 | 84.0 | 85.0 | 80.0 | 76.3 |
| % of workload that is screening mammography: 11–24 | 91 (43.1) | 27.5 | 85.7 | 86.8 | 89.0 | 79.1 | 75.8 |
| % of workload that is screening mammography: ≥25 | 39 (18.5) | 18.4 | 71.1 | 76.3 | 84.6 | 82.1 | 68.4 |
| Self-reported screening volume, preceding year: <480 | 8 (3.6) | 57.1 | 75.0 | 75.0 | 75.0 | 50.0 | 62.5 |
| Self-reported screening volume, preceding year: 480–999 | 31 (13.9) | 36.7 | 80.7 | 80.7 | 93.3 | 80.0 | 74.2 |
| Self-reported screening volume, preceding year: 1000–1999 | 68 (30.5) | 26.9 | 86.6 | 85.1 | 85.3 | 75.0 | 74.6 |
| Self-reported screening volume, preceding year: 2000–4999 | 87 (39.0) | 26.4 | 80.5 | 83.9 | 88.5 | 81.6 | 75.6 |
| Self-reported screening volume, preceding year: ≥5000 | 29 (13.0) | 24.1 | 65.5 | 79.3 | 82.8 | 86.2 | 75.9 |
| Self-reported total volume (screening and diagnostic), preceding year: <480 | 3 (1.4) | 66.7 | 66.7 | 66.7 | 66.7 | 66.7 | 33.3 |
| Self-reported total volume, preceding year: 480–999 | 18 (8.2) | 58.8 | 83.3 | 77.8 | 83.3 | 72.2 | 66.7 |
| Self-reported total volume, preceding year: 1000–1999 | 68 (31.1) | 24.2 | 86.6 | 86.6 | 89.6 | 79.1 | 77.6 |
| Self-reported total volume, preceding year: 2000–4999 | 88 (40.2) | 27.3 | 80.7 | 84.1 | 87.5 | 81.8 | 73.6 |
| Self-reported total volume, preceding year: ≥5000 | 42 (19.2) | 21.4 | 73.8 | 81.0 | 85.7 | 81.0 | 78.6 |

Bold number denotes statistically significant difference compared to radiologists who did not agree to individual survey questions at P < .05 using the chi-squared test.
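The significance testing described in the footnote above can be illustrated with a standard chi-squared test of independence applied to the agree versus not-agree counts across levels of a single characteristic. The sketch below uses scipy with invented counts, since the underlying cell counts are not reproduced in this table.

```python
# Sketch of the chi-squared test named in the table footnote: does the
# proportion agreeing with a survey item differ across levels of one
# radiologist characteristic? Counts below are invented for illustration.
from scipy.stats import chi2_contingency

contingency = [
    [120, 48],  # level 1: [agree or strongly agree, neutral or disagree]
    [40, 25],   # level 2: hypothetical counts
]
chi2, p, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")  # significant if p < .05
```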


Figure 2, Summary of latent class analyses for four groups of radiologists based on their overall attitudes toward mammography audits (n = 233 radiologists who reported receiving audits). (a) Radiologists with favorable perceptions of mammography audit reports (75% probability for radiologists in this cohort). (b) Radiologists with less favorable perceptions of mammography audit reports (13% probability for radiologists in this cohort). (c) Radiologists with neutral perceptions of mammography audit reports (8% probability for radiologists in this cohort). (d) Radiologists with unfavorable perceptions of mammography audit reports (3% probability for radiologists in this cohort).


Discussion


Appendix 1


Item Response Probabilities within Each Latent Class and Latent Class Prevalences for 233 Radiologists who were Surveyed about their Perceptions and Use of MQSA-Mandated Medical Outcomes Audits

Within each latent class, entries are the probability of responding agree, neutral, or disagree to each survey item. ∗

| Item | Favorable: Agree | Favorable: Neutral | Favorable: Disagree | Less Favorable: Agree | Less Favorable: Neutral | Less Favorable: Disagree | Neutral: Agree | Neutral: Neutral | Neutral: Disagree | Unfavorable: Agree | Unfavorable: Neutral | Unfavorable: Disagree |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Trust accuracy of audits | 0.87 | 0.10 | 0.03 | 0.88 | 0.10 | 0.03 | 0.41 | 0.43 | 0.17 | 0.00 | 0.00 | 1.00 |
| Pay attention to audits | 0.95 | 0.05 | 0.00 | 0.70 | 0.30 | 0.00 | 0.00 | 0.71 | 0.29 | 0.51 | 0.00 | 0.49 |
| Reports are valuable | 0.99 | 0.01 | 0.00 | 0.69 | 0.31 | 0.00 | 0.00 | 0.83 | 0.17 | 0.76 | 0.00 | 0.24 |
| Review cancers because of audits | 0.91 | 0.07 | 0.02 | 0.40 | 0.34 | 0.26 | 0.73 | 0.12 | 0.16 | 0.16 | 0.12 | 0.72 |
| Audits prompt to improve performance | 0.94 | 0.06 | 0.00 | 0.00 | 0.77 | 0.23 | 0.16 | 0.63 | 0.21 | 0.76 | 0.00 | 0.24 |
| If Congress mandates additional requirements I may stop interpreting mammograms | 0.25 | 0.29 | 0.46 | 0.51 | 0.10 | 0.39 | 0.17 | 0.38 | 0.45 | 0.37 | 0.36 | 0.27 |

Class prevalences: favorable, 0.75; less favorable, 0.13; neutral, 0.08; unfavorable, 0.03.
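Given the item response probabilities and class prevalences above, an individual radiologist's posterior probability of belonging to each class follows from Bayes' rule, under the standard latent class assumption that responses are conditionally independent given class membership. The sketch below is a minimal illustration using two of the six items for brevity, with values transcribed from the table.

```python
# Posterior class membership under the latent class model:
# P(class | responses) is proportional to
# prevalence(class) * product over items of P(response | class).
# Only two of the six items are used here for brevity.
import numpy as np

classes = ["favorable", "less favorable", "neutral", "unfavorable"]
prevalence = np.array([0.75, 0.13, 0.08, 0.03])

# Rows = classes; columns = P(agree), P(neutral), P(disagree).
trust_accuracy = np.array([[0.87, 0.10, 0.03],
                           [0.88, 0.10, 0.03],
                           [0.41, 0.43, 0.17],
                           [0.00, 0.00, 1.00]])
pay_attention = np.array([[0.95, 0.05, 0.00],
                          [0.70, 0.30, 0.00],
                          [0.00, 0.71, 0.29],
                          [0.51, 0.00, 0.49]])

AGREE = 0  # column index for an "agree" response

# A radiologist who agrees with both items:
posterior = prevalence * trust_accuracy[:, AGREE] * pay_attention[:, AGREE]
posterior /= posterior.sum()
for name, p in zip(classes, posterior):
    print(f"{name}: {p:.3f}")
```

For a radiologist who agrees with both items, the favorable class dominates, with a posterior probability of roughly 0.89.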


References

  • 1. US Food & Drug Administration. Mammography Quality Standards Act regulations. Available online at: http://www.fda.gov/cdrh/mammography/frmamcom2.html . Accessed March 12, 2009.

  • 2. Birdwell R.L., Wilcox P.A.: The mammography quality standards act: benefits and burdens. Breast Dis 2001; 13: pp. 97-107.

  • 3. US Food & Drug Administration. Written statement for the record. Available online at: http://www.fda.gov/ola/2002/mqsa0228.html . Accessed March 12, 2009.

  • 4. Monsees B.S.: The Mammography Quality Standards Act. An overview of the regulations and guidance. Radiol Clin North Am 2000; 38: pp. 759-772.

  • 5. US Food & Drug Administration. Medical outcomes audit general requirement. Available online at: http://www.fda.gov/CDRH/mammography/robohelp/med_outcomes_audit_gen_req.htm . Accessed May 4, 2009.

  • 6. Aiello Bowles E.J., Geller B.M.: Best ways to provide feedback to radiologists on mammography performance. AJR Am J Roentgenol 2009; 193: pp. 157-164.

  • 7. Breast Cancer Surveillance Consortium (NCI). Available online at: http://breastscreening.cancer.gov/ . Accessed March 12, 2010.

  • 8. Breast Cancer Surveillance Consortium NCI. Available online at: http://breastscreening.cancer.gov/publications/ . Accessed October 2009.

  • 9. Ballard-Barbash R., Taplin S.H., Yankaskas B.C., et. al.: Breast Cancer Surveillance Consortium: a national mammography screening and outcomes database. AJR Am J Roentgenol 1997; 169: pp. 1001-1008.

  • 10. Carney P.A., Miglioretti D.L., Yankaskas B.C., et. al.: Individual and combined effects of age, breast density, and hormone replacement therapy use on the accuracy of screening mammography. Ann Intern Med 2003; 138: pp. 168-175.

  • 11. Elmore J.G., Jackson S.L., Abraham L., et. al.: Variability in interpretive performance of screening mammography and radiologists’ characteristics associated with accuracy. Radiology 2009; 253: pp. 641-651.

  • 12. Factors Affecting Variability of Radiologists (FAVOR) Research Group. National Survey of Mammography Practices. Available online at: http://breastscreening.cancer.gov/collaborations/favor_ii_mammography_practice_survey.pdf . Accessed September 2009.

  • 13. Bandeen-Roche K., Zeger S.L., Rathouz P.J.: Latent variable regression for multiple discrete outcomes. J Am Stat Assoc 1997; 92: pp. 1375-1386.

  • 14. Lanza S., Collins L.M., Lemmon D., et. al.: PROC LCA: a SAS procedure for latent class analysis. Struct Eq Model 2007; 14: pp. 671-694.

  • 15. Lanza S.T., Lemmon D., Schafer J.L., Collins L.M.: PROC LCA & PROC LTA user's guide, version 1.1.5 beta. University Park, PA: The Methodology Center, Pennsylvania State University; 2008.

  • 16. Whiteman T.: Mammography malpractice litigation and the impact of MQSA. Admin Radiol 1995; 14: pp. 29-31.

  • 17. Dick J.: Predictors of radiologists’ perceived risk of malpractice lawsuits in breast imaging. AJR Am J Roentgenol 2009; 192: pp. 327-333.

  • 18. Carney P.A., Geller B.M., Moffett H., et. al.: Current medicolegal and confidentiality issues in large, multicenter research programs. Am J Epidemiol 2000; 152: pp. 371-378.

  • 19. Miglioretti D.L., Smith-Bindman R., Abraham L., et. al.: Radiologist characteristics associated with interpretive performance of diagnostic mammography. J Natl Cancer Inst 2007; 99: pp. 1854-1863.

  • 20. US Food and Drug Administration (FDA). The Mammography Quality Standards Act of 1992, Pub. L. No. 102–539; 1992.

  • 21. Jiang Y., Miglioretti D.L., Metz C.E., et. al.: Breast cancer detection rate: designing imaging trials to demonstrate improvements. Radiology 2007; 243: pp. 360-367.

  • 22. Clark R.A., King P.S., Worden J.K.: Mammography registry: considerations and options. Radiology 1989; 171: pp. 91-93.

  • 23. Ballard-Barbash R., Taplin S.H., Yankaskas B.C., et. al.: Breast Cancer Surveillance Consortium: a national mammography screening and outcomes database. AJR Am J Roentgenol 1997; 169: pp. 1001-1008.

  • 24. Sickles E.A.: Auditing your breast imaging practice: an evidence-based approach. Semin Roentgenol 2007; 42: pp. 211-217.

  • 25. D'Orsi C.J., Bassett L.W., Berg W.A., et. al.: Breast Imaging Reporting and Data System: ACR BI-RADS-Mammography. 4th ed. Reston, VA: American College of Radiology; 2003.

  • 26. American College of Radiology. National Mammography Database (NMD). Available online at: https://nrdr.acr.org/portal/NMD/Main/page.aspx . Accessed September 2009.

  • 27. Linver M.N., Osuch J.R., Brenner R.J., et. al.: The mammography audit: a primer for the mammography quality standards act (MQSA). AJR Am J Roentgenol 1995; 165: pp. 19-25.

  • 28. Silvey A.B., Warrick L.H.: Linking quality assurance to performance improvement to produce a high reliability organization. Int J Radiat Oncol Biol Phys 2008; 71: pp. S195-S199.

  • 29. Benson H.R.: Benchmarking in healthcare: evaluating data and transforming it into action. Radiol Manage 1996; 18: pp. 40-46.

  • 30. Sickles E.A.: Quality assurance: how to audit your own mammography practice. Radiol Clin N Am 1992; 30: pp. 265-275.

  • 31. Adcock K.A.: Initiative to Improve Mammogram Interpretation. Permanente J 2004; 8: pp. 12-18.

  • 32. Perry N.M.: Interpretive skills in the National Health Service Breast Screening Programme: performance indicators and remedial measures. Semin Breast Dis 2003; 6: pp. 108-113.

  • 33. van der Horst F., Hendriks J., Rijken H., et. al.: Breast cancer screening in the Netherlands: audit and training of radiologists. Semin Breast Dis 2003; 6: pp. 114-122.

  • 34. Institute of Medicine: Improving breast imaging quality standards. Washington, DC: The National Academies Press; 2005.

  • 35. Asch D.A., Jedrziewski M.K., Christakis N.A.: Response rates to mail surveys published in medical journals. J Clin Epidemiol 1997; 50: pp. 1129-1136.

  • 36. Miglioretti D.L., Gard C.C., Carney P.A., et. al.: When radiologists perform best: the learning curve in screening mammography interpretation. Radiology 2009; 253: pp. 632-640.

  • 37. American College of Radiology: Breast imaging reporting and data system (BI-RADS). Reston, VA: American College of Radiology; 2003.
