A Simulation Screening Mammography Module Created for Instruction and Assessment

Rationale and Objectives

To improve mammographic screening training and breast cancer detection, radiology residents participated in a simulation screening mammography module in which they interpreted an enriched set of screening mammograms with known outcomes. This pilot research study evaluates the effectiveness of the simulation module, tracking the progress, efficiency, and accuracy of radiology resident interpretations and comparing their performance against national benchmarks.

Materials and Methods

A simulation module was created with 266 digital screening mammograms enriched with high-risk breast lesions (seven cases) and breast malignancies (65 cases). Over a period of 27 months, 39 radiology residents participated in the simulation screening mammography module. Resident sensitivity and specificity were compared to Breast Cancer Surveillance Consortium (BCSC) national benchmarks (based on BCSC data through 2009) and to American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) acceptable screening mammography audit ranges.

Results

The sensitivity, the percentage of cancers with an abnormal initial interpretation (BI-RADS 0), among residents was 84.5%, similar to the BCSC benchmark sensitivity of 84.9% (sensitivity for tissue diagnosis of cancer within 1 year following the initial examination) and within the acceptable ACR BI-RADS medical audit range of ≥75%. The specificity, the percentage of noncancers with a negative image interpretation (BI-RADS 1 or 2), among residents was 83.2%, lower than both the 90.3% reported in the BCSC benchmark data and the suggested ACR BI-RADS range of 88%–95%.
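Using the audit abbreviations defined later in Table 2 (TP, true positive; FP, false positive; FN, false negative; TN, true negative), these definitions reduce to the standard screening-audit formulas:

$$
\text{Sensitivity} = \frac{TP}{TP + FN},
\qquad
\text{Specificity} = \frac{TN}{TN + FP}
$$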

Conclusions

Using simulation modules for interpretation of screening mammograms is a promising method for training radiology residents to detect breast cancer and to help them achieve competence toward national benchmarks.

Introduction

Screening mammography is the only breast imaging modality known to reduce breast cancer mortality. Instruction of radiology residents in interpreting screening mammograms is challenging, and participation of radiology trainees in screening mammographic interpretation is a critical component of residency training. This simulation module was created to teach interpretation of screening mammography to residents in training, with the overarching goal of increasing breast cancer detection among women. To limit variability of exposure to screening mammography cases and to track the progress, efficiency, and accuracy of residents' interpretations of screening mammograms, we created an enriched, standardized set of digital screening mammograms with known outcomes. Participation in this simulation module allowed radiology residents to obtain immediate feedback and to correlate imaging characteristics with histologic findings. The goal of this module was to improve mammographic screening training and breast cancer detection, as well as to prepare residents more fully for the rigors of private practice or academic medicine. Additionally, the United States Mammography Quality Standards Act (MQSA) mandates medical audits to track breast cancer outcome data associated with interpretive performance. Practicing breast radiologists regularly review feedback from their audit reports and are able to continually refine their interpretive skills. This screening mammography simulation experience provided residents with a similar objective feedback mechanism, along with the opportunity to learn about the MQSA medical outcomes audit program and to compare their performance against national benchmarks. This pilot research study evaluated the effectiveness of the simulation screening mammography module.

Materials and Methods

A simulation module was created and modified over three academic years (2012–2013, 2013–2014, and 2014–2015) from a collection of 266 digital screening mammograms obtained in 2008–2009, providing radiology residents with a standardized set of screening mammograms.
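As a rough, hypothetical sketch of how such an enriched, standardized case set can be represented for tracking purposes (this is not the authors' actual implementation; the record fields, pathology labels, and helper names below are illustrative only), the case mix described in this study might be organized as follows:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ScreeningCase:
    """One de-identified screening mammogram with its known outcome (hypothetical record)."""
    case_id: str
    attending_bi_rads: int    # initial attending interpretation: 0 (abnormal), 1, or 2 (negative)
    pathology: Optional[str]  # e.g. "IDC", "DCIS"; None if no biopsy was performed
    is_malignant: bool
    is_high_risk: bool

def summarize(cases: List[ScreeningCase]) -> dict:
    """Tally the case mix of the module: total, malignant, high-risk, and benign/normal cases."""
    return {
        "total": len(cases),
        "malignant": sum(c.is_malignant for c in cases),
        "high_risk": sum(c.is_high_risk for c in cases),
        "benign_or_normal": sum((not c.is_malignant) and (not c.is_high_risk) for c in cases),
    }

# Hypothetical enriched set mirroring the counts reported in this study:
# 65 malignancies and 7 high-risk lesions among 266 screening mammograms.
cases = (
    [ScreeningCase(f"ca{i}", 0, "IDC", True, False) for i in range(65)]
    + [ScreeningCase(f"hr{i}", 0, "high-risk lesion", False, True) for i in range(7)]
    + [ScreeningCase(f"bn{i}", 1, None, False, False) for i in range(194)]
)
print(summarize(cases))
# {'total': 266, 'malignant': 65, 'high_risk': 7, 'benign_or_normal': 194}
```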

Resident Workflow

First Iteration

Second Iteration

Third Iteration

Data Set

Figure 1, Percentage of BI-RADS category 0 or BI-RADS category 1 or 2 cases in the simulation screening module. BI-RADS, Breast Imaging Reporting and Data System.

Figure 2, Number of cases of breast cancers, high-risk lesions, nonbreast cancers, and benign lesions during each simulation. BI-RADS, Breast Imaging Reporting and Data System.

TABLE 1

Breast Pathology Included in Our Simulation Screening Mammography Modules

Columns: Breast Cancers; High-Risk Lesions or Other Cancers; Benign or Non-High-Risk Lesions

DCIS, ductal carcinoma in situ; IDC, invasive ductal carcinoma.

Figure 3, In our simulation screening modules, 103/104 BI-RADS category 0 lesions underwent subsequent percutaneous breast biopsy with known histologic outcomes. One of the 104 cases represented a simple cyst and did not undergo biopsy as it demonstrated benign imaging features at diagnostic ultrasound. BI-RADS, Breast Imaging Reporting and Data System.

BI-RADS Categorization

Audit Definitions—Our Simulation

TABLE 2

Statistical Definitions for Comparison Between Resident and Attending Radiologist Interpretations of Screening Mammography Module Cases

                                          Attending Interpretation
                                          BI-RADS 0          BI-RADS 1 or 2
Resident interpretation, BI-RADS 0        True positive      False positive
Resident interpretation, BI-RADS 1 or 2   False negative     True negative

BI-RADS, Breast Imaging Reporting and Data System.
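For illustration only, the sketch below applies the definitions in Table 2 to paired resident and attending interpretations; it is not the authors' scoring software, and the function and variable names are hypothetical. Given each resident's paired interpretations, the per-module counts and the resulting sensitivity and specificity could be computed as follows:

```python
from typing import Iterable, Tuple

def score_module(pairs: Iterable[Tuple[int, int]]) -> dict:
    """Score one resident's module against the attending interpretations.

    Each pair is (resident_bi_rads, attending_bi_rads), where BI-RADS 0 is an
    abnormal (recall) interpretation and BI-RADS 1 or 2 is a negative
    interpretation, following the definitions in Table 2.
    """
    tp = fp = fn = tn = 0
    for resident, attending in pairs:
        resident_recall = resident == 0
        attending_recall = attending == 0
        if resident_recall and attending_recall:
            tp += 1          # true positive
        elif resident_recall and not attending_recall:
            fp += 1          # false positive
        elif not resident_recall and attending_recall:
            fn += 1          # false negative
        else:
            tn += 1          # true negative
    return {
        "TP": tp, "FP": fp, "FN": fn, "TN": tn,
        "sensitivity": tp / (tp + fn) if (tp + fn) else None,
        "specificity": tn / (tn + fp) if (tn + fp) else None,
    }

# Hypothetical example with four cases:
print(score_module([(0, 0), (0, 1), (1, 0), (2, 2)]))
# {'TP': 1, 'FP': 1, 'FN': 1, 'TN': 1, 'sensitivity': 0.5, 'specificity': 0.5}
```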

Statistical Analysis

Results

Figure 4, Performance of residents on their first, second, and third rotations compared to the initial attending breast radiologist interpretation.

Figure 5, Overall resident performance compared to the BCSC benchmark data for sensitivity (the percentage of cancers with an abnormal initial interpretation) and specificity (the percentage of noncancers with a negative image interpretation). BCSC, Breast Cancer Surveillance Consortium.

Discussion

Limitations and Future Directions

Conclusion

Acknowledgments

Appendix

Audit Definitions—Breast Cancer Surveillance Consortium

Audit Definitions—ACR BI-RADS

References

  • 1. Hawley J.R., Taylor C.R., Cubbison A.M., et. al.: Influences of radiology trainees on screening mammography interpretation. J Am Coll Radiol 2016; S1546-1440(16)00067-3 [pii]; Epub February 26, 2016

  • 2. US Food and Drug Administration : Mammography quality standards act. Available at: http://www.fda.gov/radiation-emittingproducts/mammographyqualitystandardsactandprogram/default.htm Accessed November 20, 2015

  • 3. Monticciolo D.L., Rebner M., Appleton C.M., et. al.: The ACR/Society of Breast Imaging Resident and Fellowship Training Curriculum for Breast Imaging, updated. J Am Coll Radiol 2013; 10: pp. 207-210.e4; Epub December 23, 2012

  • 4. Breast Cancer Surveillance Consortium : Sensitivity and specificity for 2,061,691 screening mammography examinations from 2004–2008—based on BCSC data through 2009. Available at: http://breastscreening.cancer.gov/statistics/benchmarks/screening/2009/tableSensSpec.html Accessed September 16, 2015

  • 5. Sickles E.A., D’Orsi C.J.: ACR BI-RADS® follow-up and outcome monitoring. In: ACR BI-RADS® Atlas, Breast Imaging Reporting and Data System. Reston, VA: American College of Radiology; 2013.

  • 6. Spring D.B., Kimbrell-Wilmot K.: Evaluating the success of mammography at the local level: how to conduct an audit of your practice. Radiol Clin North Am 1987; 25: pp. 983-992.

  • 7. Murphy W.A., Destouet J.M., Monsees B.S.: Professional quality assurance for mammography screening programs. Radiology 1990; 175: pp. 319-320.

  • 8. Farria D.M., Salcman J., Monticciolo D.L., et. al.: A survey of breast imaging fellowship programs: current status of curriculum and training in the United States and Canada. J Am Coll Radiol 2014; 11: pp. 894-898. Epub May 22, 2014

  • 9. Burnside E.S., Park J.M., Fine J.P., et. al.: The use of batch reading to improve the performance of screening mammography. AJR Am J Roentgenol 2005; 185: pp. 790-796.

  • 10. Schou Bredal I., Kåresen R., Skaane P., et. al.: Recall mammography and psychological distress. Eur J Cancer 2013; 49: pp. 805-811. Epub September 27, 2012

  • 11. Bond M., Pavey T., Welch K., et. al.: Psychological consequences of false-positive screening mammograms in the UK. Evid Based Med 2013; 18: pp. 54-61. Epub August 2, 2012

  • 12. Nodine C.F., Kundel H.L., Mello-Thoms C., et. al.: How experience and training influence mammography expertise. Acad Radiol 1999; 6: pp. 575-585.

  • 13. Saunders R.S., Samei E.: Improving mammographic decision accuracy by incorporating observer ratings with interpretation time. Br J Radiol 2006; 79: pp. S117-S122.

  • 14. Grimm L.J., Kuzmiak C.M., Ghate S.V., et. al.: Radiology resident mammography training: interpretation difficulty and error-making patterns. Acad Radiol 2014; 21: pp. 888-892.

  • 15. Lee E.H., Jun J.K., Jung S.E., et. al.: The efficacy of mammography boot camp to improve the performance of radiologists. Korean J Radiol 2014; 15: pp. 578-585. Epub September 12, 2014

  • 16. Luo P., Qian W., Romilly P.: CAD-aided mammogram training. Acad Radiol 2005; 12: pp. 1039-1048.

  • 17. Zhang J., Grimm L.J., Lo J.Y., et. al.: Does breast imaging experience during residency translate into improved initial performance in digital breast tomosynthesis?. J Am Coll Radiol 2015; 12: pp. 728-732.

  • 18. Rosenberg R.D., Yankaskas B.C., Abraham L.A., et. al.: Performance benchmarks for screening mammography. Radiology 2006; 241: pp. 55-66.
