Improving Breast Ultrasound Interpretation in Uganda Using a Condensed Breast Imaging Reporting and Data System

Rationale and Objectives

This study aimed to determine whether a 2-day educational course using a condensed Breast Imaging Reporting and Data System (condensed BI-RADS) improved the accuracy of Ugandan healthcare workers interpreting breast ultrasound.

Materials and Methods

The target audience of this intervention was Ugandan healthcare workers involved in performing, interpreting, or acting on the results of breast ultrasound. The educational course consisted of a pretest knowledge assessment, a series of lectures on breast imaging interpretation and standardized reporting using a condensed BI-RADS, and a posttest knowledge assessment. Participants interpreted 53 different ultrasound test cases by selecting the finding type, descriptors for masses, and recommendations. We compared the percent correct on the pretest and posttest based on occupation and training level.
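This excerpt does not show the exact statistical test the authors used to compare pre- and posttest percent correct, so the sketch below is an illustration rather than the study's method: an unpaired two-sided two-proportion z-test applied to pooled responses, using only the Python standard library.

```python
import math

def two_proportion_z(correct_pre, total_pre, correct_post, total_post):
    """Two-sided z-test comparing pretest and posttest percent correct.

    Returns the pre and post proportions and an approximate p-value.
    """
    p1 = correct_pre / total_pre
    p2 = correct_post / total_post
    pooled = (correct_pre + correct_post) / (total_pre + total_post)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_pre + 1 / total_post))
    z = (p2 - p1) / se
    # Normal-approximation p-value via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p1, p2, p_value

# Example with radiologists' "All" finding-type scores from Table 1:
# 511/670 correct on the pretest vs 561/670 on the posttest
pre_pct, post_pct, p = two_proportion_z(511, 670, 561, 670)
```

Because the same participants took both tests, a paired method such as McNemar's test would be the more rigorous choice; the unpaired version is shown only because pooled counts are what the tables report.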

Results

Sixty-one Ugandan healthcare workers participated in this study, including 13 radiologists, 13 other physicians, 12 technologists, and 23 midlevel providers. Most groups improved in identifying the finding type (P < 0.05). All occupations showed improved use of descriptive terms for the shape and internal echogenicity of masses (P < 0.05). Most groups showed significant improvement in recommendations for normal and benign findings, with a corresponding reduction in biopsy recommendations.

Conclusions

Targeted breast ultrasound education using a condensed BI-RADS improved the interpretive performance of healthcare workers and was particularly successful in reducing the frequency of unnecessary biopsies for normal and benign findings. Multimodal educational efforts to improve accuracy and management of breast ultrasound findings may augment breast cancer early detection efforts in resource-limited settings.

Introduction

In Uganda, breast cancer is the second most common cancer in women, and its incidence increased by 5.2% per year between 1993 and 2007. Lack of breast cancer awareness results in an average delay of 29 months in seeking care after self-detecting a breast lump. As a result, 77–89% of women diagnosed with breast cancer present with late-stage disease (stage III or IV). Late-stage breast cancer is more difficult and expensive to treat and less likely to be cured. Improving breast cancer outcomes requires directing resources toward increasing awareness and early detection; without a systematic approach, however, these efforts risk inefficient use of resources in regions with few resources to spare. Efforts must therefore first establish a systematic approach to evaluating women who present with self-detected breast lumps.

Currently in Uganda, women with breast symptoms most often present to midlevel providers (midwives or clinical officers, comparable to physician assistants or nurse practitioners) at a community health center for a clinical breast examination and, if findings are positive, are referred to a hospital for breast ultrasound or other evaluation. To supplement the 47 Ugandan radiologists, who interpret medical imaging mostly in cities, many groups, including the Uganda Ministry of Health, have trained technologists (sonographers and radiographers) and midlevel providers to interpret basic ultrasound in the community. Consequently, nonradiologists interpret 70% of the country's imaging. In a country where most women live in rural settings, using ultrasound at local community health centers to identify the women who need higher-level care at a distant referral hospital may represent a resource-appropriate strategy: it limits the number of women who must travel long distances from their families and reduces the time trained staff at referral hospitals spend evaluating women who do not need higher-level care. Although midlevel providers and technologists are capable of using basic ultrasound to triage these women, focused education is required to ensure high-quality care.

Materials and Methods

Educational Course and Participants

The Pre-/Posttest Instrument

Figure 1. Example of a test case ultrasound image with its corresponding answer sheet using the condensed Breast Imaging Reporting and Data System (BI-RADS). Participants were asked to select a finding type and recommendation for each case. If the participant selected a mass as the finding type, as in this case, they were also asked to select one descriptor each for the shape, margins, and echogenicity of the mass.

Condensed BI-RADS System and Answer Sheet

Data Analysis

Results

Distribution of Participants by Occupation

Comparison of Pre- and Posttest Correct Responses for Finding Type

Table 1

Comparison of Pre- and Posttest Scores for Finding Type

| Occupation | All Pre | All Post | Normal Pre | Normal Post | Cyst Pre | Cyst Post | Lymph Node Pre | Lymph Node Post | Mass Pre | Mass Post |
|---|---|---|---|---|---|---|---|---|---|---|
| Radiologist | 511/670 (76) | 561/670 (84)* | 31/57 (54) | 43/57 (75)* | 59/90 (66) | 80/90 (89)* | 28/50 (56) | 38/50 (76)* | 393/473 (83) | 400/473 (85) |
| Other physician | 306/642 (48) | 438/642 (68)* | 15/58 (26) | 30/58 (52)* | 46/89 (52) | 71/89 (80)* | 22/46 (48) | 19/46 (41) | 223/449 (50) | 318/449 (71)* |
| Technologist | 375/567 (66) | 422/567 (73) | 18/53 (34) | 16/53 (30) | 52/76 (68) | 69/76 (91)* | 28/43 (65) | 22/43 (51) | 277/395 (70) | 315/395 (80)* |
| Midlevel provider | 548/1054 (52) | 741/1054 (70)* | 40/96 (42) | 44/96 (46) | 55/143 (39) | 92/143 (64)* | 32/80 (40) | 41/80 (51) | 421/735 (57) | 564/735 (77)* |

Values are average correct responses/average total responses (%). * Statistically significant change (P < 0.05).

Comparison of Pre- and Posttest Correct Responses for Descriptors of Masses

Table 2

Comparison of Pre- and Posttest Scores for Descriptors of Masses

| Occupation | Shape Pre | Shape Post | Margins Pre | Margins Post | Echogenicity Pre | Echogenicity Post |
|---|---|---|---|---|---|---|
| Radiologist | 261/358 (73) | 281/358 (78)* | 237/355 (67) | 252/355 (71)* | 308/344 (90) | 330/344 (96)* |
| Other physician | 124/187 (66) | 143/187 (76)* | 115/184 (63) | 121/184 (66) | 105/176 (60) | 162/176 (92)* |
| Technologist | 157/232 (68) | 184/232 (79)* | 133/228 (58) | 140/228 (61) | 142/229 (62) | 201/229 (88)* |
| Midlevel provider | 205/321 (64) | 247/321 (77)* | 175/307 (57) | 177/307 (58) | 115/293 (39) | 242/293 (83)* |

Values are average correct responses/average total responses (%). * Statistically significant change (P < 0.05).

Comparison of the Pre- and Posttest Correct Responses for Recommendations by BI-RADS Assessment Category

Table 3

Comparison of the Pre- and Posttest Scores for Recommendations by BI-RADS Assessment Category

| BI-RADS Category | Occupation | Pre | Post | P Value |
|---|---|---|---|---|
| BI-RADS 1, 2: Clinical follow-up (N = 17) | Radiologist | 65/185 (35) | 107/185 (58) | <.01 |
| | Other physician | 48/191 (25) | 63/191 (33) | .07 |
| | Technologist | 36/191 (19) | 65/191 (34) | <.01 |
| | Midlevel provider | 75/326 (23) | 120/326 (37) | <.01 |
| BI-RADS 3: Imaging follow-up (N = 8) | Radiologist | 40/95 (42) | 44/95 (46) | .5 |
| | Other physician | 26/98 (27) | 44/98 (45) | <.01 |
| | Technologist | 28/92 (30) | 51/92 (55) | <.01 |
| | Midlevel provider | 64/155 (41) | 61/155 (39) | .7 |
| BI-RADS 4: Biopsy (N = 21) | Radiologist | 139/258 (54) | 125/258 (49) | .1 |
| | Other physician | 122/231 (53) | 136/231 (59) | .2 |
| | Technologist | 133/236 (56) | 105/236 (45) | <.01 |
| | Midlevel provider | 161/393 (41) | 167/393 (43) | .7 |
| BI-RADS 5: Biopsy (N = 7) | Radiologist | 59/87 (68) | 69/87 (79) | .06 |
| | Other physician | 48/82 (59) | 61/82 (74) | .04 |
| | Technologist | 48/79 (61) | 58/79 (73) | .08 |
| | Midlevel provider | 59/133 (44) | 71/133 (53) | .1 |

Values are average correct responses/average total responses (%).

BI-RADS, Breast Imaging Reporting and Data System.

Comparison of the Pre- and Posttest Frequency of Biopsy Recommendations for Normal, Benign, and Probably Benign Ultrasound Findings

Table 4

Comparison of the Pre- and Posttest Biopsy Recommendations for Normal, Benign, and Probably Benign Ultrasound Findings

| Finding Category | Occupation | Pre | Post | P Value |
|---|---|---|---|---|
| Normal or benign (N = 17) | Radiologist | 60/185 (32) | 31/185 (17) | <.01 |
| | Other physician | 92/191 (48) | 70/191 (37) | .02 |
| | Technologist | 93/191 (49) | 43/191 (23) | <.01 |
| | Midlevel provider | 114/326 (35) | 105/326 (32) | .4 |
| Probably benign (N = 8) | Radiologist | 30/95 (32) | 24/95 (25) | .2 |
| | Other physician | 56/98 (57) | 35/98 (36) | <.01 |
| | Technologist | 51/92 (55) | 31/92 (34) | <.01 |
| | Midlevel provider | 60/155 (39) | 61/155 (39) | .9 |

Values are average recommended for biopsy/average total responses (%).
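The drop in biopsy recommendations for normal or benign findings can be restated from the table's own counts. This short Python sketch (the numbers come from the table; the code is not from the study) computes the absolute percentage-point and relative reductions per occupation:

```python
# (pretest biopsies, posttest biopsies, total responses) for
# normal or benign findings, per occupation (Table 4)
groups = {
    "Radiologist": (60, 31, 185),
    "Other physician": (92, 70, 191),
    "Technologist": (93, 43, 191),
    "Midlevel provider": (114, 105, 326),
}

reductions = {}
for name, (pre, post, total) in groups.items():
    point_drop = 100 * (pre - post) / total   # percentage-point reduction
    relative_drop = 100 * (pre - post) / pre  # relative reduction vs pretest
    reductions[name] = (round(point_drop), round(relative_drop))
```

For technologists, for example, this yields a 26-point absolute drop (49% to 23%), roughly a 54% relative reduction in biopsy recommendations for findings that needed none.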

Discussion


References

  • 1. Moten A., Schafer D., Farmer P., et al.: Redefining global health priorities: improving cancer care in developing settings. J Glob Health 2014; 4: pp. 010304.

  • 2. DeSantis C.E., Bray F., Ferlay J., et al.: International variation in female breast cancer incidence and mortality rates. Cancer Epidemiol Biomarkers Prev 2015; 24: pp. 1495-1506.

  • 3. Jemal A., Siegel R., Ward E., et al.: Cancer statistics, 2008. CA Cancer J Clin 2008; 58: pp. 71-96.

  • 4. Galukande M., Mirembe F., Wabinga H.: Patient delay in accessing breast cancer care in a sub-Saharan African country: Uganda. Br J Med Med Res 2014; 4: pp. 2599-2610.

  • 5. Gakwaya A., Kigula-Mugambe J.B., Kavuma A., et al.: Cancer of the breast: 5-year survival in a tertiary hospital in Uganda. Br J Cancer 2008; 99: pp. 63-67.

  • 6. Galukande M., Wabinga H., Mirembe F.: Breast cancer survival experiences at a tertiary hospital in sub-Saharan Africa: a cohort study. World J Surg Oncol 2015; 13: pp. 220.

  • 7. Legorreta A.P., Brooks R.J., Leibowitz A.N., et al.: Cost of breast cancer treatment. A 4-year longitudinal study. Arch Intern Med 1996; 156: pp. 2197-2201.

  • 8. Yip C.H., Smith R.A., Anderson B.O., et al.: Guideline implementation for breast healthcare in low- and middle-income countries: early detection resource allocation. Cancer 2009; 113: pp. 2244-2256.

  • 9. Nathan R., Swanson J.O., Marks W., et al.: Screening obstetric ultrasound training for a 5-country cluster randomized controlled trial. Ultrasound Q 2014; 30: pp. 262-266.

  • 10. Stolz L.A., Muruganandan K.M., Bisanzo M.C., et al.: Point-of-care ultrasound education for non-physician clinicians in a resource-limited emergency department. Trop Med Int Health 2015; 20: pp. 1067-1072.

  • 11. Swanson J.O., Kawooya M.G., Swanson D.L., et al.: The diagnostic impact of limited, screening obstetric ultrasound when performed by midwives in rural Uganda. J Perinatol 2014; 34: pp. 508-512.

  • 12. American College of Radiology: ACR BI-RADS® Atlas. 5th ed. Reston, VA: American College of Radiology, 2013.

  • 13. Burnside E.S., Sickles E.A., Bassett L.W., et al.: The ACR BI-RADS experience: learning from history. J Am Coll Radiol 2009; 6: pp. 851-860.

  • 14. Scheel J.R., Nealey E.M., Orem J., et al.: ACR BI-RADS use in low-income countries: an analysis of diagnostic breast ultrasound practice in Uganda. J Am Coll Radiol 2016; 13: pp. 163-169.

  • 15. Lazarus E., Mainiero M.B., Schepps B., et al.: BI-RADS lexicon for US and mammography: interobserver variability and positive predictive value. Radiology 2006; 239: pp. 385-391.

  • 16. Lee H.J., Kim E.K., Kim M.J., et al.: Observer variability of Breast Imaging Reporting and Data System (BI-RADS) for breast ultrasound. Eur J Radiol 2008; 65: pp. 293-298.

  • 17. Harris P.A., Taylor R., Thielke R., et al.: Research Electronic Data Capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009; 42: pp. 377-381.

  • 18. American College of Radiology: ACR BI-RADS: Ultrasound. Reston, VA: American College of Radiology, 2003.

  • 19. Barton M.B., Elmore J.G., Fletcher S.W.: Breast symptoms among women enrolled in a health maintenance organization: frequency, evaluation, and outcome. Ann Intern Med 1999; 130: pp. 651-657.

  • 20. Amin A.L., Purdy A.C., Mattingly J.D., et al.: Benign breast disease. Surg Clin North Am 2013; 93: pp. 299-308.

  • 21. Stavros A.T., Thickman D., Rapp C.L., et al.: Solid breast nodules: use of sonography to distinguish between benign and malignant lesions. Radiology 1995; 196: pp. 123-134.

  • 22. Hong A.S., Rosen E.L., Soo M.S., et al.: BI-RADS for sonography: positive and negative predictive values of sonographic features. AJR Am J Roentgenol 2005; 184: pp. 1260-1265.
