State of Structured Reporting in Radiology, a Survey

Rationale and Objectives

To survey North American radiologists on current practices in structured reporting and standardized language.

Materials and Methods

An e-mail invitation was sent to the 910 members of the Association of University Radiologists to participate in an online survey addressing the development and use of, and experience with, structured reporting, standardized language, imaging classification or reporting systems, and personal dictation styles.

Results

Of the 910 members e-mailed, 265 (29.1%) responded, 90.6% of whom were from academic teaching hospitals. There were no significant differences in responses based on group size or region of practice. Of all respondents, 51.3% came from groups that had developed structured reports for at least half of their report types, and only 10.9% from groups that had developed none. Significantly fewer respondents (13%) used rigid, unmodifiable structures or checklists rather than adaptable outlines. Overall, 59.5% of respondents reported being satisfied or very satisfied with their structured reports, whereas significantly fewer (13%) reported being dissatisfied or very dissatisfied. Structured reports were reportedly significantly more likely to be required, to be appreciated, and to decrease errors in departments using many structured reports than in groups with less widespread use.
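The headline percentages follow directly from the reported counts. As a quick arithmetic check, here is a minimal Python sketch (the variable names are ours; the counts come from the survey tables below):

```python
# Reproduce the headline percentages from the reported survey counts.
invited = 910          # AUR members e-mailed
responded = 265        # completed responses

print(f"Response rate: {100 * responded / invited:.1f}%")  # 29.1%

# Satisfaction with structured reports (Table 10, n = 200 answering):
answered = 200
satisfied = 38 + 81     # "very satisfied" + "satisfied"
dissatisfied = 21 + 5   # "dissatisfied" + "very dissatisfied"
print(f"Satisfied or very satisfied: {100 * satisfied / answered:.1f}%")        # 59.5%
print(f"Dissatisfied or very dissatisfied: {100 * dissatisfied / answered:.1f}%")  # 13.0%
```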

Conclusions

Most academic radiology departments are using or experimenting with structured reports. Although radiologist satisfaction with standardization is significant, there are strong opinions about the limitations and value of structured reports. Our survey suggests that North American radiologists are invested in exploring structured reporting; we hope these results will inform future study of how a standard report is defined and how much of that process can be centralized.

Although there has been advocacy for structured reporting since the early 1900s, there is mounting evidence that US radiologists and referrers increasingly prefer structured reporting. As such, many studies and associations have sought to create guidelines for radiology reporting, and there is international support for structured reporting. However, what constitutes a structured report and how we should create it are not entirely clear. The Breast Imaging Reporting and Data System (BI-RADS) has become an accepted standard, and other groups are working to create validated standards for reporting (for instance, the Radiological Society of North America [RSNA] Reporting Committee, the developing Liver Imaging Reporting and Data System [LI-RADS], and the Prostate Imaging Reporting and Data System [PI-RADS]). There is similar interest in standardizing reporting language, most notably with the RSNA's creation of the RadLex terminology. However, many limitations and conflicts remain in developing guidelines for radiology reporting and in deciding how to create a useful, accurate, and efficacious report.
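To make the idea of a structured report concrete: RSNA reporting templates are distributed in an HTML-based format (the IHE MRRT profile), but the core concept, named sections with standard default language that the radiologist can accept or override with free text, can be sketched in a few lines of Python. The section names and phrases below are invented for illustration and are not an actual RSNA template:

```python
# Illustrative sketch of a structured report template: ordered sections,
# each with standard default language that the radiologist may accept,
# modify, or replace with free text. Section names and phrases are
# invented; real RSNA templates are published at radreport.org.
TEMPLATE = {
    "Procedure": "CT abdomen and pelvis with IV contrast.",
    "Comparison": "None.",
    "Findings": {
        "Liver": "Normal in size and attenuation. No focal lesion.",
        "Kidneys": "No hydronephrosis. No suspicious mass.",
        "Bowel": "No obstruction. Appendix normal.",
    },
    "Impression": "No acute abnormality.",
}

def render(template, overrides=None):
    """Fill the template, substituting free-text overrides where given."""
    overrides = overrides or {}
    lines = []
    for section, content in template.items():
        if isinstance(content, dict):
            lines.append(f"{section}:")
            for organ, default in content.items():
                lines.append(f"  {organ}: {overrides.get(organ, default)}")
        else:
            lines.append(f"{section}: {overrides.get(section, content)}")
    return "\n".join(lines)

print(render(TEMPLATE, {"Liver": "1.2 cm hypodense lesion in segment VI."}))
```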

Standardized reporting has not been shown to improve comprehension of radiology reports or residents' accuracy of image interpretation. However, there have been encouraging reports from academic institutions, with positive results for radiologist adherence to a checklist report and greater satisfaction with structured report clarity and relevance among both radiologists and nonradiologists.

Materials and methods

Results

Demographics

Table 1

Question 16

What Type of Setting Do You Work in?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| Academic teaching hospital | 90.6 | 193 |
| Private group with academic responsibilities | 6.1 | 13 |
| Private group, hospital, and office based | 0.5 | 1 |
| Private hospital based | 0.5 | 1 |
| Private office based | 0.0 | 0 |
| Other | 2.3 | 5 |

Answered question: 213. Skipped question: 52.

Table 2

Question 17

Where Is Your Practice?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| Northeast | 23.9 | 51 |
| Mid-Atlantic | 8.0 | 17 |
| Southeast | 15.0 | 32 |
| Midwest | 25.4 | 54 |
| Rocky Mountain | 2.8 | 6 |
| Southwest | 12.2 | 26 |
| Pacific Northwest | 7.5 | 16 |
| Other | 5.2 | 11 |

Answered question: 213. Skipped question: 52.

Table 3

Question 15

How Many Radiologists Are in Your Group/Department/Workplace?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| <10 | 4.7 | 10 |
| 11–25 | 18.8 | 40 |
| 26–50 | 29.6 | 63 |
| 51–100 | 31.5 | 67 |
| >100 | 15.5 | 33 |

Answered question: 213. Skipped question: 52.

Structured Reporting

Table 4

Question 1

Has Your Practice Developed Specific Structured Radiology Reports?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| 1–3 reports | 1.9 | 5 |
| A few | 31.3 | 83 |
| About half of our reports | 15.1 | 40 |
| The vast majority of our reports | 36.2 | 96 |
| Only a nonspecific general template for all types of studies | 2.6 | 7 |
| Yes, but they are no longer used | 0.4 | 1 |
| Only for residents | 1.5 | 4 |
| No (not that I am aware of) | 10.9 | 29 |

Answered question: 265.

Table 5

Question 2

What Are the Structured Reports Based on (Check All That Apply)?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| Physician Quality Reporting System (PQRS) initiatives | 19.0 | 38 |
| Referrer's requests (please specify) | 21.0 | 42 |
| Other department's specifications (please specify which departments) | 10.0 | 20 |
| RSNA templates | 18.5 | 37 |
| Reporting systems/classifications (HI-RADS, LI-RADS, PI-RADS, AAST, and so forth) | 24.5 | 49 |
| Other society templates (please specify) | 5.5 | 11 |
| Billing considerations | 36.0 | 72 |
| Standardization of variable reporting styles | 51.0 | 102 |
| Error minimization | 34.5 | 69 |
| Resident education | 34.0 | 68 |
| Ad hoc (without outside sources) | 17.0 | 34 |
| I am not aware of the source/motivation | 12.5 | 25 |
| Other needs/sources (please specify) | 6.5 | 13 |

Answered question: 200. Skipped question: 65.

AAST, American Association for the Surgery of Trauma; HI-RADS, Head Injury Imaging Reporting and Data System; LI-RADS, Liver Imaging Reporting and Data Systems; PI-RADS, Prostate Imaging Reporting and Data Systems; RSNA, Radiological Society of North America.

Table 6

Question 4

How Were These Reports Predominantly Created/Instituted?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| Department/service-wide consensus | 17.0 | 34 |
| Committee consensus | 9.0 | 18 |
| 1–3 authors with chair approval | 23.5 | 47 |
| We used many different processes for different reports | 28.5 | 57 |
| I am uncertain | 12.0 | 24 |
| Other | 10.0 | 20 |

Answered question: 200. Skipped question: 65.

Table 7

Question 5

What Is the Format of These Structured Reports?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| Checklists | 4.5 | 9 |
| Outline with standard language selections | 8.5 | 17 |
| Outline with standard language or free text allowed | 49.0 | 98 |
| Required elements in free text without a standard order | 2.5 | 5 |
| Many types but mostly checklists | 2.0 | 4 |
| Many types but mostly structured format and language | 13.5 | 27 |
| Many types but mostly outlines with free text options | 15.0 | 30 |
| Too many variations to characterize | 2.5 | 5 |
| Other | 2.5 | 5 |

Answered question: 200. Skipped question: 65.
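The formats in Table 7 differ mainly in how much free text they permit per element. For the "required elements in free text without a standard order" format, conformance can be checked mechanically; below is a hedged sketch in which the required section names are our own assumption:

```python
import re

# Hypothetical required elements for a report; a "required elements in
# free text" format only checks that each heading appears somewhere,
# without constraining order or the wording beneath it.
REQUIRED = ["Procedure", "Comparison", "Findings", "Impression"]

def missing_elements(report_text: str) -> list[str]:
    """Return required section headings absent from a free-text report."""
    return [
        section for section in REQUIRED
        if not re.search(rf"^{section}\s*:", report_text,
                         flags=re.MULTILINE | re.IGNORECASE)
    ]

draft = "Procedure: CT abdomen.\nFindings: Unremarkable.\n"
print(missing_elements(draft))  # ['Comparison', 'Impression']
```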

Table 8

Question 6

For Which Subspecialties Have You Created Structured Reports, Not Including Mammography (Check All That Apply)?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| Body | 63.0 | 126 |
| US | 55.5 | 111 |
| Neuro | 43.5 | 87 |
| IR | 26.0 | 52 |
| Peds | 28.5 | 57 |
| Plain film | 34.0 | 68 |
| Chest | 45.5 | 91 |
| Cardiovascular | 30.5 | 61 |
| Nuclear | 35.0 | 70 |
| PET | 36.0 | 72 |
| MSK | 42.5 | 85 |
| ENT | 17.0 | 34 |
| Fluoro | 26.5 | 53 |
| Universal template | 14.0 | 28 |
| Other | 7.0 | 14 |

Answered question: 200. Skipped question: 65.

ENT, ear, nose, and throat (otolaryngology); IR, interventional radiology; MSK, musculoskeletal; PET, positron emission tomography; US, ultrasound.

Table 9

Question 10

Have These Decreased Errors?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| Yes | 22.5 | 45 |
| Yes (but only typographical and dictation errors) | 6.5 | 13 |
| No significant change | 16.5 | 33 |
| No (increased "misses": errors of study interpretation or recognition of pathology) | 0.0 | 0 |
| No (increased overall errors) | 1.5 | 3 |
| I am uncertain | 53.0 | 106 |
| Other (please specify) | | 16 |

Answered question: 200. Skipped question: 65.

Table 10

Question 11

What Is Your Overall Opinion of These Structured Reports?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| Very satisfied | 19.0 | 38 |
| Satisfied | 40.5 | 81 |
| Neutral | 24.0 | 48 |
| Dissatisfied | 10.5 | 21 |
| Very dissatisfied | 2.5 | 5 |
| Too mixed to define | 3.5 | 7 |

Answered question: 200. Skipped question: 65.

Table 11

Question 12

How Have Referrers Commented on These Structured Reports?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| Very positively | 12.0 | 24 |
| Positively | 25.5 | 51 |
| Neutrally | 11.5 | 23 |
| Negatively | 1.0 | 2 |
| Very negatively | 1.5 | 3 |
| Too mixed to define | 6.5 | 13 |
| I have not heard | 42.0 | 84 |

Answered question: 200. Skipped question: 65.

Table 12

Question 13

Does Your Group Formally Adhere to Any of the Following Reporting/Language Systems or Imaging Classification Systems (Check All That Apply)?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| HI-RADS | 1.7 | 3 |
| PI-RADS | 2.8 | 5 |
| LI-RADS | 22.2 | 39 |
| RadLex | 8.0 | 14 |
| AAST injury grading | 22.2 | 39 |
| Bosniak renal cyst classification | 52.3 | 92 |
| Acute pancreatitis (revised) Atlanta classification | 13.6 | 24 |
| MRI perianal fistula classification | 8.0 | 14 |
| Various MSK fracture and injury classifications | 25.0 | 44 |
| Other | 28.4 | 50 |

Answered question: 176. Skipped question: 89.

AAST, American Association for the Surgery of Trauma; HI-RADS, Head Injury Imaging Reporting and Data System; LI-RADS, Liver Imaging Reporting and Data Systems; PI-RADS, Prostate Imaging Reporting and Data Systems.

Table 13

Question 14

How Do You Personally Dictate the Majority of Studies?

| Answer Options | Response Percent | Response Count |
| --- | --- | --- |
| Free text with limited pertinent information | 13.1 | 28 |
| Free text and short "macros" (short saved phrases) | 20.7 | 44 |
| Outline with free text and/or short macros | 32.9 | 70 |
| Structured reports with standard phrasing | 22.5 | 48 |
| Checklists | 3.8 | 8 |
| My reporting style varies widely | 5.6 | 12 |
| Other | 1.4 | 3 |

Answered question: 213. Skipped question: 52.
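Most respondents mix free text with short saved "macros". A dictation macro is simply a trigger phrase that expands into saved standard language; the sketch below illustrates the mechanism with invented triggers and phrases:

```python
# Minimal dictation-macro expander: spoken trigger phrases are replaced
# with saved standard language, leaving everything else as free text.
# The triggers and phrases here are invented examples.
MACROS = {
    "macro normal chest": (
        "Lungs are clear. Heart size is normal. No pleural effusion."
    ),
    "macro no comparison": "No prior studies are available for comparison.",
}

def expand(dictation: str) -> str:
    for trigger, phrase in MACROS.items():
        dictation = dictation.replace(trigger, phrase)
    return dictation

print(expand("Indication: cough. macro normal chest Impression: normal."))
```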

Discussion

References

  • 1. Hickey P.M.: Standardization of roentgen-ray reports. AJR Am J Roentgenol 1922; 9: pp. 422.

  • 2. Flanders A.E., Lakhani P.: Radiology reporting and communications: a look forward. Neuroimaging Clin N Am 2012; 22: pp. 477-496.

  • 3. Dunnick N.R., Langlotz C.P.: The radiology report of the future: a summary of the 2007 Intersociety Conference. J Am Coll Radiol 2008; 5: pp. 626-629.

  • 4. Kahn C.E., Langlotz C.P., Burnside E.S., et. al.: Toward best practices in radiology reporting. Radiology 2009; 252: pp. 852-856.

  • 5. Bosmans J.M., Peremans L., Menni M., et. al.: Structured reporting: if, why, when, how, and at what expense? Results of a focus group meeting of radiology professionals from eight countries. Insights Imaging 2012; 3: pp. 295-302.

  • 6. Burnside E.S., Sickles E.A., Bassett L.W., et. al.: The ACR BI-RADS experience: learning from history. J Am Coll Radiol 2009; 6: pp. 851-860.

  • 7. Avrin D.E., Flanders A.E.: Annual report from the RSNA Radiology Informatics Committee for 2009. Radiographics 2010; 30: pp. 7-11.

  • 8. LI-RADS enables standardized interpretation, reporting of HCC. RSNA News 2012; 22: pp. 13-14.

  • 9. Barentsz J.O., Richenberg J., Clements R., et. al., European Society of Urogenital Radiology: ESUR prostate MR guidelines 2012. Eur Radiol 2012; 22: pp. 746-757.

  • 10. Rubin D.L.: Creating and curating a terminology for radiology: ontology modeling and analysis. J Digit Imaging 2008; 21: pp. 355-362.

  • 11. Pool F., Goergen S.: Quality of the written radiology report: a review of the literature. J Am Coll Radiol 2010; 7: pp. 634-643.

  • 12. Sistrom C.L., Honeyman-Buck J.: Free text versus structured format: information transfer efficiency of radiology reports. AJR Am J Roentgenol 2005; 185: pp. 804-812.

  • 13. Johnson A.J., Chen M.Y., Swan J.S., et. al.: Cohort study of structured reporting compared with conventional dictation. Radiology 2009; 253: pp. 74-80.

  • 14. Larson D.B., Towbin A.J., Pryor R.M., et. al.: Improving consistency in radiology reporting through the use of department-wide standardized structured reporting. Radiology 2013; 267: pp. 240-250.

  • 15. Schwartz L.H., Panicek D.M., Berk A.R., et. al.: Improving communication of diagnostic radiology findings through structured reporting. Radiology 2011; 260: pp. 174-181.

  • 16. Accreditation Council for Graduate Medical Education Data Resource Book Academic Year 2012-2013. Available at: https://www.acgme.org/acgmeweb/Portals/0/PFAssets/PublicationsBooks/2012-2013_ACGME_DATABOOK_DOCUMENT_Final.pdf. Accessed July 27, 2014.

  • 17. Deloney L.A., Rozenshtein A., Deitte L.A., et. al.: What program directors think: results of the 2011 annual survey of the Association of Program Directors in Radiology. Acad Radiol 2012; 19: pp. 1583-1588.

  • 18. Houssami N., Boyages J., Stuart K., et. al.: Quality of breast imaging reports falls short of recommended standards. Breast 2007; 16: pp. 271-279.

  • 19. Langlotz C.P.: Structured radiology reporting: are we there yet? Radiology 2009; 253: pp. 23-25.

  • 20. Weiss D.L., Langlotz C.P.: Structured reporting: patient care enhancement or productivity nightmare? Radiology 2008; 249: pp. 739-747.
