Review of Research Reporting Guidelines for Radiology Researchers

Prior articles have reviewed reporting guidelines and study evaluation tools for clinical research. However, only some of the many accepted reporting guidelines available through the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network have been discussed in previous reports. In this paper, we review the key EQUATOR reporting guidelines that have not been previously discussed. The study types include diagnostic and prognostic studies, reliability and agreement studies, observational studies (analytical and descriptive), experimental studies, quality improvement studies, qualitative research, health informatics, systematic reviews and meta-analyses, economic evaluations, and mixed methods studies. There are also sections on study protocols and on statistical analyses and methods. Each section gives a brief overview of the study type and then discusses the reporting guideline(s) most applicable to radiology researchers, including radiologists involved in health services research.

Introduction

In 2006, the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network was formed to standardize and improve the quality of reporting of health research through the development of research reporting guidelines. This article reviews how to report health care research for the following study designs: diagnostic and prognostic studies, reliability and agreement studies, observational studies, experimental studies, quality improvement studies, qualitative research, health informatics, systematic reviews and meta-analyses, economic evaluations, and mixed methods studies; study protocols and the reporting of statistical analyses are also discussed. Each section gives a brief overview of the study type and then discusses the available guideline(s) for reporting that type of health research. In this paper, we complete the review of the key EQUATOR reporting guidelines most applicable to radiology researchers, including radiologists involved in health services research. The aim of this paper is to increase awareness in the radiology community of the available resources, enabling researchers to produce scientific articles with a high standard of reporting and a clear writing style. Where guideline checklists (and, where applicable, flow charts) are easily available from the EQUATOR Network Web site (or a guideline statement Web site or other Web site), these Web links are provided. Where guideline checklists are less easily available, they are summarized in tables.

Diagnostic and Prognostic Studies

Diagnostic test accuracy studies evaluate a test for the diagnosis of a disease by comparing the test in patients with and without the disease against a reference standard. They provide evidence on how well a test correctly identifies or rules out disease, informing subsequent decisions about treatment for clinicians, their patients, and health care providers. This study design is one of the most commonly used in radiology research. Prognosis refers to the possible outcomes of a disease and the frequency with which they can be expected to occur. Sometimes the characteristics of a particular patient can be used to predict that patient's eventual outcome more accurately; these characteristics are called prognostic factors. Prognostic factors need not necessarily cause the outcomes, but may have a strong enough association to predict their development. Prognostic studies aim to predict the course of a disease following its onset. A prediction model is a mathematical equation that combines information from multiple predictors measured in an individual to predict the probability of the presence (diagnosis) or future occurrence (prognosis) of a particular disease or outcome. Other names for a prediction model include risk prediction model, predictive model, prediction rule, and risk score. The EQUATOR Network has recently changed its study type section from one for diagnostic test accuracy studies alone to one that includes both diagnostic and prognostic studies. Currently, there are nine reporting guidelines in this section, the key ones being STAndards for Reporting of Diagnostic accuracy (STARD) 2015 and Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD).
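As context for the accuracy estimates a STARD-style report centers on, here is a minimal sketch (with hypothetical counts, not drawn from any cited study) of how sensitivity, specificity, and predictive values fall out of the 2x2 table of index-test results against the reference standard:

```python
# Illustrative sketch only: the core accuracy measures of a diagnostic
# test accuracy study, computed from a 2x2 table of index-test results
# against the reference standard. All counts below are hypothetical.

def accuracy_measures(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, and NPV as a dict."""
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical study: 100 diseased and 100 non-diseased patients.
m = accuracy_measures(tp=90, fp=20, fn=10, tn=80)
print(m)  # sensitivity 0.9, specificity 0.8
```

Note that while sensitivity and specificity depend only on the test, the predictive values also depend on disease prevalence in the study sample, which is one reason STARD asks for a clear description of participant recruitment.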

Toward Complete and Accurate Reporting of Studies of Diagnostic Accuracy: The STARD Initiative

This is a reporting guideline for studies of diagnostic accuracy. The objective of the STARD initiative is to improve the accuracy and completeness of reporting of studies of diagnostic accuracy, allowing readers to assess the potential for bias in the study (internal validity) and to evaluate its generalizability (external validity). The initial STARD statement (now known as STARD 2003) consisted of a checklist of 25 items. The statement has recently been updated as STARD 2015, whose list now contains 30 essential items that should be included in every report of a diagnostic accuracy study. A summary of the new items in STARD 2015 is shown in Table 1. The update incorporates recent evidence about sources of bias and variability in diagnostic accuracy studies. The statement also recommends the use of a flow diagram that describes the design of the study and the flow of patients. It is hoped that STARD 2015 will help to improve completeness and transparency in the reporting of diagnostic accuracy studies. More than 200 biomedical journals encourage the use of the STARD statement in their instructions for authors. STARD has been covered in depth in an article in the previous Radiology Alliance for Health Services Research (RAHSR) edition. The STARD and STARD 2015 checklists and flow diagrams are available to download from the STARD Web site and the EQUATOR Network.

TABLE 1

Summary of New Items in STARD 2015

Item 2 (Abstract): Structured abstract. Abstracts are increasingly used to identify key elements of study design and results.

Item 3 (Introduction): Intended use and clinical role of the test. Describing the targeted application of the test helps readers to interpret the implications of reported accuracy estimates.

Item 4 (Introduction): Study hypotheses. Not having a specific study hypothesis may invite generous interpretation of the study results and "spin" in the conclusions.

Item 18 (Methods): Sample size. Readers want to appreciate the anticipated precision and power of the study and whether authors were successful in recruiting the targeted number of participants.

Items 26-27 (Discussion): Structured discussion. To prevent jumping to unwarranted conclusions, authors are invited to discuss study limitations and draw conclusions keeping in mind the targeted application of the evaluated tests (see item 3).

Item 28 (Other information): Registration. Prospective test accuracy studies are trials and, as such, can be registered in clinical trial registries, such as ClinicalTrials.gov, before their initiation, facilitating identification of their existence and preventing selective reporting.

Item 29 (Other information): Protocol. The full study protocol, with more information about the predefined study methods, may be available elsewhere, to allow more fine-grained critical appraisal.

Item 30 (Other information): Sources of funding. There is growing awareness of the potentially compromising effects of conflicts of interest between researchers' obligations to abide by scientific and ethical principles and other goals, such as financial ones; test accuracy studies are no exception.

STARD, STAndards for Reporting of Diagnostic accuracy.

TRIPOD

The TRIPOD Statement is an evidence-based, minimum set of recommendations for the reporting of both diagnostic and prognostic prediction modeling studies. It comprises a 22-item checklist that focuses on reporting how the study was designed, conducted, analyzed, and interpreted. The main components of the TRIPOD checklist are available to download from the TRIPOD Web site and the EQUATOR Network. It is hoped that the statement will aid the critical appraisal, interpretation, and uptake of prediction modeling studies by potential users. On January 6, 2015, 11 journals simultaneously published the TRIPOD Statement, and it is endorsed by a large number of prominent general medical journals and leading editorial organizations.
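The definition of a prediction model given earlier can be made concrete with a small sketch. The logistic form and every coefficient and predictor below are hypothetical, chosen only to illustrate how such a model combines multiple predictors into a single predicted probability of the kind a TRIPOD-compliant report would describe:

```python
import math

# Hypothetical logistic prediction model (illustration only):
# probability = 1 / (1 + exp(-(intercept + sum of coefficient * value))).
# The coefficients and predictor names are invented for this sketch.
COEFFS = {"age_per_year": 0.04, "smoker": 0.7, "biomarker": 1.2}
INTERCEPT = -5.0

def predicted_probability(age_per_year, smoker, biomarker):
    """Turn predictor values into a predicted probability of the outcome."""
    lp = (INTERCEPT
          + COEFFS["age_per_year"] * age_per_year
          + COEFFS["smoker"] * smoker
          + COEFFS["biomarker"] * biomarker)
    return 1.0 / (1.0 + math.exp(-lp))

p = predicted_probability(age_per_year=60, smoker=1, biomarker=1.5)
print(round(p, 3))
```

TRIPOD asks that a published model report exactly these elements (the full equation, all coefficients, and the intercept) so that readers can apply and validate the model themselves.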

Reliability and Agreement Studies

GRRAS

TABLE 2

Guidelines for Reporting Reliability and Agreement Studies (GRRAS)

Title and abstract
1. Identify in title or abstract that interrater/intrarater reliability or agreement was investigated
Introduction
2. Name and describe the diagnostic or measurement device of interest explicitly
3. Specify the subject population of interest
4. Specify the rater population of interest (if applicable)
5. Describe what is already known about reliability and agreement and provide a rationale for the study (if applicable)
Methods
6. Explain how the sample size was chosen. State the determined number of raters, subjects/objects, and replicate observations
7. Describe the sampling method
8. Describe the measurement/rating process (eg, time interval between repeated measurements, availability of clinical information, blinding)
9. State whether measurements/ratings were conducted independently
10. Describe the statistical analysis
Results
11. State the actual number of raters and subjects/objects included and the number of replicate observations conducted
12. Describe the sample characteristics of raters and subjects (eg, training, experience)
13. Report estimates of reliability and agreement, including measures of statistical uncertainty
Discussion
14. Discuss the practical relevance of results
Auxiliary material
15. Provide detailed results if possible (eg, online)
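GRRAS items 10 and 13 ask authors to describe the statistical analysis and to report agreement estimates. As one common example for two raters assigning categorical ratings, here is a minimal sketch of Cohen's kappa; all ratings below are invented for illustration:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters rating the same subjects (categorical)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of subjects on which the raters agree.
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement under independence, from each rater's marginals.
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    categories = set(ca) | set(cb)
    expected = sum((ca[c] / n) * (cb[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical ratings (eg, lesion present/absent) from two readers:
a = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg"]
b = ["pos", "neg", "neg", "neg", "pos", "neg", "pos", "pos"]
print(round(cohens_kappa(a, b), 3))  # 0.5 for these ratings
```

In a real report, item 13 would also require a measure of statistical uncertainty (eg, a confidence interval for kappa), which this sketch omits.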

Observational Studies: Analytical or Descriptive

Cohort Studies, Cross-sectional Studies, and Case-Control Studies

Case Series

TABLE 3

A Suggested Guideline and Checklist for Reporting Case Series

1. Explicitly state the hypothesis/hypotheses under consideration
2. Explicitly provide eligibility criteria for subjects in the report
3. Precisely describe how treatments were administered or define potential risk factors
4. Compare observed results with those in an appropriate external comparison group; discuss potential biases arising from such comparison
5. Perform appropriate statistics, ensuring that assumptions of the statistical methods are reasonable in this setting
6. Discuss the biological plausibility of the hypothesis in light of the report's observations
7. Explicitly discuss the report's limitations and how these limitations could be overcome in future studies

Case Reports

TABLE 4

A Suggested Guideline and Checklist for Writing Case Reports Based on Advice in Existing Literature

Title
1. Should facilitate retrieval with electronic searching.
Introduction
2. Describe whether the case is unique. If not, does the case have an unusual diagnosis, prognosis, therapy, or harm?
3. Describe how the case contributes to scientific knowledge.
4. Describe the instructive or teaching points that add value to this case.
Methods and results
5. Describe the history, examination, and investigations adequately. Is the cause of the patient's illness clear-cut? What are other plausible explanations?
6. Describe the treatments adequately. Have all available therapeutic options been considered? Are outcomes related to treatments?
Discussion
7. Report a literature review of other similar cases. Describe how this case is different.
8. Explain the rationale for reporting the case. What is unusual about the case? Does it challenge prevailing wisdom?
9. In the future, could things be done differently in a similar case?

The CARE Guidelines: Consensus-based Clinical Case Reporting Guideline Development

Good Practice in the Conduct and Reporting of Survey Research

TABLE 5

A Suggested Reporting Guideline for Conduct and Reporting of Survey Research

Introduction
1. Explain the purpose or aim of the research, with explicit identification of the research question.
2. Explain why the research was necessary and place the study in context, drawing upon previous work in relevant fields (the literature review).
3. Describe in (proportionate) detail how the research was done.
Methods
4. State the chosen research method or methods, and justify why this method was chosen.
5. Describe the research tool. If an existing tool is used, briefly state its psychometric properties and provide references to the original development work. If a new tool is used, include an entire section describing the steps undertaken to develop and test the tool, including results of psychometric testing.
6. Describe how the sample was selected and how data were collected, including:
6a. How were potential subjects identified?
6b. How many and what type of attempts were made to contact subjects?
6c. Who approached potential subjects?
6d. Where were potential subjects approached?
6e. How was informed consent obtained?
6f. How many agreed to participate?
6g. How did those who agreed differ from those who did not agree?
6h. What was the response rate?
7. Describe and justify the methods and tests used for data analysis.
Results
8. Present the results of the research. The results section should be clear, factual, and concise.
Discussion
9. Interpret and discuss the findings. This "discussion" section should not simply reiterate results; it should provide the author's critical reflection upon both the results and the processes of data collection. The discussion should assess how well the study met the research question, describe the problems encountered in the research, and honestly judge the limitations of the work.
10. Present conclusions and recommendations.

Improving the Quality of Web Surveys: CHERRIES

TABLE 6

Checklist for Reporting Results of Internet E-Surveys (CHERRIES)

Design
1. Describe survey design: Describe the target population and sample frame. Is the sample a convenience sample? (In "open" surveys, this is most likely.)
Institutional review board (IRB) approval and informed consent process
2. IRB approval: Mention whether the study has been approved by an IRB.
3. Informed consent: Describe the informed consent process. Where were the participants told the length of time of the survey, which data were stored and where and for how long, who the investigator was, and what the purpose of the study was?
4. Data protection: If any personal information was collected or stored, describe what mechanisms were used to protect against unauthorized access.
Development and pretesting
5. Development and testing: State how the survey was developed, including whether the usability and the technical functionality of the electronic questionnaire had been tested before fielding the questionnaire.
Recruitment process and description of the sample having access to the questionnaire
6. Open survey versus closed survey: An "open survey" is a survey open to each visitor of a site, whereas a closed survey is only open to a sample that the investigator knows (password-protected survey).
7. Contact mode: Indicate whether the initial contact with the potential participants was made on the Internet. (Investigators may also send out questionnaires by mail and allow for Web-based data entry.)
8. Advertising the survey: How/where was the survey announced or advertised? Some examples are off-line media (newspapers), online media (mailing lists; if yes, which ones?), or banner ads (Where were these banner ads posted and what did they look like?). It is important to know the wording of the announcement, as it will heavily influence who chooses to participate. Ideally, the survey announcement should be published as an appendix.
Survey administration
9. Web/e-mail: State the type of e-survey (eg, one posted on a Web site, or one sent out through e-mail). If it is an e-mail survey, were the responses entered manually into a database, or was there an automatic method for capturing responses?
10. Context: Describe the Web site (or mailing list/newsgroup) in which the survey was posted. What is the Web site about, who is visiting it, and what are visitors normally looking for? Discuss to what degree the content of the Web site could preselect the sample or influence the results. For example, a survey about vaccination on an anti-immunization Web site will have different results from a Web survey conducted on a government Web site.
11. Mandatory/voluntary: Was it a mandatory survey to be filled in by every visitor who wanted to enter the Web site, or was it a voluntary survey?
12. Incentives: Were any incentives offered (eg, monetary, prizes, or nonmonetary incentives such as an offer to provide the survey results)?
13. Time/date: In what time frame were the data collected?
14. Randomization of items or questionnaires: To prevent biases, items can be randomized or alternated.
15. Adaptive questioning: Use adaptive questioning (certain items displayed only conditionally, based on responses to other items) to reduce the number and complexity of the questions.
16. Number of items: What was the number of questionnaire items per page? The number of items is an important factor for the completion rate.
17. Number of screens (pages): Over how many pages was the questionnaire distributed? The number of pages is an important factor for the completion rate.
18. Completeness check: It is technically possible to do consistency or completeness checks before the questionnaire is submitted. Was this done, and if "yes," how (usually JavaScript)? An alternative is to check for completeness after the questionnaire has been submitted (and highlight mandatory items). If this was done, it should be reported. All items should provide a nonresponse option such as "not applicable" or "rather not say," and selection of one response option should be enforced.
19. Review step: State whether respondents were able to review and change their answers (eg, through a back button or a review step that displays a summary of the responses and asks the respondents if they are correct).
Response rates
20. Unique site visitor: If you provide view rates or participation rates, you need to define how you determined a unique visitor. There are different techniques available, based on IP addresses or cookies or both.
21. View rate (ratio of unique survey visitors/unique site visitors): Requires counting unique visitors to the first page of the survey, divided by the number of unique site visitors (not page views!). It is not unusual to have view rates of less than 0.1% if the survey is voluntary.
22. Participation rate (ratio of unique visitors who agreed to participate/unique first survey page visitors): Count the unique number of people who filled in the first survey page (or agreed to participate, for example, by checking a checkbox), divided by visitors who visit the first page of the survey (or the informed consent page, if present). This can also be called the "recruitment" rate.
23. Completion rate (ratio of users who finished the survey/users who agreed to participate): The number of people submitting the last questionnaire page, divided by the number of people who agreed to participate (or submitted the first survey page). This is only relevant if there is a separate "informed consent" page or if the survey goes over several pages. This is a measure of attrition. Note that "completion" can involve leaving questionnaire items blank; this is not a measure of how completely questionnaires were filled in. (If you need a measure for this, use the term "completeness rate.")
Preventing multiple entries from the same individual
24. Cookies used: Indicate whether cookies were used to assign a unique user identifier to each client computer. If so, mention the page on which the cookie was set and read, and how long the cookie was valid. Were duplicate entries avoided by preventing users' access to the survey twice, or were duplicate database entries having the same user ID eliminated before analysis? In the latter case, which entries were kept for analysis (eg, the first entry or the most recent)?
25. IP check: Indicate whether the IP address of the client computer was used to identify potential duplicate entries from the same user. If so, mention the period of time for which no two entries from the same IP address were allowed (eg, 24 hours). Were duplicate entries avoided by preventing users with the same IP address from accessing the survey twice, or were duplicate database entries having the same IP address within a given period of time eliminated before analysis? If the latter, which entries were kept for analysis (eg, the first entry or the most recent)?
26. Log file analysis: Indicate whether other techniques to analyze the log file for identification of multiple entries were used. If so, please describe.
27. Registration: In "closed" (nonopen) surveys, users need to log in first, and it is easier to prevent duplicate entries from the same user. Describe how this was done. For example, was the survey never displayed a second time once the user had filled it in, or was the username stored together with the survey results and later eliminated? If the latter, which entries were kept for analysis (eg, the first entry or the most recent)?
Analysis
28. Handling of incomplete questionnaires: Were only completed questionnaires analyzed? Were questionnaires that terminated early (where, for example, users did not go through all questionnaire pages) also analyzed?
29. Questionnaires submitted with an atypical time stamp: Some investigators may measure the time people needed to fill in a questionnaire and exclude questionnaires that were submitted too soon. Specify the time frame that was used as a cutoff point, and describe how this point was determined.
30. Statistical correction: Indicate whether any methods such as weighting of items or propensity scores have been used to adjust for the nonrepresentative sample; if so, please describe the methods.
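CHERRIES items 21 to 23 define three ratios that describe how a Web survey winnowed from site visitors down to completed responses. A minimal sketch, with invented counts, showing how they are computed:

```python
# Sketch of the three CHERRIES rates (items 21-23). All counts are
# hypothetical; each input must be a count of *unique* visitors/users.

def cherries_rates(site_visitors, survey_visitors, agreed, completed):
    return {
        # Item 21: unique survey-page visitors / unique site visitors.
        "view_rate": survey_visitors / site_visitors,
        # Item 22: agreed to participate / unique survey-page visitors.
        "participation_rate": agreed / survey_visitors,
        # Item 23: finished the survey / agreed to participate.
        "completion_rate": completed / agreed,
    }

r = cherries_rates(site_visitors=50_000, survey_visitors=400,
                   agreed=200, completed=150)
print(r)  # view 0.008, participation 0.5, completion 0.75
```

Note how each rate uses a different denominator; reporting all three, rather than a single "response rate," is what lets readers judge attrition at each stage.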

A Guide for the Design and Conduct of Self-administered Surveys of Clinicians

TABLE 7

A Suggested Guideline for the Design and Conduct of Self-administered Surveys of Clinicians When Preparing a Report of Findings From Postal Surveys

Abstract
1. Is the objective clearly stated?
2. Is the design of the study stated?
3. Is the study setting well described?
4. Is the survey population described?
5. Is the response rate reported?
6. Are the outcome measures identified?
7. Are the main results clearly reported?
8. Are the conclusions appropriate?
Introduction
9. Is the problem clearly stated?
10. Is the pertinent literature cited and critically appraised?
11. Is the relevance of the research question explained?
12. Is the objective clearly stated?
Methods
13. Is the study design appropriate to the objective?
14. Is the setting clearly described?
15. Are the methods described clearly enough to permit other researchers to duplicate the study?
16. Is the survey sample likely to be representative of the population?
17. Is the questionnaire described adequately?
18. Have the validity and reliability of the questionnaire been established?
19. Was the questionnaire administered in a satisfactory way?
20. Are the statistical methods used appropriately?
Results
21. Do the results address the objective?
22. Are all respondents accounted for?
23. Are the results clearly and logically presented?
24. Are the tables and figures appropriate?
25. Are the numbers consistent in the text and the tables?
Discussion
26. Are the results succinctly summarized?
27. Are the implications of the results stated?
28. Are other interpretations considered and refuted?
29. Are the limitations of the study and its results explained?
30. Are appropriate conclusions drawn?

Experimental Studies

Reporting of Cluster Randomized Trials: An Extension of the CONSORT 2010 Statement

Reporting of Noninferiority and Equivalence Randomized Trials: An Extension of the CONSORT 2010 Statement

Reporting of Trials Assessing Nonpharmacologic Treatments (NPT): An Extension of the CONSORT 2010 Statement

Better Reporting of Harms in Randomized Trials: An Extension of the CONSORT Statement

Improving the Reporting of Pragmatic Trials: An Extension of the CONSORT Statement

Reporting of Patient-Reported Outcomes (PRO) in Randomized Trials: The CONSORT-PRO Extension

TREND

Quality Improvement Studies

Publication Guidelines for Quality Improvement in Health Care: Evolution of the SQUIRE Project

Qualitative Research

COREQ: A Checklist for Interviews and Focus Groups

Health Informatics

Statement on Reporting of Evaluation Studies in Health Informatics (STARE-HI)

TABLE 8

The Statement on Reporting of Evaluation Studies in Health Informatics (STARE-HI) Principles: Items Recommended to be Included in Health Informatics Evaluation Reports and Guideline for Reporting Health Informatics Studies

1. Title
2. Abstract
3. Keywords
Introduction
4.1 Scientific background
4.2 Rationale for the study
4.3 Objectives of study
Study context
5.1 Organizational setting
5.2 System details and system in use
Methods
6.1 Study design
6.2 Theoretical background
6.3 Participants
6.4 Study flow
6.5 Outcome measures or evaluation criteria
6.6 Methods for data acquisition and measurement
6.7 Methods for data analysis
Results
7.1 Demographic and other study coverage data
7.2 Unexpected events during the study
7.3 Study findings and outcome data
7.4 Unexpected observations
Discussion
8.1 Answers to study questions
8.2 Strengths and weaknesses of the study
8.3 Results in relation to other studies
8.4 Meaning and generalizability of the study
8.5 Unanswered and new questions
9. Conclusion
10. Authors' contribution
11. Competing interests
12. Acknowledgment
13. References
14. Appendices

Systematic Reviews and Meta-analyses

The PRISMA Statement

MOOSE: A Proposal for Reporting. MOOSE Group

TABLE 9

Meta-analysis Of Observational Studies in Epidemiology (MOOSE) Checklist and Guideline for the Reporting of Meta-analyses of Observational Studies

Reporting of background should include:
1. Problem definition
2. Hypothesis statement
3. Description of study outcome(s)
4. Type of exposure or intervention used
5. Type of study designs used
6. Study population
Reporting of search strategy should include:
7. Qualifications of searchers (eg, librarians and investigators)
8. Search strategy, including time period included in the synthesis and key words
9. Effort to include all available studies, including contact with authors
10. Databases and registries searched
11. Search software used, name and version, including special features used (eg, explosion)
12. Use of hand searching (eg, reference lists of obtained articles)
13. List of citations located and those excluded, including justification
14. Method of addressing articles published in languages other than English
15. Method of handling abstracts and unpublished studies
16. Description of any contact with authors
Reporting of methods should include:
17. Description of relevance or appropriateness of studies assembled for the assessment of the hypothesis to be tested
18. Rationale for the selection and coding of data (eg, sound clinical principles or convenience)
19. Documentation of how data were classified and coded (eg, multiple raters, blinding, and interrater reliability)
20. Assessment of confounding (eg, comparability of cases and controls in studies where appropriate)
21. Assessment of study quality, including blinding of quality assessors, stratification, or regression on possible predictors of study results
22. Assessment of heterogeneity
23. Description of statistical methods (eg, complete description of fixed or random effects models, justification of whether the chosen models account for predictors of study results, dose-response models, or cumulative meta-analysis) in sufficient detail to be replicated
24. Provision of appropriate tables and graphics
Reporting of results should include:
25. Graphic summarizing individual study estimates and overall estimate
26. Table giving descriptive information for each study included
27. Results of sensitivity testing (eg, subgroup analysis)
28. Indication of statistical uncertainty of findings
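MOOSE item 23 asks for a complete description of the fixed or random effects model used to pool studies. As the simplest illustration of what such a model does, here is a sketch of fixed-effect inverse-variance pooling; the per-study estimates and standard errors below are hypothetical:

```python
import math

# Minimal fixed-effect (inverse-variance) pooling sketch.
# Each study is weighted by the inverse of its variance, so more
# precise studies contribute more to the pooled estimate.
def fixed_effect_pool(effects, std_errors):
    """Return (pooled effect, pooled standard error)."""
    weights = [1.0 / se**2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

effects = [0.30, 0.10, 0.25]   # hypothetical per-study log odds ratios
ses = [0.10, 0.20, 0.15]       # hypothetical standard errors
pooled, se = fixed_effect_pool(effects, ses)
print(round(pooled, 3), round(se, 3))
```

A random-effects model additionally incorporates between-study heterogeneity (item 22) into the weights, which is why MOOSE asks authors to justify which model they chose.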

Meta-analysis of Individual Participant Data: Rationale, Conduct, and Reporting

TABLE 10

Suggested Information to Report From an Individual Participant Data Meta-analysis, to Supplement the Reporting Guidelines of PRISMA and MOOSE

1. Whether there was a protocol for the individual participant data project, and where it can be found
2. Whether ethics approval was necessary and (if appropriate) granted
3. Why the individual participant data approach was initiated
4. The process used to identify relevant studies for the meta-analysis
5. How authors of relevant studies were approached for individual participant data
6. How many authors (or collaborating groups) were approached for individual participant data, and the proportion that provided such data
7. The number of authors who did not provide individual participant data, the reasons why, and the number of patients (and events) in the respective study
8. Whether those authors who provided individual participant data gave all their data or only a proportion; if the latter, describe what information was omitted and why
9. Whether there were any qualitative or quantitative differences between those studies providing individual participant data and those not providing it (if appropriate)
10. The number of patients within each of the original studies and, if appropriate, the number of events
11. Details of any missing individual-level data within the available individual participant data for each study, and how this was handled within the meta-analyses performed
12. Details of and reasons for including (or excluding) patients who were originally excluded (or included) by the source study investigators
13. Whether a one-step or a two-step individual participant data meta-analysis was performed, and the statistical details thereof, including how clustering of patients within studies was accounted for
14. How many patients from each study were used in each meta-analysis performed
15. Whether the assumptions of the statistical models were validated (for example, proportional hazards) within each study
16. Whether the individual participant data results for each study were comparable to the published results, and, if not, why not (for example, individual participant data contained updated or modified information)
17. How individual participant data and non-individual participant data studies were analyzed together (if appropriate)
18. The robustness of the meta-analysis results following the inclusion or exclusion of non-individual participant data studies (if appropriate)

MOOSE, Meta-analysis Of Observational Studies in Epidemiology; PRISMA, Preferred Reporting Items for Systematic Reviews and Meta-Analyses.
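Checklist item 13 distinguishes one-step analyses (all individual participant data modeled jointly, with study as a clustering level) from two-step analyses (each study analyzed separately, then the study-level estimates pooled). As a minimal sketch, the second step of a two-step analysis might look like the following, with hypothetical per-study estimates and fixed-effect inverse-variance pooling:

```python
import math

# Step 1 (assumed already done): within each study, the treatment effect is
# estimated from that study's individual participant data. The hypothetical
# per-study results below are (log odds ratio, standard error).
study_estimates = [(0.42, 0.21), (0.15, 0.18), (0.33, 0.25)]

# Step 2: pool the study-level estimates with inverse-variance weights.
# This is fixed-effect pooling; a random-effects model would also add a
# between-study variance component to each weight.
weights = [1.0 / se ** 2 for _, se in study_estimates]
pooled = sum(w * est for (est, _), w in zip(study_estimates, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval on the log odds ratio scale.
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled log OR {pooled:.3f} (95% CI {ci[0]:.3f} to {ci[1]:.3f})")
```

Reporting which of the two approaches was used, and how clustering was handled, is exactly what item 13 asks for.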

ENTREQ

Economic Evaluations

CHEERS Statement

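Among the items the CHEERS checklist asks economic evaluations to report are the incremental costs, the incremental outcomes, and the resulting incremental cost-effectiveness ratio (ICER). A minimal sketch of the arithmetic, with invented costs and quality-adjusted life years (QALYs):

```python
# Hypothetical decision problem: a new imaging pathway vs. standard care.
# All values below are invented for illustration only.
cost_new, effect_new = 12_500.0, 8.3   # mean cost (USD) and mean QALYs, new pathway
cost_std, effect_std = 9_800.0, 8.1    # mean cost (USD) and mean QALYs, standard care

# Incremental cost-effectiveness ratio: extra cost per extra unit of effect.
delta_cost = cost_new - cost_std
delta_effect = effect_new - effect_std
icer = delta_cost / delta_effect

print(f"Incremental cost: ${delta_cost:,.0f}")
print(f"Incremental effect: {delta_effect:.2f} QALYs")
print(f"ICER: ${icer:,.0f} per QALY gained")
```

CHEERS also asks for the uncertainty around these quantities (for example, from probabilistic sensitivity analysis), which this point-estimate sketch omits.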
Mixed Methods Studies

The Quality of Mixed Methods Studies in Health Services Research

TABLE 11

Good Reporting of a Mixed Methods Study (GRAMMS), Guidelines for the Reporting of Mixed Methods Studies

1. Describe the justification for using a mixed methods approach to the research question
2. Describe the design in terms of the purpose, priority, and sequence of methods
3. Describe each method in terms of sampling, data collection, and analysis
4. Describe where integration has occurred, how it has occurred, and who has participated in it
5. Describe any limitation of one method associated with the presence of the other method
6. Describe any insights gained from mixing or integrating methods

Study Protocols

SPIRIT 2013 Statement: Defining Standard Protocol Items for Clinical Trials

PRISMA-P 2015 Statement

Statistical Analyses and Methods

Statistical Analyses and Methods in the Published Literature (SAMPL)

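A core SAMPL recommendation is to report a summary statistic together with both its spread and its precision, rather than a bare "mean ± x". A minimal sketch of computing those quantities for a hypothetical sample, using a normal-approximation confidence interval:

```python
import statistics

# Hypothetical sample of lesion diameters (mm); invented for illustration.
diameters = [12.1, 14.3, 11.8, 15.0, 13.2, 12.9, 14.7, 13.5]

n = len(diameters)
mean = statistics.mean(diameters)
sd = statistics.stdev(diameters)        # sample standard deviation (spread)
sem = sd / n ** 0.5                     # standard error of the mean (precision)
ci_low, ci_high = mean - 1.96 * sem, mean + 1.96 * sem  # approximate 95% CI

# Report the estimate with both spread and precision, per SAMPL's spirit.
print(f"mean {mean:.1f} mm (SD {sd:.1f} mm); 95% CI {ci_low:.1f} to {ci_high:.1f} mm")
```

For a sample this small, a t-based interval would be more appropriate than the 1.96 normal approximation; the point here is only which quantities the write-up should contain.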
Conclusion

TABLE 12

Reporting Guidelines by Research Study Design, Acronym, Web Site URL, and Bibliographic Reference

Columns: Research Study Design; Reporting Guideline(s) Provided For; Reporting Guideline Acronym; Reporting Guideline Web Site URL, Full-Text If Available; Full Bibliographic Reference.

Diagnostic and prognostic studies: Studies of diagnostic accuracy STARD http://www.stard-statement.org/ Full-text PDF documents of the STARD Statement, checklist, flow diagram and the Explanation and Elaboration document. Bossuyt PM, Reitsma JB, Bruns DE, Gatsonis CA, Glasziou PP, Irwig LM, Lijmer JG, Moher D, Rennie D, de Vet HC. Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Standards for Reporting of Diagnostic Accuracy. Clin Chem. 2003; 49(1):1-6. PMID: 12507953 (14).

BMJ. 2003; 326(7379):41-44. PMID: 12511463 (4).

Radiology. 2003; 226(1):24-28. PMID: 12511664 (5).

Ann Intern Med. 2003; 138(1):40-44. PMID: 12513043 (6).

Am J Clin Pathol. 2003; 119(1):18-22. PMID: 12520693 (7).

Clin Biochem. 2003; 36(1):2-7. PMID: 12554053 (8).

Clin Chem Lab Med. 2003; 41(1):68-73. PMID: 12636052 (3).

Studies of diagnostic accuracy STARD 2015 http://www.stard-statement.org/ The full-text of the STARD 2015 reporting guideline for diagnostic accuracy studies is available to download as a PDF file.

STARD 2015 checklist (PDF)

STARD 2015 flow diagram (PDF) Bossuyt PM, Reitsma JB, Bruns DE, Gatsonis CA, Glasziou PP, Irwig L, Lijmer JG, Moher D, Rennie D, de Vet HCW, Kressel HY, Rifai N, Golub RM, Altman DG, Hooft L, Korevaar DA, Cohen JF, for the STARD Group. STARD 2015: An Updated List of Essential Items for Reporting Diagnostic Accuracy Studies. BMJ. 2015;351:h5527. PMID: 26511519

Radiology. 2015:151516. PMID: 26509226

Clinical Chemistry. 2015. pii: clinchem.2015.246280. PMID: 26510957

Reporting of studies developing, validating, or updating a prediction model, whether for diagnostic or prognostic purposes TRIPOD http://www.tripod-statement.org/ Full-text PDF and Word documents of the TRIPOD Statement checklist for prediction model development and validation

http://www.tripod-statement.org/TRIPOD/TRIPOD-Checklists

http://www.tripod-statement.org/Downloads Moons KG, Altman DG, Reitsma JB, Ioannidis JP, Macaskill P, Steyerberg EW, Vickers AJ, Ransohoff DF, Collins GS. Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD): Explanation and Elaboration. Ann Intern Med. 2015;162(1):W1-W73. PMID: 25560730

Reliability and Agreement Studies: Reliability and agreement studies GRRAS Kottner J, Audigé L, Brorson S, Donner A, Gajewski BJ, Hróbjartsson A, Roberts C, Shoukri M, Streiner DL. Guidelines for reporting reliability and agreement studies (GRRAS) were proposed. J Clin Epidemiol. 2011;64(1):96-106. PMID: 21130355 (21).

Int J Nurs Stud. 2011;48(6):661-671. PMID: 21514934 (22).

Observational Studies: Observational studies in epidemiology (cohort, case-control studies, cross-sectional studies) STROBE http://www.strobe-statement.org/index.php?id=strobe-home Full-text PDF copies of the STROBE Statement and explanation and elaboration papers.

STROBE checklists

http://www.strobe-statement.org/index.php?id=strobe-publications

http://www.strobe-statement.org/index.php?id=available-checklists von Elm E, Altman DG, Egger M, Pocock SJ, Gotzsche PC, Vandenbroucke JP. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: guidelines for reporting observational studies. Ann Intern Med. 2007; 147(8):573-577. PMID: 17938396 (24)

PLoS Med. 2007;4(10):e296. PMID: 17941714 (25)

BMJ. 2007;335(7624):806-808. PMID: 17947786 (26)

Prev Med. 2007;45(4):247-251. PMID: 17950122 (27)

Epidemiology. 2007;18(6):800-804. PMID: 18049194 (28)

Lancet. 2007;370(9596):1453-1457. PMID: 18064739 (29).

For completeness, transparency and data analysis in case reports and data from the point of care CARE http://www.care-statement.org/ The CARE Checklist

The CARE Writing Template

The 2016 updated CARE Checklist as PDF and Word file.

The CARE Writing Template for Authors as PDF and a Word file. Gagnier JJ, Kienle G, Altman DG, Moher D, Sox H, Riley D; the CARE Group. The CARE Guidelines: Consensus-based Clinical Case Reporting Guideline Development. BMJ Case Rep. 2013; doi: 10.1136/bcr-2013-201554. PMID: 24155002 (35).

Global Adv Health Med. 2013;10.7453/gahmj.2013.008

Dtsch Arztebl Int. 2013;110(37):603-608. PMID: 24078847 Full-text in English / Full-text in German

J Clin Epidemiol. 2013. Epub ahead of print. PMID: 24035173 (38).

J Med Case Rep. 2013;7(1):223. PMID: 24228906 (37).

J Diet Suppl. 2013;10(4):381-90. PMID: 24237192 (39).

Reporting items specific to observational studies using routinely collected health data RECORD http://record-statement.org/ The full-text of this reporting guideline can be accessed at: http://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.1001885 Benchimol EI, Smeeth L, Guttmann A, Harron K, Moher D, Petersen I, Sørensen HT, von Elm E, Langan SM; RECORD Working Committee. The REporting of studies Conducted using Observational Routinely-collected health Data (RECORD) Statement. PLoS Med. 2015;12(10):e1001885. PMID: 26440803

Reporting Web-based surveys CHERRIES Eysenbach G. Improving the quality of Web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). J Med Internet Res. 2004; 6(3):e34. PMID: 15471760 (51)

Experimental Studies: Parallel group randomised trials CONSORT http://www.consort-statement.org/ Full-text PDF documents of the CONSORT 2010 Statement, CONSORT 2010 checklist, CONSORT 2010 flow diagram and the CONSORT 2010 Explanation and Elaboration document

CONSORT checklist (Word)

CONSORT flow diagram (Word) Schulz KF, Altman DG, Moher D, for the CONSORT Group. CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials. Ann Int Med. 2010;152(11):726-32. PMID: 20335313 (55).

BMC Medicine. 2010;8:18. PMID: 20334633 (56).

BMJ. 2010;340:c332. PMID: 20332509 (57).

J Clin Epidemiol. 2010;63(8): 834-40. PMID: 20346629 (58).

Lancet. 2010;375(9721):1136 supplementary webappendix.

Obstet Gynecol. 2010;115(5):1063-70. PMID: 20410783 (59).

Open Med. 2010;4(1):60-68.

PLoS Med. 2010;7(3): e1000251. PMID: 20352064 (60).

Trials. 2010;11:32. PMID: 20334632 (61).

Cluster randomised trials CONSORT Cluster http://www.consort-statement.org/extensions/designs/cluster-trials/ The full-text of the extension for cluster randomised trials. Campbell MK, Piaggio G, Elbourne DR, Altman DG; for the CONSORT Group. Consort 2010 statement: extension to cluster randomised trials. BMJ. 2012;345:e5661. PMID: 22951546 (63).

Reporting of noninferiority and equivalence randomized trials CONSORT Non-inferiority http://www.consort-statement.org/extensions/designs/non-inferiority-and-equivalence-trials/ The full-text of the extension for noninferiority and equivalence randomized trials. Piaggio G, Elbourne DR, Pocock SJ, Evans SJ, Altman DG; CONSORT Group. Reporting of noninferiority and equivalence randomized trials: extension of the CONSORT 2010 statement. JAMA. 2012;308(24):2594-2604. PMID: 23268518 (65).

Reporting of pragmatic trials in healthcare CONSORT Pragmatic trials http://www.consort-statement.org/extensions/designs/pragmatic-trials/ The full-text of the extension for pragmatic trials in healthcare. Zwarenstein M, Treweek S, Gagnier JJ, Altman DG, Tunis S, Haynes B, Oxman AD, Moher D; CONSORT group; Pragmatic Trials in Healthcare (Practihc) group. Improving the reporting of pragmatic trials: an extension of the CONSORT statement. BMJ. 2008;337:a2390. PMID: 19001484 (71).

Trials assessing nonpharmacologic treatments CONSORT Nonpharmacological treatment interventions http://www.consort-statement.org/extensions/interventions/non-pharmacologic-treatment-interventions/ The full-text of the extension for trials assessing nonpharmacologic treatments. Boutron I, Moher D, Altman DG, Schulz K, Ravaud P, for the CONSORT group. Methods and Processes of the CONSORT Group: Example of an Extension for Trials Assessing Nonpharmacologic Treatments. Ann Intern Med. 2008:W60-W67. PMID: 18283201 (67).

Patient-reported outcomes in randomized trials CONSORT-PRO http://www.consort-statement.org/extensions/data/pro/ The full-text of the extension for patient-reported outcomes (PROs). Calvert M, Blazeby J, Altman DG, Revicki DA, Moher D, Brundage MD; CONSORT PRO Group. Reporting of patient-reported outcomes in randomized trials: the CONSORT PRO extension. JAMA. 2013;309(8):814-822. PMID: 23443445 (75).

Reporting of harms in randomized trials CONSORT Harms http://www.consort-statement.org/extensions/data/harms/ Ioannidis JPA, Evans SJW, Gotzsche PC, O’Neill RT, Altman DG, Schulz K, Moher D, for the CONSORT Group. Better Reporting of Harms in Randomized Trials: An Extension of the CONSORT Statement. Ann Intern Med. 2004; 141(10):781-788. PMID: 15545678 (69).

Reporting randomised trials in journal and conference abstracts CONSORT for abstracts http://www.consort-statement.org/extensions/data/abstracts/ The full-text of the extension for journal and conference abstracts. Hopewell S, Clarke M, Moher D, Wager E, Middleton P, Altman DG, Schulz KF, the CONSORT Group. CONSORT for reporting randomised trials in journal and conference abstracts. Lancet. 2008;371(9609):281-283. PMID: 18221781.

Reporting of intervention evaluation studies using nonrandomized designs TREND http://www.cdc.gov/trendstatement/ Des Jarlais DC, Lyles C, Crepaz N, Trend Group. Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND statement. Am J Public Health. 2004;94(3):361. PMID: 14998794 (77)

Quality Improvement Studies: Quality improvement in health care SQUIRE http://squire-statement.org/ The full-text of the SQUIRE 2.0 update, published in 2015, is available from: SQUIRE 2.0

SQUIRE 2.0 checklist (PDF) - 2015 update Davidoff F, Batalden P, Stevens D, Ogrinc G, Mooney S. Publication guidelines for quality improvement in health care: evolution of the SQUIRE project. Qual Saf Health Care. 2008;17 Suppl 1:i3-i9. PMID: 18836063 (80).

BMJ. 2009; 338:a3152. PMID: 19153129 (81).

Jt Comm J Qual Patient Saf. 2008;34(11):681-687. PMID: 19025090 (82).

Ann Intern Med. 2008;149(9):670-676. PMID: 18981488 (83).

J Gen Intern Med. 2008;23(12):2125-2130. PMID: 18830766 (84)

Qualitative research: Qualitative research interviews and focus groups COREQ http://intqhc.oxfordjournals.org/content/19/6/349.long Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349-357. PMID: 17872937 (85)

Qualitative research reviews RATS http://www.biomedcentral.com/authors/rats The RATS guidelines modified for BioMed Central Instructions to Authors are copyright Jocalyn Clark, BMJ. They can be found in Clark JP: How to peer review a qualitative manuscript. In: Peer Review in Health Sciences. Second edition. Edited by Godlee F, Jefferson T. London: BMJ Books; 2003:219-235

Health Informatics: Evaluation studies in health informatics STARE-HI Talmon J, Ammenwerth E, Brender J, de Keizer N, Nykanen P, Rigby M. STARE-HI - Statement on reporting of evaluation studies in Health Informatics. Int J Med Inform. 2009;78(1):1-9. PMID: 18930696 (86).

Systematic Reviews/Meta-analyses/HTA: Systematic reviews and meta-analyses PRISMA http://www.prisma-statement.org/ Full-text PDF documents of the PRISMA Statement, checklist, flow diagram and the PRISMA Explanation and Elaboration

PRISMA checklist (Word)

PRISMA flow diagram (Word) Moher D, Liberati A, Tetzlaff J, Altman DG, The PRISMA Group. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. PLoS Med. 2009; 6(7):e1000097.  PMID: 19621072 (88).

BMJ. 2009; 339:b2535. PMID: 19622551 (89).

Ann Intern Med. 2009;151(4):264-269, W64. PMID: 19622511 (90).

J Clin Epidemiol. 2009;62(10):1006-1012. PMID: 19631508 (91).

Open Med. 2009;3(3):123-130.

Reporting systematic reviews in journal and conference abstracts PRISMA for Abstracts Beller EM, Glasziou PP, Altman DG, Hopewell S, Bastian H, Chalmers I, Gøtzsche PC, Lasserson T, Tovey D; PRISMA for Abstracts Group. PRISMA for Abstracts: Reporting Systematic Reviews in Journal and Conference Abstracts. PLoS Med. 2013;10(4):e1001419. PMID: 23585737.

Meta-analysis of observational studies in epidemiology MOOSE Stroup DF, Berlin JA, Morton SC, Olkin I, Williamson GD, Rennie D, Moher D, Becker BJ, Sipe TA, Thacker SB. Meta-analysis of observational studies in epidemiology: a proposal for reporting. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) group. JAMA. 2000; 283(15):2008-2012. PMID: 10789670 (93)

Meta-analysis of individual participant data Riley RD, Lambert PC, Abo-Zaid G. Meta-analysis of individual participant data: rationale, conduct, and reporting. BMJ. 2010;340:c221. PMID: 20139215 (95).

Synthesis of qualitative research ENTREQ Tong A, Flemming K, McInnes E, Oliver S, Craig J. Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Med Res Methodol. 2012;12(1):181. PMID: 23185978 (97).

Economic Evaluations: Economic evaluations of health interventions CHEERS http://www.ispor.org/taskforces/EconomicPubGuidelines.asp Information about the CHEERS Statement and a full-text PDF copy of the CHEERS checklist

A full-text PDF copy of the CHEERS checklist is available from: http://www.ispor.org/workpaper/CHEERS/revised-CHEERS-Checklist-Oct13.pdf Husereau D, Drummond M, Petrou S, Carswell C, Moher D, Greenberg D, Augustovski F, Briggs AH, Mauskopf J, Loder E. Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. Eur J Health Econ. 2013;14(3):367-372. PMID: 23526140 (98).

Value Health. 2013;16(2):e1-e5. PMID: 23538200 (99).

Clin Ther. 2013;35(4):356-363. PMID: 23537754 (100).

Cost Eff Resour Alloc. 2013;11(1):6. PMID: 23531194 (101).

BMC Med. 2013;11:80. PMID: 23531108 (102).

BMJ. 2013;346:f1049. PMID: 23529982 (103).

Pharmacoeconomics. 2013;31(5):361-367. PMID: 23529207 (104).

J Med Econ. 2013;16(6):713-719. PMID: 23521434 (105).

Int J Technol Assess Health Care. 2013;29(2):117-122. PMID: 23587340 (106).

BJOG. 2013;120(6):765-770. PMID: 23565948 (107).

Mixed Methods Studies: Mixed methods studies in health services research GRAMMS O’Cathain A, Murphy E, Nicholl J. The quality of mixed methods studies in health services research. J Health Serv Res Policy. 2008;13(2):92-98. PMID: 18416914 (110).

Study Protocols: Defining standard protocol items for clinical trials SPIRIT http://www.spirit-statement.org/ The full-text of the SPIRIT 2013 Statement

The full-text of the SPIRIT 2013 Statement is available from: http://www.spirit-statement.org/publications-downloads/ Chan A-W, Tetzlaff JM, Altman DG, Laupacis A, Gøtzsche PC, Krleža-Jerić K, Hróbjartsson A, Mann H, Dickersin K, Berlin J, Doré C, Parulekar W, Summerskill W, Groves T, Schulz K, Sox H, Rockhold FW, Rennie D, Moher D. SPIRIT 2013 Statement: Defining standard protocol items for clinical trials. Ann Intern Med. 2013;158(3):200-207. PMID: 23295957 (111).

Systematic review and meta-analysis protocols PRISMA-P PRISMA-P checklist (Word) Moher D, Shamseer L, Clarke M, Ghersi D, Liberati A, Petticrew M, Shekelle P, Stewart LA. Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4(1):1. PMID: 25554246 (114)

Systematic reviews in health care http://www.york.ac.uk/inst/crd/index_guidance.htm

Statistical methods and analyses: Basic statistical reporting for articles published in biomedical journals SAMPL SAMPL Guidelines (PDF) Lang TA, Altman DG. Basic Statistical Reporting for Articles Published in Biomedical Journals: The “Statistical Analyses and Methods in the Published Literature” or SAMPL Guidelines. In: Smart P, Maisonneuve H, Polderman A (eds). Science Editors’ Handbook. European Association of Science Editors, 2013.

Int J Nurs Stud. 2015 Jan;52(1):5-9. PMID: 25441757 (116)

Appendix

TABLE A1 Glossary of Terms

STARD: STAndards for Reporting of Diagnostic accuracy
TRIPOD: Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis
GRRAS: Guidelines for Reporting Reliability and Agreement Studies
STROBE: STrengthening the Reporting of OBservational studies in Epidemiology
RECORD: REporting of studies Conducted using Observational Routinely-collected Data
CARE: CAse REport
CHERRIES: Checklist for Reporting Results of Internet E-Surveys
CONSORT: CONsolidated Standards Of Reporting Trials
CONSORT PRO: CONsolidated Standards of Reporting Trials Patient-Reported Outcomes
TREND: Transparent Reporting of Evaluations with Nonrandomized Designs
SQUIRE: Standards for Quality Improvement Reporting Excellence
COREQ: COnsolidated criteria for REporting Qualitative research
STARE-HI: STAtement on the Reporting of Evaluation studies in Health Informatics
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
PRISMA-P: Preferred Reporting Items for Systematic Reviews and Meta-Analyses-Protocols
MOOSE: Meta-analysis Of Observational Studies in Epidemiology
ENTREQ: ENhancing Transparency in REporting the synthesis of Qualitative research
CHEERS: Consolidated Health Economic Evaluation Reporting Standards
GRAMMS: Good Reporting of A Mixed Methods Study
SPIRIT: Standard Protocol Items: Recommendations for Interventional Trials
SAMPL: Statistical Analyses and Methods in the Published Literature

References

  • 1. Mallett S., Halligan S., Thompson M., et. al.: Interpreting diagnostic accuracy studies for patient care. BMJ 2012; 345: pp. e3999.

  • 2. The TRIPOD Website : Available at http://www.tripod-statement.org/ Accessed March 1, 2015

  • 3. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Clin Chem Lab Med 2003; 41: pp. 68-73.

  • 4. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. BMJ 2003; 326: pp. 41-44.

  • 5. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Radiology 2003; 226: pp. 24-28.

  • 6. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Ann Intern Med 2003; 138: pp. 40-44.

  • 7. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Toward complete and accurate reporting of studies of diagnostic accuracy. The STARD initiative. Am J Clin Pathol 2003; 119: pp. 18-22.

  • 8. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Clin Biochem 2003; 36: pp. 2-7.

  • 9. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Toward complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Acad Radiol 2003; 10: pp. 664-669.

  • 10. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. AJR Am J Roentgenol 2003; 181: pp. 51-55.

  • 11. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Clin Radiol 2003; 58: pp. 575-580.

  • 12. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Croat Med J 2003; 44: pp. 635-638.

  • 13. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Fam Pract 2004; 21: pp. 4-10.

  • 14. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Clin Chem 2003; 49: pp. 1-6.

  • 15. The STARD Website : Available at http://www.stard-statement.org/ Accessed June 25, 2014

  • 16. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: STARD 2015: an updated list of essential items for reporting diagnostic accuracy studies. BMJ 2015; 351: h5527

  • 17. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: STARD 2015: an updated list of essential items for reporting diagnostic accuracy studies. Radiology 2015; 277: pp. 826-832.

  • 18. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al.: STARD 2015: an updated list of essential items for reporting diagnostic accuracy studies. Clin Chem 2015; 61: pp. 1446-1452.

  • 19. Cronin P., Rawson J.V., Heilbrun M.E., et. al.: How to report a research study. Acad Radiol 2014; 21: pp. 1088-1116.

  • 20. The EQUATOR Network Website : Available at http://www.equator-network.org/ Accessed December 21, 2013

  • 21. The STARD checklist : Available at http://www.stard-statement.org/ Accessed January 4, 2016

  • 22. The STARD Flow Diagram : Available at http://www.stard-statement.org/ Accessed January 4, 2016

  • 23. The EQUATOR Network Website : Available at http://www.equator-network.org/reporting-guidelines/tripod-statement/ Accessed March 9, 2015

  • 24. Kottner J., Audige L., Brorson S., et. al.: Guidelines for Reporting Reliability and Agreement Studies (GRRAS) were proposed. J Clin Epidemiol 2011; 64: pp. 96-106.

  • 25. Kottner J., Audige L., Brorson S., et. al.: Guidelines for Reporting Reliability and Agreement Studies (GRRAS) were proposed. Int J Nurs Stud 2011; 48: pp. 661-671.

  • 26. The STROBE Website : Available at http://www.strobe-statement.org/ Accessed June 25, 2014

  • 27. von Elm E., Altman D.G., Egger M., et. al.: The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Ann Intern Med 2007; 147: pp. 573-577.

  • 28. von Elm E., Altman D.G., Egger M., et. al.: The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Med 2007; 4: pp. e296.

  • 29. von Elm E., Altman D.G., Egger M., et. al.: Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. BMJ 2007; 335: pp. 806-808.

  • 30. von Elm E., Altman D.G., Egger M., et. al.: The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Prev Med 2007; 45: pp. 247-251.

  • 31. von Elm E., Altman D.G., Egger M., et. al.: The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Epidemiology 2007; 18: pp. 800-804.

  • 32. von Elm E., Altman D.G., Egger M., et. al.: The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Lancet 2007; 370: pp. 1453-1457.

  • 33. von Elm E., Altman D.G., Egger M., et. al.: The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. J Clin Epidemiol 2008; 61: pp. 344-349.

  • 34. von Elm E., Altman D.G., Egger M., et. al.: The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Bull World Health Organ 2007; 85: pp. 867-872.

  • 35. Jabs D.A.: Improving the reporting of clinical case series. Am J Ophthalmol 2005; 139: pp. 900-905.

  • 36. Kempen J.H.: Appropriate use and reporting of uncontrolled case series in the medical literature. Am J Ophthalmol 2011; 151: pp. 7-10.e1.

  • 37. Sorinola O., Olufowobi O., Coomarasamy A., et. al.: Instructions to authors for case reporting are limited: a review of a core journal list. BMC Med Educ 2004; 4: pp. 4.

  • 38. Gagnier J.J., Kienle G., Altman D.G., et. al.: The CARE guidelines: consensus-based clinical case reporting guideline development. BMJ Case Rep 2013; 2013:

  • 39. Gagnier J.J., Riley D., Altman D.G., et. al.: The CARE guidelines: consensus-based clinical case reporting guideline development. Dtsch Arztebl Int 2013; 110: pp. 603-608.

  • 40. Gagnier J.J., Kienle G., Altman D.G., et. al.: The CARE guidelines: consensus-based clinical case reporting guideline development. J Med Case Rep 2013; 7: pp. 223.

  • 41. Gagnier J.J., Kienle G., Altman D.G., et. al.: The CARE guidelines: consensus-based clinical case report guideline development. J Clin Epidemiol 2014; 67: pp. 46-51.

  • 42. Gagnier J.J., Kienle G., Altman D.G., et. al.: The CARE guidelines: consensus-based clinical case report guideline development. J Diet Suppl 2013; 10: pp. 381-390.

  • 43. The CARE Website : Available at http://www.care-statement.org/ Accessed June 25, 2014

  • 44. The EQUATOR Network Website : Available at http://www.equator-network.org/?post_type=eq_guidelines&eq_guidelines_study_design=observational-studies&eq_guidelines_clinical_specialty=0&eq_guidelines_report_section=0&s= Accessed March 1, 2015

  • 45. Perry D.C., Parsons N., Costa M.L.: “Big data” reporting guidelines: how to answer big questions, yet avoid big problems. Bone Joint J 2014; 96-B: pp. 1575-1577.

  • 46. RECORD Website : Available at http://record-statement.org/ Accessed January 4, 2016

  • 47. The EQUATOR Network Website : Available at http://www.equator-network.org/reporting-guidelines/record/ Accessed January 4, 2016

  • 48. Berger M.L., Mamdani M., Atkins D., et. al.: Good research practices for comparative effectiveness research: defining, reporting and interpreting nonrandomized studies of treatment effects using secondary data sources: the ISPOR Good Research Practices for Retrospective Database Analysis Task Force Report—Part I. Value Health 2009; 12: pp. 1044-1052.

  • 49. Cox E., Martin B.C., Van Staa T., et. al.: Good research practices for comparative effectiveness research: approaches to mitigate bias and confounding in the design of nonrandomized studies of treatment effects using secondary data sources: the International Society for Pharmacoeconomics and Outcomes Research Good Research Practices for Retrospective Database Analysis Task Force Report—Part II. Value Health 2009; 12: pp. 1053-1061.

  • 50. Johnson M.L., Crown W., Martin B.C., et. al.: Good research practices for comparative effectiveness research: analytic methods to improve causal inference from nonrandomized studies of treatment effects using secondary data sources: the ISPOR Good Research Practices for Retrospective Database Analysis Task Force Report—Part III. Value Health 2009; 12: pp. 1062-1073.

  • 51. Agency for Healthcare Research and Quality Website : Available at http://effectivehealthcare.ahrq.gov/index.cfm/search-for-guides-reviews-and-reports/?pageaction=displayproduct&productid=318 Accessed January 4, 2016

  • 52. Agency for Healthcare Research and Quality Website : Available at http://www.effectivehealthcare.ahrq.gov/ehc/products/440/1166/User-Guide-Observational-CER-130113.pdf Accessed January 4, 2016

  • 53. Kelley K., Clark B., Brown V., et. al.: Good practice in the conduct and reporting of survey research. Int J Qual Health Care 2003; 15: pp. 261-266.

  • 54. Eysenbach G.: Improving the quality of web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). J Med Internet Res 2004; 6: pp. e34.

  • 55. Burns K.E., Duffett M., Kho M.E., et al.: A guide for the design and conduct of self-administered surveys of clinicians. CMAJ 2008; 179: pp. 245-252.

  • 56. Aberle D.R., Berg C.D., Black W.C., et al.: The National Lung Screening Trial: overview and study design. Radiology 2011; 258: pp. 243-253.

  • 57. The CONSORT Website: Available at http://www.consort-statement.org/ Accessed June 25, 2014.

  • 58. Schulz K.F., Altman D.G., Moher D.: CONSORT 2010 statement: updated guidelines for reporting parallel group randomized trials. Ann Intern Med 2010; 152: pp. 726-732.

  • 59. Schulz K.F., Altman D.G., Moher D.: CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials. BMC Med 2010; 8: pp. 18.

  • 60. Schulz K.F., Altman D.G., Moher D.: CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMJ 2010; 340: pp. c332.

  • 61. Schulz K.F., Altman D.G., Moher D.: CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials. J Clin Epidemiol 2010; 63: pp. 834-840.

  • 62. Schulz K.F., Altman D.G., Moher D.: CONSORT 2010 statement: updated guidelines for reporting parallel group randomized trials. Obstet Gynecol 2010; 115: pp. 1063-1070.

  • 63. Schulz K.F., Altman D.G., Moher D.: CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. PLoS Med 2010; 7: pp. e1000251.

  • 64. Schulz K.F., Altman D.G., Moher D.: CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials. Trials 2010; 11: pp. 32.

  • 65. Schulz K.F., Altman D.G., Moher D.: CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. Int J Surg 2011; 9: pp. 672-677.

  • 66. Campbell M.K., Piaggio G., Elbourne D.R., et al.: Consort 2010 statement: extension to cluster randomised trials. BMJ 2012; 345: pp. e5661.

  • 67. The CONSORT Website: Available at http://www.consort-statement.org/extensions?ContentWidgetId=554 Accessed March 1, 2015.

  • 68. Piaggio G., Elbourne D.R., Pocock S.J., et al.: Reporting of noninferiority and equivalence randomized trials: extension of the CONSORT 2010 statement. JAMA 2012; 308: pp. 2594-2604.

  • 69. The CONSORT Website: Available at http://www.consort-statement.org/extensions?ContentWidgetId=555 Accessed March 1, 2015.

  • 70. Boutron I., Moher D., Altman D.G., et al.: Methods and processes of the CONSORT Group: example of an extension for trials assessing nonpharmacologic treatments. Ann Intern Med 2008; 148: pp. W60-W66.

  • 71. The CONSORT Website: Available at http://www.consort-statement.org/extensions?ContentWidgetId=558 Accessed March 1, 2015.

  • 72. Ioannidis J.P., Evans S.J., Gotzsche P.C., et al.: Better reporting of harms in randomized trials: an extension of the CONSORT statement. Ann Intern Med 2004; 141: pp. 781-788.

  • 73. The CONSORT Website: Available at http://www.consort-statement.org/extensions?ContentWidgetId=561 Accessed March 1, 2015.

  • 74. Zwarenstein M., Treweek S., Gagnier J.J., et al.: Improving the reporting of pragmatic trials: an extension of the CONSORT statement. BMJ 2008; 337: pp. a2390.

  • 75. The EQUATOR Network Website: Available at http://www.equator-network.org/reporting-guidelines/improving-the-reporting-quality-of-nonrandomized-evaluations-of-behavioral-and-public-health-interventions-the-trend-statement/ Accessed March 9, 2015.

  • 76. The CONSORT Website: Available at http://www.consort-statement.org/extensions?ContentWidgetId=556 Accessed March 1, 2015.

  • 77. Lee C.I., Jarvik J.G.: Patient-centered outcomes research in radiology: trends in funding and methodology. Acad Radiol 2014; 21: pp. 1156-1161.

  • 78. Calvert M., Blazeby J., Altman D.G., et al.: Reporting of patient-reported outcomes in randomized trials: the CONSORT PRO extension. JAMA 2013; 309: pp. 814-822.

  • 79. The CONSORT Website: Available at http://www.consort-statement.org/extensions?ContentWidgetId=560 Accessed March 1, 2015.

  • 80. Des Jarlais D.C., Lyles C., Crepaz N., et al.: Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND statement. Am J Public Health 2004; 94: pp. 361-366.

  • 81. The Centers for Disease Control and Prevention Website: Available at http://www.cdc.gov/trendstatement/ Accessed March 1, 2015.

  • 82. The SQUIRE Website: Available at http://squire-statement.org Accessed June 25, 2014.

  • 83. Davidoff F., Batalden P., Stevens D., et al.: Publication guidelines for quality improvement in health care: evolution of the SQUIRE project. Qual Saf Health Care 2008; 17: pp. i3-i9.

  • 84. Davidoff F., Batalden P., Stevens D., et al.: Publication guidelines for quality improvement studies in health care: evolution of the SQUIRE project. BMJ 2009; 338: pp. a3152.

  • 85. Davidoff F., Batalden P.B., Stevens D.P., et al.: Development of the SQUIRE Publication Guidelines: evolution of the SQUIRE project. Jt Comm J Qual Patient Saf 2008; 34: pp. 681-687.

  • 86. Davidoff F., Batalden P., Stevens D., et al.: Publication guidelines for improvement studies in health care: evolution of the SQUIRE Project. Ann Intern Med 2008; 149: pp. 670-676.

  • 87. Davidoff F., Batalden P., Stevens D., et al.: Publication guidelines for quality improvement studies in health care: evolution of the SQUIRE project. J Gen Intern Med 2008; 23: pp. 2125-2130.

  • 88. Tong A., Sainsbury P., Craig J.: Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care 2007; 19: pp. 349-357.

  • 89. Talmon J., Ammenwerth E., Brender J., et al.: STARE-HI—Statement on reporting of evaluation studies in Health Informatics. Int J Med Inform 2009; 78: pp. 1-9.

  • 90. The EQUATOR Network Website: Available at http://www.equator-network.org/reporting-guidelines/stare-hi-statement-on-reporting-of-evaluation-studies-in-health-informatics/ Accessed March 1, 2015.

  • 91. Moher D., Liberati A., Tetzlaff J., et al.: Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 2009; 6: pp. e1000097.

  • 92. Moher D., Liberati A., Tetzlaff J., et al.: Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ 2009; 339: pp. b2535.

  • 93. Moher D., Liberati A., Tetzlaff J., et al.: Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med 2009; 151: pp. 264-269, W64.

  • 94. Moher D., Liberati A., Tetzlaff J., et al.: Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. J Clin Epidemiol 2009; 62: pp. 1006-1012.

  • 95. The PRISMA Website: Available at http://www.prisma-statement.org Accessed June 25, 2014.

  • 96. Stroup D.F., Berlin J.A., Morton S.C., et al.: Meta-analysis of observational studies in epidemiology: a proposal for reporting. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) group. JAMA 2000; 283: pp. 2008-2012.

  • 97. The EQUATOR Network Website: Available at http://www.equator-network.org/reporting-guidelines/meta-analysis-of-observational-studies-in-epidemiology-a-proposal-for-reporting-meta-analysis-of-observational-studies-in-epidemiology-moose-group/ Accessed March 1, 2015.

  • 98. Riley R.D., Lambert P.C., Abo-Zaid G.: Meta-analysis of individual participant data: rationale, conduct, and reporting. BMJ 2010; 340: pp. c221.

  • 99. Foerster B.R., Dwamena B.A., Petrou M., et al.: Diagnostic accuracy of diffusion tensor imaging in amyotrophic lateral sclerosis: a systematic review and individual patient data meta-analysis. Acad Radiol 2013; 20: pp. 1099-1106.

  • 100. Tong A., Flemming K., McInnes E., et al.: Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Med Res Methodol 2012; 12: pp. 181.

  • 101. Husereau D., Drummond M., Petrou S., et al.: Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. Eur J Health Econ 2013; 14: pp. 367-372.

  • 102. Husereau D., Drummond M., Petrou S., et al.: Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. Value Health 2013; 16: pp. e1-e5.

  • 103. Husereau D., Drummond M., Petrou S., et al.: Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. Clin Ther 2013; 35: pp. 356-363.

  • 104. Husereau D., Drummond M., Petrou S., et al.: Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. Cost Eff Resour Alloc 2013; 11: pp. 6.

  • 105. Husereau D., Drummond M., Petrou S., et al.: Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. BMC Med 2013; 11: pp. 80.

  • 106. Husereau D., Drummond M., Petrou S., et al.: Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. BMJ 2013; 346: pp. f1049.

  • 107. Husereau D., Drummond M., Petrou S., et al.: Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. Pharmacoeconomics 2013; 31: pp. 361-367.

  • 108. Husereau D., Drummond M., Petrou S., et al.: Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. J Med Econ 2013; 16: pp. 713-719.

  • 109. Husereau D., Drummond M., Petrou S., et al.: Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. Int J Technol Assess Health Care 2013; 29: pp. 117-122.

  • 110. Husereau D., Drummond M., Petrou S., et al.: Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. BJOG 2013; 120: pp. 765-770.

  • 111. The EQUATOR Network Website: Available at http://www.equator-network.org/reporting-guidelines/cheers/ Accessed March 1, 2015.

  • 112. The International Society For Pharmacoeconomics and Outcomes Research Website: Available at http://www.ispor.org/workpaper/CHEERS/revised-CHEERS-Checklist-Oct13.pdf Accessed March 1, 2015.

  • 113. O’Cathain A., Murphy E., Nicholl J.: The quality of mixed methods studies in health services research. J Health Serv Res Policy 2008; 13: pp. 92-98.

  • 114. Chan A.W., Tetzlaff J.M., Altman D.G., et al.: SPIRIT 2013 statement: defining standard protocol items for clinical trials. Ann Intern Med 2013; 158: pp. 200-207.

  • 115. Chiavaras M.M., Jacobson J.A., Carlos R., et al.: IMpact of Platelet Rich plasma OVer alternative therapies in patients with lateral Epicondylitis (IMPROVE): protocol for a multicenter randomized controlled study: a multicenter, randomized trial comparing autologous platelet-rich plasma, autologous whole blood, dry needle tendon fenestration, and physical therapy exercises alone on pain and quality of life in patients with lateral epicondylitis. Acad Radiol 2014; 21: pp. 1144-1155.

  • 116. The SPIRIT Website: Available at http://www.spirit-statement.org/ Accessed June 25, 2014.

  • 117. Moher D., Shamseer L., Clarke M., et al.: Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev 2015; 4:

  • 118. The EQUATOR Network Website: Available at http://www.equator-network.org/reporting-guidelines/prisma-protocols/ Accessed March 1, 2015.

  • 119. Lang T.A., Altman D.G.: Basic statistical reporting for articles published in biomedical journals: the “Statistical Analyses and Methods in the Published Literature” or the SAMPL Guidelines. Int J Nurs Stud 2015; 52: pp. 5-9.

  • 120. Moons K.G., Altman D.G., Reitsma J.B., et al.: Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD): explanation and elaboration. Ann Intern Med 2015; 162: pp. W1-W73.

  • 121. Hopewell S., Clarke M., Moher D., et al.: CONSORT for reporting randomised trials in journal and conference abstracts. Lancet 2008; 371: pp. 281-283.

  • 122. Beller E.M., Glasziou P.P., Altman D.G., et al.: PRISMA for abstracts: reporting systematic reviews in journal and conference abstracts. PLoS Med 2013; 10: pp. e1001419.

This post is licensed under CC BY 4.0 by the author.