
Evaluating and Writing Education Papers Compared with Noneducation Papers

The field of medical education research is growing rapidly, but educational research has been widely criticized for its lack of a scientific approach, poor theoretical frameworks or study designs, deficient research methods and reporting quality, and lack of meaningful outcomes that would inform practice. There have been recent calls for greater accountability and return on investment for all research efforts and clinical practice. The impact of medical education on important health outcomes such as patient care is unclear but likely underestimated. A better understanding of the role and structure of medical education research is called for, and in this review, the author outlines a framework for reading, reviewing, and (hopefully) pursuing and carrying out medical education research. Medical education research methodologies are discussed, along with guidelines for reading articles. A structured guideline is suggested and provided for interested readers and reviewers of educational research. Although challenges remain, there are ample opportunities to expand and improve medical education research and to bring it to full fruition alongside traditional clinical research. Education is critical to important outcomes, and a greater emphasis should be placed on understanding medical education research.

Medical education research is growing rapidly, as can be seen from a search of the literature or a review of the list of medical journals dedicated to education. However, (medical) educational research is widely criticized for not informing practice, for a lack of scientific approach, for poor theoretical frameworks or study designs, for deficient research methods and reporting quality, and for a lack of meaningful outcomes. There have been calls for greater accountability and return on investment for medical education (research) efforts.

The main goal of medical education is to produce physicians who will deliver high-quality health care. A recent Commonwealth Fund report stated that the quality of patient care is determined to some extent by the quality of medical education that students and residents receive. The connection between medical education and clinical outcomes is less clear than for mainstream scientific medical literature. More high-quality medical education research is needed to demonstrate the impact of medical education on physician quality of care and patient clinical outcomes. This evidence will be necessary to justify increased funding and resource allocation toward medical education research studies, given the current health care cost constraints.

The Agency for Healthcare Research and Quality and the Bureau of Health Professions of the Health Resources and Services Administration convened a meeting of experts to discuss medical education outcomes research in 2002. Medical education researchers, outcomes researchers, and stakeholder organizations met to determine the measurement of clinical outcomes of medical education, including physician training, the quality of patient care, and the impact of medical educational research on patient-centered clinical outcomes. An outcomes-type research framework was proposed for medical education, linking it at several organizational levels (society, health care systems, individual practitioners, teachers and educators, residents, medical students, and patients) to outcomes (including satisfaction, performance, knowledge, professionalism, quality of life, and cost). The Medical School Outcomes Project of the Association of American Medical Colleges and the Outcome Project of the Accreditation Council for Graduate Medical Education both reflect an acknowledgment of the need to examine medical training and ensure the quality of the graduates of medical education programs. The 2001 Institute of Medicine report Crossing the Quality Chasm: A New Health System for the 21st Century highlighted the need for medical education and workforce training to be reoriented to address health care quality and to develop strategies for restructuring clinical education to fit current health care needs. One recommendation was focused on studying the link between quality outcomes and training, given that the foundation for quality health care rests on developing good communication skills, interdisciplinary collaboration, evidence-based practice, knowledge management tools, and shared decision making across the full range of care settings.

Necessary elements for educational research include theoretical frameworks, rigorous study designs, and meaningful outcomes. However, medical education research faces many unique challenges that can affect the performance and quality of reporting of studies. These include insufficient resources and funding sources, limited research training and experience among educators, small sample sizes, outcomes that are difficult to define and measure, and difficulty navigating the institutional review board process, to name but a few. The latency of educational effects, the difficulty of standardizing educational interventions, and individual variability all make outcomes difficult to measure.

Peer review lies at the core of academic literature and is the main mechanism that journals use to ensure quality. Guidelines exist in the literature for reporting randomized controlled trials (RCTs), prospective or cohort studies, studies of diagnostic accuracy, and meta-analyses, but to the best of my knowledge, none are specifically designed for reporting medical education research. A few authors have performed systematic reviews of the effects of educational methodologies, but no guidelines have been developed to help improve the scientific and ethical validity of educational research.


Medical education research manuscript review

Developing a Guideline to Use When Reviewing Medical Education Research Manuscripts


Study Title, Authors, and Abstract


Introduction or Background Section


Materials and Methods (Research Design)


Table 1

Medical Education Research Designs Compared to Clinical Research Study Designs

| Clinical Studies | Education Studies | Example |
| --- | --- | --- |
| Randomized controlled trial | Randomized controlled trial | Randomizing half of a class of students to extra teaching, such as case-based review of imaging studies, while the other half has the usual didactic teaching only |
| Experimental/observational/prospective cohort study | Pretest-posttest control group | Dividing a class of residents into two halves and aiming to keep possible confounder variables equal between the groups; testing both groups before and after half of them receive an educational intervention such as simulator training and use |
| | Posttest-only control group | Dividing a class of students into two halves and aiming to keep possible confounder variables equal between the groups; testing both groups after half of them receive an educational intervention such as web-based interactive modules |
| Observational/prospective cohort study | Nonequivalent control group design | Taking two groups of residents from the same program, one group of whom will have extra teaching, but they cannot be matched for possible confounding variables; testing both groups after the educational intervention |
| | Separate-sample pretest-posttest design | Taking two groups of medical students from different classes or schools, one group of whom will have extra teaching, but they cannot be matched for possible confounding variables; testing both groups before and after the educational intervention |
| Pre-experimental studies | Time-series designs | Taking a group of residents and testing their knowledge of on-call cases repeatedly during their first year of call |
| | One-group pretest-posttest design | Taking a group of medical students who signed up for a radiology elective and testing their radiology knowledge before and after they do the elective |
| | One-group posttest design | Taking a group of medical students who have already done a radiology elective and testing their radiology knowledge after the elective |
| Cross-sectional study | Static group comparison design | Surveying an existing group of students after some have received health information education materials |
| No equivalent in clinical medicine | Solomon four-group design | Taking a group of residents and dividing them into two smaller groups, one of which undergoes the intervention and the other is the control group; within each group, half of the residents take a pretest, and all residents take the posttest |
| Retrospective case-control study | Retrospective case-control study | Looking at the teaching efforts of junior faculty members who work in the radiology department and comparing teaching scores of those who trained in a university hospital with those who trained in a private practice setting |
| Case series or report | Case study | Looking at an individual or a small group and gathering detailed information about a process (policies) or a phenomenon (learning about professionalism), such as from individual interviews or focus groups |
| Descriptive study | Descriptive study | Describing a new method of teaching, such as problem-based learning, simulators, or interactive web modules |
| Consensus proceedings guidelines | Consensus proceedings guidelines | An organization convenes a meeting of experts in the field, who draw up guidelines after discussion and publish the guidelines |
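To make the pretest-posttest control group design concrete, the sketch below randomizes a class into two arms and summarizes gain scores (posttest minus pretest) per arm. This is a minimal illustration, not a recommended analysis pipeline; the resident names, gain scores, and random seed are all invented for the example.

```python
import random
import statistics

def randomize(participants, seed=0):
    """Randomly split a class into intervention and control arms."""
    rng = random.Random(seed)  # seeded for a reproducible illustration
    shuffled = rng.sample(list(participants), k=len(participants))
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Hypothetical class of eight residents (names are invented).
residents = [f"resident_{i}" for i in range(1, 9)]
intervention, control = randomize(residents, seed=42)

# Hypothetical gain scores (posttest minus pretest) collected after the
# intervention arm received simulator training; all numbers are invented.
gains = {
    "intervention": [16, 11, 15, 11],
    "control": [6, 4, 6, 5],
}

for arm, scores in gains.items():
    print(f"{arm}: mean gain {statistics.mean(scores):.2f} "
          f"(sd {statistics.stdev(scores):.2f})")
```

Randomizing the assignment (rather than letting students self-select, as in the one-group elective designs above) is what lets the gain-score comparison be attributed to the intervention rather than to confounders.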


Materials and Methods (Validity)


Materials and Methods (Outcome Measures)


Data Collection


Data Analysis


Results


Discussion


Table 2

Suggested Review Guide for Readers, Reviewers, and Prospective Authors of Educational Papers

| Element | Guidelines |
| --- | --- |
| Title | Is the title appropriate for the paper? |
| Abstract | Is it structured? Does it match the text? |
| Suitability | Is the study suitable to the journal's readership? |
| Introduction | Do the introduction and background make a case for the problem statement or study aim and put it in its appropriate context? Does the introduction or background include an appropriate literature review? |
| Conceptual framework | A description of the conceptual framework or theoretical background should identify the key variables or factors and illustrate their relationship to one another. |
| Research question | Is the problem statement (study aim or research question) clear and well articulated? Is the study question relevant and/or important? |
| Study outcome | Is the study outcome relevant and/or important? |
| Study design | Is the research design appropriate (or as optimal as possible) to answer the research question? Is the research design plausible, given the research question, the intellectual context of the study, and the practical circumstances in which the study is conducted? |
| Internal validity | Does the research have internal validity (ie, integrity) to address the question rigorously? Are there any biases that limit the integrity or validity of the study? |
| External validity | Is the research generalizable? Does the research have external validity? Does the research design permit unexpected outcomes or events to occur? |
| Measurement instrument | Is the study measurement instrument appropriate? Are the measurement scales used appropriate? Are the cutoffs or thresholds used appropriate? |
| Statistical analysis | Is it appropriate for the data collected? Is there an appropriate balance of descriptive and inferential statistics? |
| Illustrations | Are tables, graphs, and images used appropriately? |
| Discussion | Was a suitable, thorough literature search performed, with comparisons and contrasts to prior studies? |
| Conclusions | Are the conclusions justified? |
| Limitations | Are limitations discussed? |
| Sources of funding | Are these declared? Is there a conflict of interest? |
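A review guide of this kind is easy to turn into a structured checklist that a reviewer can score element by element. The sketch below encodes a few of the elements as questions and tallies "yes" answers; the element names follow the guide, but the abbreviated question set, the scoring scheme, and the sample answers are all invented for illustration.

```python
# A minimal sketch of the review guide as a structured checklist;
# only a subset of elements is shown, and the scoring is an assumption.
REVIEW_GUIDE = {
    "Title": ["Is the title appropriate for the paper?"],
    "Abstract": ["Is it structured?", "Does it match the text?"],
    "Research question": [
        "Is the problem statement clear and well articulated?",
        "Is the study question relevant and/or important?",
    ],
    "Study design": [
        "Is the research design appropriate to answer the research question?",
    ],
    "Limitations": ["Are limitations discussed?"],
}

def review_summary(answers):
    """Count 'yes' answers per element; answers maps question -> bool."""
    return {
        element: sum(answers.get(q, False) for q in questions)
        for element, questions in REVIEW_GUIDE.items()
    }

# Example: one reviewer's answers for a hypothetical manuscript.
answers = {
    "Is the title appropriate for the paper?": True,
    "Is it structured?": True,
    "Does it match the text?": False,
    "Are limitations discussed?": True,
}
print(review_summary(answers))
```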

Table 3

Top Reasons to Reject Medical Education Research Manuscripts

Source: Bordage.

1. Statistics (inappropriate, incomplete, insufficiently described)
2. Overinterpretation of results
3. Instrument (inappropriate, suboptimal, insufficiently described)
4. Sample too small or biased
5. Text too difficult to follow or understand
6. Insufficient or incomplete problem statement
7. Inaccurate or inconsistent data reported
8. Review of literature (inadequate, incomplete, inaccurate, outdated)
9. Insufficient data presented
10. Defective tables or figures

Table 4

Top Reasons to Accept Medical Education Research Manuscripts

Source: Bordage.

1. Problem (important, timely, relevant, critical)
2. Well-written manuscript
3. Well-designed study
4. Review of literature (thoughtful, focused, up to date)
5. Sample size sufficiently large
6. Practical, useful implications
7. Interpretation took into account the limitations of the study
8. Problem well stated and formulated
9. Novel, unique approach to data analysis


Summary


References

  • 1. Cook D.A.: Quality of reporting of experimental studies in medical education: a systematic review. Med Educ 2007; 41: pp. 737-745.

  • 2. Chen F.M., Bauchner H., Burstin H.: A call for outcomes research in medical education. Acad Med 2004; 79: pp. 955-960.

  • 3. Yarris L.M., Deiorio N.M.: Education research: a primer for educators in emergency medicine. Acad Emerg Med 2011; 18: pp. S27-S35.

  • 4. Institute of Medicine, Committee on Quality of Health Care in America: Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press; 2001.

  • 5. Bordage G., Caelleigh A.S., Steinecke A., et. al., Joint Task Force of Academic Medicine and the GEA-RIME Committee: Review criteria for research manuscripts. Acad Med 2001; 76: pp. 897-978.

  • 6. Begg C., Cho M., Eastwood S., et. al.: Improving the quality of reporting of randomized controlled trials. The CONSORT statement. JAMA 1996; 276: pp. 637-639.

  • 7. Schulz K.F., Altman D.G., Moher D., CONSORT Group: CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMC Med 2010; 8: pp. 18.

  • 8. Clarke M.: The QUORUM statement. Lancet 2000; 355: pp. 756-757.

  • 9. Bossuyt P.M., Reitsma J.B., Bruns D.E., et. al., STARD Group: Toward complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Acad Radiol 2003; 10: pp. 664-669.

  • 10. von Elm E., Altman D.G., Egger M., et. al., STROBE Initiative: The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Lancet 2007; 370: pp. 1453-1457.

  • 11. Stroup D.F., Berlin J.A., Morton S.C., et. al., Meta-Analysis of Observational Studies in Epidemiology (MOOSE) Group: Meta-analysis of observational studies in epidemiology: a proposal for reporting. JAMA 2000; 283: pp. 2008-2012.

  • 12. Davidoff F., Batalden P., Stevens D., et. al., Standards for Quality Improvement Reporting Excellence Development Group: Publication guidelines for quality improvement studies in health care: evolution of the SQUIRE project [published erratum appears in J Gen Intern Med 2009; 24:147]. J Gen Intern Med 2008; 23: pp. 2125-2130.

  • 13. Tong A., Sainsbury P., Craig J.: Consolidated Criteria for Reporting Qualitative Research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care 2007; 19: pp. 349-357.

  • 14. International Committee of Medical Journal Editors: Uniform requirements for manuscripts submitted to biomedical journals. CMAJ 1995; 152: pp. 1459-1473.

  • 15. International Committee of Medical Journal Editors. Uniform requirements for manuscripts submitted to biomedical journals: writing and editing for biomedical publication. Available at: http://www.icmje.org/urm_full.pdf . Accessed January 15, 2012.

  • 16. Ad Hoc Working Group for Critical Appraisal of the Medical Literature: A proposal for more informative abstracts of clinical articles. Ann Intern Med 1987; 106: pp. 598-604.

  • 17. Bordage G.: Conceptual frameworks to illuminate and magnify. Med Educ 2009; 43: pp. 312-319.

  • 18. Dauphinee W.D., Wood-Dauphinee S.: The need for evidence in medical education: the development of best evidence medical education as an opportunity to inform, guide, and sustain medical education research. Acad Med 2004; 79: pp. 925-930.

  • 19. Hulley S.B., Cummings S.R., Browner W.S., et. al.: Conceiving the research question. In: Designing clinical research. Philadelphia, PA: Lippincott Williams & Wilkins; 2001: pp. 335.

  • 20. Glasson C., Chapman K., Gander K., et. al.: The efficacy of a brief, peer-led nutrition education intervention in increasing fruit and vegetable consumption: a wait-list, community-based randomised controlled trial. Public Health Nutr 2012; 8: pp. 1-9.

  • 21. Relyea-Chew A., Talner L.B.: A dedicated general competencies curriculum for radiology residents development and implementation. Acad Radiol 2011; 18: pp. 650-654.

  • 22. McCambridge J., Butor-Bhavsar K., Witton J., et. al.: Can research assessments themselves cause bias in behaviour change trials? A systematic review of evidence from solomon 4-group studies. PLoS One 2011; 6: pp. e25223.

  • 23. Eva K.: Issues to consider when planning and conducting educational research. J Dent Ed 2004; 68: pp. 316-323.

  • 24. Lynch D.C.: A rationale for using synthetic designs in medical education research. Adv Health Sci Educ 2000; 5: pp. 93-103.

  • 25. Curry L.A.: Qualitative and mixed methods provide unique contributions to outcomes research. Circulation 2009; 119: pp. 1442-1452.

  • 26. Kern D.E., Thomas P.A., Howard D.M., et. al.: Curriculum development for medical education: a six-step approach. Baltimore, MD: Johns Hopkins University Press; 1998: pp. 178.

  • 27. Huang G.C., Newman L.R., Tess A.V., et. al.: Teaching patient safety: conference proceedings and consensus statements of the Millennium Conference 2009. Teach Learn Med 2011; 23: pp. 172-178.

  • 28. Huang G., Newman L., Anderson M.B., et. al.: Conference proceedings and consensus statements of the Millennium Conference 2007: a collaborative approach to educational research. Teach Learn Med 2010; 22: pp. 50-55.

  • 29. Huang G.C., Gordon J.A., Schwartzstein R.M.: Millennium Conference 2005 on medical simulation: a summary report. Simul Healthc 2007; 2: pp. 88-95.

  • 30. Cook D.A.: Description, justification and clarification: a framework for classifying the purposes of research in medical education. Med Educ 2008; 42: pp. 128-133.

  • 31. Sica G.T.: Bias in research studies. Radiology 2006; 238: pp. 780-789.

  • 32. Prystowsky J.B., Bordage G.: An outcomes research perspective on medical education: the predominance of trainee assessment and satisfaction. Med Educ 2001; 35: pp. 331-336.

  • 33. Shea J.A., McGaghie W.C., Pangaro L.: Instrumentation, data collection and quality control. Acad Med 2001; 76: pp. 931-933.

  • 34. Fraenkel J.R., Wallen N.E.: How to design and evaluate research in education. 8th ed. New York: McGraw-Hill; 2011.

  • 35. Code of Federal Regulation, Title 45, public welfare, part 46—protection of human subjects, US Department of Human Services. Available at: http://www.hhs.gov/ohrp/policy/ohrpregulations.pdf . Accessed January 15, 2012.

  • 36. US Department of Health and Human Services. Health Insurance Portability and Accountability Act. Available at: http://www.hhs.gov/ocr/privacy/hipaa/administrative/privacyrule/index.html . Accessed January 15, 2012.

  • 37. US Department of Health and Human Services. Institutional review board. Available at: http://www.hhs.gov/ohrp/assurances/irb/index.html . Accessed January 15, 2012.

  • 38. Bordage G.: Reasons reviewers reject and accept manuscripts: the strengths and weaknesses in medical education reports. Acad Med 2001; 76: pp. 889-896.

This post is licensed under CC BY 4.0 by the author.