Over the past 2 decades, simulation has been adopted at a rapidly accelerating pace in health care education throughout the continuum of medical training and practice. During this time, evidence demonstrating the effectiveness and impact of simulation-based medical education (SBME) has mounted: When simulation training is conducted under the right conditions and outcomes are measured by instruments with evidence for reliability and validity, research has shown not only that graduate trainees can obtain desired skills in a controlled, simulated environment1 but also that these skills can transfer to the clinical setting2 and, in some cases, lead to improved patient outcomes.3 Several systematic reviews and meta-analyses indicate that the effects of SBME are substantial and consistent across medical disciplines, and these studies highlight the features of SBME that lead to more effective results.4–6
Faculty involved in postgraduate education are increasingly adopting simulation in response to several challenges: mounting curricular demands in a setting with fewer real patient encounters because of changes in patient care reimbursement and work hour restrictions; an emphasis on patient safety and quality care; and, more recently, rigorous requirements to teach and regularly evaluate the Accreditation Council for Graduate Medical Education (ACGME) core competencies and developmental Milestones.7 These pressures have led to widespread change in postgraduate medical education, which increasingly involves simulation technology and innovative ways to provide standardized training and evaluation. At the same time, academic medical centers expect their faculty to demonstrate scholarly activity in the form of peer-reviewed publications and presentations at scientific and education meetings. The Journal of Graduate Medical Education provides detailed guidelines for the preparation of manuscripts reporting various categories of scholarly activity, and authors should follow these instructions before submitting their work for publication.8 The goal of this brief article is to offer authors 5 specific tips for a more successful submission (especially in the "Original Research" category) on the use of simulation for health care education.
Tip 1: Define the Purpose of the Study
The purpose and scope of the project should aim beyond a simple description (case report) of a training intervention at a single institution and should make an argument for its potential generalization beyond local use. Even brief reports should surpass the idiographic description of an intervention, method, or approach and lead to generalizable knowledge. A new report can contribute optimally to the existing literature only if it is supported by a description of the theoretical or conceptual rationale for the intervention and a discussion hypothesizing which generic aspects of it could be used in other contexts or situations.9 This is especially important if the report is to have an impact on readers who may wish to adopt a similar approach at their own institutions. The work must be linked to readers' prior knowledge in a way that indicates its relevance to their current educational setting or practice. In postgraduate education, at a minimum, this usually means linking the project with ACGME core competencies and specific Milestones in the relevant specialties, which immediately makes the work germane to others who share the same challenges in documenting training and evaluation of these important outcomes.
Tip 2: Conduct a Literature Review
The study should build on previous work in some manner. The literature on the modern use of simulation for health care training spans 40 years. Begin with previous studies in the same field, and then move to other disciplines and professions in health care. What is known from the previous literature? What are the gaps and limitations, and how will the current work address these gaps and add to our knowledge? Conduct a thorough literature search before concluding that the current question or issue has not been previously addressed and citing that gap as the rationale for the study or project. Truly novel uses of simulation are becoming increasingly rare.
Tip 3: Describe the Simulation Intervention
The primary purpose of sharing your experience using simulation to train or evaluate residents and fellows is so that others may learn from your work and perhaps adopt similar methods at their own institutions. For this to occur, you should describe the simulation intervention, usually a scenario or skills station (for physical examination technique, other psychomotor or procedural skills, or communication abilities), in sufficient detail that it can be replicated elsewhere. Several templates and models for scenario development exist,10 and citing at least 1 of these (or providing them as an appendix to accommodate word limits) will give readers a basis for developing their own scenarios if your particular intervention does not apply exactly to their local situation or needs.
Tip 4: Provide Information on the Role of the Instructor
In the current literature on SBME, there is a conspicuous absence of detail about the experience and expertise of the faculty instructors who use simulation as a methodology for training and assessment.11 If any description is provided, it is usually vague, such as "faculty with experience in the use of simulation." What does "experience" mean in this context? Medical educators grasp this notion more easily in the clinical context: They understand that an "experienced" clinician has usually graduated from medical school, completed several years of residency (and fellowship) training, and then practiced in a professional setting that often requires board certification (and recertification). But what constitutes experience in simulation instruction? Did the faculty receive formal training and certification? If so, where and in what form? This factor is often overlooked, yet it is important for readers to understand that, in addition to simulation equipment, facilities, and other resources, investment in faculty development and support is critical to the success of SBME interventions and the achievement of effective outcomes.
Tip 5: Select Outcome Measures
Previous articles in the Journal have offered guidance on providing validity evidence for the use of assessment instruments.12 These guidelines should be followed when evaluating the effects of SBME, where we often attempt to measure constructs related to performance. Unlike objectively measurable physical properties, the constructs underlying most performance-based assessments are inherently subjective; as a result, almost without exception, we must provide evidence each time we use an assessment tool to support the argument that such use is valid. Statements such as "previous studies have demonstrated the validity of this tool" are insufficient. Because the conditions and people (study participants and raters) involved are likely to differ from those previously described, investigators must demonstrate that they have taken steps to reduce threats to validity in the current study by:
Controlling and standardizing the conditions and setting for the simulation, including equipment choice and operation, instructor role, and so on.
Analyzing the reliability of the data (usually in the form of interrater reliability calculations; a minimal computational sketch follows this list). Rater training and calibration can improve the reliability of the data and, if carried out, should be reported in the study methods.
Describing participants' motivation to perform optimally. Was participation in the simulation exercise voluntary or mandatory? (Consent to use the data is required in either case.) What were the consequences of performance outcomes? For example, were simulation training and achievement of a certain benchmark required before trainees could apply these skills to real patients? These factors have tremendous influence on participants' motivation to perform at their highest level, and providing readers with an understanding of this context will help them judge the validity of the results and conclusions presented.
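To make the second point concrete: for 2 raters scoring the same set of performances on a nominal scale (eg, a done/not done checklist item), interrater reliability is commonly summarized with Cohen's kappa, which corrects observed agreement for the agreement expected by chance. The sketch below is a minimal illustration in Python; the rater scores shown are hypothetical and serve only to demonstrate the calculation, not to represent any particular instrument.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between 2 raters on nominal scores."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must score the same items in the same order.")
    n = len(rater_a)
    # Observed agreement: proportion of items given the same score by both raters.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: computed from each rater's marginal score frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_chance = sum(freq_a[k] * freq_b[k] for k in freq_a.keys() & freq_b.keys()) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical data: 2 faculty raters scoring 10 simulated encounters on a
# dichotomous checklist item (1 = performed correctly, 0 = not performed).
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]
print(f"Cohen's kappa: {cohen_kappa(rater_a, rater_b):.2f}")  # prints 0.52
```

Whatever coefficient is chosen (Cohen's kappa, an intraclass correlation, or another index appropriate to the rating design), the methods section should name it and report the value obtained, rather than simply asserting that reliability was "acceptable."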
Reviewers and editors often cite failure to address these 5 areas when they reject submissions on the use of simulation for training and assessment. By contrast, manuscripts that follow most or all of these tips are more likely to succeed; the first reference cited here is exemplary in this regard (table).1 These suggestions are intended to provide practical guidance to authors submitting future work to a journal. The goals are to raise the level of scholarship in the field of SBME, to increase the likelihood of having a journal submission accepted for publication, and to improve the education and proficiency of trainees and, ultimately, the care of their patients.
Author notes
S. Barry Issenberg, MD, FACP, is Michael S. Gordon Professor of Medicine and Medical Education, Director, Michael S. Gordon Center for Research in Medical Education, and Associate Dean, Research in Medical Education, University of Miami Miller School of Medicine; and Ross J. Scalese, MD, FACP, is Associate Professor of Medicine and Director of Educational Technology, Michael S. Gordon Center for Research in Medical Education, University of Miami Miller School of Medicine.