Over the past 2 decades, there has been an exponential adoption of simulation in health care education throughout the continuum of medical training and practice. During this time, evidence demonstrating the effectiveness and impact of simulation-based medical education (SBME) has mounted: When simulation training is conducted under the right conditions and outcomes are measured by instruments with evidence for reliability and validity, research has shown not only that graduate trainees can obtain desired skills in a controlled, simulated environment1 but also that these skills can transfer to the clinical setting2 and, in some cases, lead to improved patient outcomes.3 Several systematic reviews and meta-analyses indicate that the magnitude of the effects of SBME is significant and consistent across medical disciplines, and these studies highlight the features of SBME that lead to more effective results.4–6

Faculty involved in postgraduate education are increasingly adopting simulation in response to several challenges: mounting curricular demands in a setting with fewer real patient encounters because of changes in patient care reimbursement and work hour restrictions, an emphasis on patient safety and quality care, and, more recently, more rigorous requirements to regularly teach and evaluate the Accreditation Council for Graduate Medical Education (ACGME) core competencies and developmental Milestones.7 These pressures have led to widespread change in postgraduate medical education that increasingly involves simulation technology and innovative ways to provide a standardized training and evaluation program. At the same time, academic medical centers expect their faculty to demonstrate scholarly activity in the form of peer-reviewed publications and presentations at scientific and education meetings. The Journal of Graduate Medical Education provides detailed guidelines for the preparation of manuscripts reporting various categories of scholarly activity, and authors should always follow these instructions before submitting their work for publication.8 The goal of this brief article is to provide 5 specific tips to authors for a more successful submission (especially in the “Original Research” category) on the use of simulation for health care education.

TABLE
Key Elements and Examples of a Successful Submission on Simulation-Based Medical Education

The purpose and scope of the project should aim beyond the simple description (case report) of a training intervention at a single institution and make an argument for its potential generalization beyond local use. Even brief reports should aim to surpass the idiographic description of an intervention, method, or approach and lead to generalizable knowledge. A new report can contribute optimally to the existing literature only if it is supported by a description of the theoretical or conceptual rationale for the intervention and a discussion hypothesizing which generic aspects of it could be used in other contexts or situations.9 This is very important if the report is to have an impact on readers who may wish to adopt a similar approach at their own institution. There needs to be a link that bridges the described work to readers' prior knowledge and that indicates the relevance to their current educational setting or practice. In postgraduate education, at a minimum, this usually means linking the project with ACGME core competencies and specific Milestones in the relevant specialties. This will immediately make the work germane to others who share the same challenges to document training and evaluation of these important outcomes.

The study should build on previous studies in some manner. The literature on the modern use of simulation for health care training has a 40-year history. Begin with previous studies in the same field, and then move to other disciplines and professions in health care. What is known from the previous literature? What are the gaps and limitations, and how will the current work address these gaps and add to our knowledge? Be sure to conduct a thorough literature search before concluding that the current question or issue has not been previously addressed and framing that gap as the rationale for your study or project. Truly novel uses of simulation are becoming increasingly rare.

The primary purpose of sharing your experience using simulation to train or evaluate residents and fellows is so that others may learn from your work and perhaps adopt similar methods at their institution. For this to occur, you should describe the simulation intervention, usually in the form of a scenario or skills station (for physical examination technique, other psychomotor or procedural skills, or communication abilities), in sufficient detail that it can be replicated elsewhere. Several templates and models for scenario development exist,10 and citing at least 1 of these (or providing these as an appendix to accommodate word limits) will give readers some basis for developing their own scenarios if your particular intervention does not apply exactly to their local situation or needs.

In the current literature on SBME, there is a conspicuous absence of details about the experience and expertise of faculty instructors using simulation as a methodology for training and assessment.11 If any description is provided, it is usually vague, such as “faculty with experience in the use of simulation.” What does “experience” mean in this context? Medical educators more easily grasp this notion in the clinical context: They understand that an “experienced” person has usually graduated from medical school, completed several years of residency (and fellowship) training, and then practiced in a professional setting that often requires board certification (and recertification). But what about simulation instruction experience? Did the faculty receive formal training and certification? If so, where and in what form? This factor is often overlooked, but it is very important for readers to understand that, in addition to simulation equipment, facilities, and other resources, investment in faculty development and support is critical to the success of SBME interventions and the achievement of effective outcomes.

Previous articles in the Journal have offered guidance on providing validity evidence for the use of assessment instruments.12 These guidelines should be followed when evaluating the effects of SBME, where we often attempt to measure constructs related to performance. Unlike objectively measurable physical properties, the constructs underlying most performance-based assessments are inherently subjective, so investigators must, almost without exception, provide evidence each time an assessment tool is used to support the argument that such use is valid. Statements such as “previous studies have demonstrated the validity of this tool” are insufficient. Because the conditions and people (study participants and raters) involved are likely to be different from those previously described, investigators must demonstrate that they have taken steps to reduce threats to validity in the current study by:

  1. Controlling and standardizing the conditions and setting for the simulation, including equipment choice and operation, instructor role, and so on.

  2. Analyzing the reliability of the data (usually in the form of interrater reliability calculations; a brief worked example follows this list). Rater training and calibration can improve the reliability of the data, and if carried out, they should be reported in the study methods.

  3. Describing the motivation of the participants to perform optimally. What was the participants' motivation to perform at their highest level during the simulation exercise? Was participation in the simulation exercise voluntary or mandatory? (Consent to use data is required in either case.) What were the consequences of performance outcomes? For example, were simulation training and achievement of a certain benchmark required before the trainees could apply these skills to real patients? These factors have tremendous influence on the motivation of participants, and providing readers with an understanding of this context will help them to make judgments about the validity of the results and conclusions presented.
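
As a purely illustrative aside on item 2 above, one commonly reported interrater reliability index is Cohen's kappa, which adjusts the observed agreement between 2 raters for the agreement expected by chance. The formula and the numbers that follow are hypothetical and are offered only as a minimal sketch of the kind of calculation authors might report:

\kappa = \frac{p_o - p_e}{1 - p_e}

For example, if 2 raters scoring the same 100 checklist items agree on 85 of them (p_o = 0.85) and the agreement expected by chance, derived from each rater's marginal rating frequencies, is p_e = 0.60, then \kappa = (0.85 - 0.60)/(1 - 0.60) = 0.63, a value conventionally described as substantial agreement. Whatever index is chosen, reporting both the statistic and how it was obtained allows readers to judge the dependability of the data.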

Reviewers and editors often identify failure to address these 5 areas when they reject submissions on the use of simulation for training and assessment. By contrast, manuscripts that follow most or all of these tips are more likely to be successful—the first reference cited here is exemplary in this regard (table).1 These suggestions are intended to provide practical guidance to authors when submitting future work to a journal. The goals are to raise the level of scholarship in the field of SBME, to increase the likelihood of having a journal submission accepted for publication, and to improve the education and proficiency of trainees and, ultimately, the care of their patients.

1. Barsuk JH, Cohen ER, Vozenilek JA, O'Connor LM, McGaghie WC, Wayne DB. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4(1):23–27.
2. Ahya SN, Barsuk JH, Cohen ER, Tuazon J, McGaghie WC, Wayne DB. Clinical performance and skill retention after simulation-based education for nephrology fellows. Semin Dial. 2012;25(4):470–473.
3. Barsuk JH, Cohen ER, Potts S, Demo H, Gupta S, Feinglass J, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014 Mar 14. doi: 10.1136/bmjqs-2013-002665. Epub ahead of print.
4. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48(4):375–385.
5. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86(6):706–711.
6. Cook DA, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang AT, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306(9):978–988.
7. Developmental Milestones for Internal Medicine Residency Training.
8. Journal of Graduate Medical Education. Instructions to Authors.
9. Schuwirth L, Colliver J, Gruppen L, Kreiter C, Mennin S, Onishi H, et al. Research in assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33(3):224–233.
10. AAMC MedEd Portal, Human Patient Simulation Template.
11. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44(1):50–63.
12. Sullivan GM. A primer on the validity of assessment instruments. J Grad Med Educ. 2011;3(2):119–120. Erratum in J Grad Med Educ. 2011;3(3):446.

Author notes

S. Barry Issenberg, MD, FACP, is Michael S. Gordon Professor of Medicine and Medical Education, Director, Michael S. Gordon Center for Research in Medical Education, and Associate Dean, Research in Medical Education, University of Miami Miller School of Medicine; and Ross J. Scalese, MD, FACP, is Associate Professor of Medicine and Director of Educational Technology, Michael S. Gordon Center for Research in Medical Education, University of Miami Miller School of Medicine.