Background

Few studies have examined residents' retained knowledge and confidence regarding essential evidence-based medicine (EBM) topics.

Objective

To compare postgraduate year-3 (PGY-3) residents' confidence with EBM topics taught during internship with that of PGY-1 residents before and after exposure to an EBM curriculum.

Methods

All residents participated in an EBM curriculum during their intern year. We surveyed residents in 2009. PGY-1 residents completed a Likert-type survey, which included questions from the validated Berlin questionnaire and additional questions developed with input from local EBM experts. We administered the Berlin questionnaire to a subset of PGY-3 residents.

Results

Forty-five of 51 PGY-3 residents (88%) and 42 of 46 PGY-1 residents (91%) completed the survey. Compared with PGY-1 residents pre-curriculum, PGY-3 residents were significantly more confident in their knowledge of pre- and posttest probability (mean difference, 1.14; P  =  .002), number needed to harm (mean difference, 1.09; P  =  .002), likelihood ratios (mean difference, 1.01; P  =  .003), formulation of a focused clinical question (mean difference, 0.98; P  =  .001), and critical appraisal of therapy articles (mean difference, 0.91; P  =  .002). Perceived confidence was significantly lower for PGY-3 residents than for post-curriculum PGY-1 residents on relative risk (mean difference, −0.86; P  =  .002), study design for prognosis questions (mean difference, −0.75; P  =  .004), number needed to harm (mean difference, −0.67; P  =  .01), critical appraisal of systematic reviews (mean difference, −0.65; P  =  .009), and retrieval of evidence (mean difference, −0.56; P  =  .008), among others. There was no association between residents' confidence in EBM topics and their actual knowledge of them.

Conclusions

Our findings demonstrate lower confidence among PGY-3 than among PGY-1 internal medicine residents for several EBM topics. PGY-3 residents demonstrated poor knowledge of several core topics taught during internship. Longitudinal EBM curricula throughout residency may help reinforce residents' EBM knowledge and their confidence.

Editor's note: The online version of this article contains the survey used in this study.

Evidence-based medicine (EBM) training is an integral component of the Accreditation Council for Graduate Medical Education (ACGME) core competencies of problem-based learning and improvement.1 Because graduating residents are expected to show competence in EBM, internal medicine residency programs have explored creative ways of teaching and assessing it.2 

Most EBM curricula are structured as brief educational experiences ranging from 6-hour EBM workshops3 to 7 weeks of weekly tutorials.4,5 Workshops have utilized didactic teaching, small group exercises using adult learning principles,4 dedicated elective rotations,6 and journal clubs to educate residents about basic EBM principles.7 Others have introduced an educational prescription to incorporate EBM into patient care.8 

EBM curricular assessment spans 3 main domains: knowledge, skills, and attitudes. Some educators have incorporated an EBM station into their standard Objective Structured Clinical Examination (OSCE) training for assessment of EBM skills.9 The Berlin questionnaire is 1 of only 2 validated measures used to assess skills and knowledge of EBM across the entire EBM cycle (“5 As”: assess, ask, acquire, appraise, apply).2,10–12 Several medical schools have used this tool to assess the effectiveness of their EBM curricula.13–15 Few have evaluated their curricula for EBM knowledge retention over time.5,13 

Despite the educational importance of EBM, few internal medicine residency programs have investigated the effectiveness of existing EBM curricula in ensuring knowledge retention and confidence among residents. Our study sought to (1) assess postgraduate year-3 (PGY-3) internal medicine residents' confidence in previously taught EBM knowledge; (2) compare PGY-3 residents' perceived knowledge with that of PGY-1 residents, pre- and post-EBM curriculum; and (3) examine correlations between perceived and actual (Berlin questionnaire) EBM knowledge among PGY-3 residents.

The internal medicine residency at Mayo Clinic Rochester has approximately 48 categorical residents in each of 3 sequential years of training. The EBM curriculum takes place during a 1-month rotation in the first year of residency. PGY-1 residents participate in 8 interactive 1-hour sessions that address the following topics: the EBM cycle; the patient, intervention, comparison, outcome, type of study design, type of question (PICOTT) format for structuring clinical questions; hierarchy of evidence; study design; and critical appraisal of therapy, diagnosis, prognosis, meta-analysis, and harm articles.16 Residents apply their knowledge by presenting their EBM analysis of a clinical case and pertinent literature to a group of residents and faculty members at a 1-hour weekly conference called Clinical Decision Making Journal Club (CDMJC). During their third year, the same residents present and also mentor first-year residents in their EBM case presentations during an outpatient rotation. We received an educational quality exemption for our study from the Mayo Foundation Institutional Review Board.

What was known

Evidence-based medicine (EBM) is an essential skill of physicians with benefits to patients. Graduates' competence to practice EBM has largely been unexamined.

What is new

Senior internal medicine residents showed poor knowledge of basic EBM topics taught during internship.

Limitations

Small sample; cross-sectional study; PGY-1/PGY-3 comparisons were based on self-reported confidence, not assessed knowledge.

Bottom line

Teaching EBM longitudinally during residency may enhance graduates' knowledge and confidence.

To assess PGY-3 residents' confidence in previously taught EBM topics, we designed a Likert-type survey (1  =  extremely unconfident, 3  =  neutral, 5  =  extremely confident) that included validated content from the Berlin questionnaire (15 items; see www.bmj.com for content) and questions developed with input from local EBM experts (11 items). The same survey was distributed electronically to all first-year (N  =  46) and third-year (N  =  51) residents in January 2010. After the electronic survey was administered, PGY-3 residents assigned to the outpatient rotation from November 2009 to April 2010 were invited to complete the Berlin questionnaire. Individual residents' responses to the survey and the Berlin questionnaire were paired for statistical analysis.

We collected baseline information for the 3 cohorts, PGY-1 pre-EBM course, PGY-1 post-EBM course, and PGY-3 residents to assess differences among the cohorts. Variables included age, sex, in-training examination (ITE) percentage correct and percentile, US Medical Licensing Examination (USMLE) 1 and 2 scores, Alpha Omega Alpha honor society (AOA) status, and international medical graduate (IMG) status.

Cochran-Mantel-Haenszel tests were used to compare confidence ratings for the 3 groups and assess associations between perceived and actual knowledge among PGY-3 residents. Statistical significance was set at a P value of .01 to account for multiple comparisons. The survey questions used in this study are provided as online supplemental content. All calculations were performed using SAS statistical software (version 9.1; SAS Institute Inc, Cary, NC).

Baseline Characteristics

Baseline characteristics of the 3 cohorts are shown in table 1. There were no statistically significant differences among the 3 cohorts on any of the variables shown. There was a trend toward higher ITE percentage correct in the pre-EBM and PGY-3 groups, but the difference was not statistically significant (P  =  .10). Moreover, PGY-3 residents who completed the Berlin questionnaire did not differ on any of these variables from those who did not.

TABLE 1

Baseline Characteristics of Residents in Pre-EBM, Post-EBM, and PGY-3 Groups


PGY-3 Resident Results

Forty-five of 51 PGY-3 residents (88%) and 42 of 46 first-year residents (91%) completed the survey. For Berlin questionnaire survey items, mean (±SD) confidence scores for PGY-3 residents ranged from 3.88 (±0.80) for “hierarchy of evidence” to 2.93 (±1.11) for “ability to calculate positive likelihood ratio.” For items provided by local EBM experts, mean confidence scores ranged from 4.27 (±0.87) for “ability to form a clinical question based on PICOTT format,” to 2.98 (±1.27) for the concepts related to meta-analysis (heterogeneity, forest plot, funnel plot, and publication bias). A summary of results is shown in tables 2 and 3.

TABLE 2

Participants' Mean Responses to Berlin Survey Questions

TABLE 3

Summary of Survey Results


PGY-3 Residents Compared with PGY-1 Residents Pre-EBM course

Compared with PGY-1 residents who had not taken the EBM course, PGY-3 residents had statistically significantly higher perceived confidence in their understanding of pre- and posttest probability (mean difference, 1.14; P  =  .002), ability to calculate a likelihood ratio (mean difference, 1.01; P  =  .003), ability to derive the number needed to harm (mean difference, 1.09; P  =  .002), and ability to identify the best study design to answer different types of clinical questions (mean difference, 0.69; P  =  .01). There were no statistically significant differences between PGY-1 residents' pre-EBM course results and those of PGY-3 residents for the remaining Berlin questionnaire survey items. For survey topics developed by local EBM experts, PGY-3 residents had significantly higher confidence than PGY-1 residents before their EBM course in their ability to formulate a PICOTT question (mean difference, 0.98; P < .001), understand important concepts related to critical appraisal of meta-analyses (mean difference, 0.98; P  =  .007), appraise therapy articles (mean difference, 0.91; P  =  .002), and mentor PGY-1 residents (mean difference, 1.52; P < .001). PGY-3 residents' confidence was not lower than pre-EBM course PGY-1 residents' confidence on any of the survey questions.

PGY-3 Residents Compared with PGY-1 Residents Post-EBM course

Compared with PGY-1 residents who completed the EBM course, PGY-3 residents had statistically significantly lower confidence in their abilities to identify the highest level of evidence (mean difference, −0.68; P  =  .001), compare relative risk ratios and number needed to treat between studies (mean difference, −0.86; P  =  .002), create a 2×2 table for diagnosis articles (mean difference, −0.77; P  =  .002), identify the best study design to address prognosis questions (mean difference, −0.75; P  =  .004), and understand the benefits of meta-analysis (mean difference, −0.65; P  =  .003). Similarly, for items developed according to EBM experts' recommendations, PGY-3 residents' confidence was lower than that of PGY-1 residents in their abilities to perform an efficient and effective literature search (mean difference, −0.56; P  =  .008) and to critically appraise systematic reviews (mean difference, −0.65; P  =  .009). PGY-3 residents' confidence did not exceed that of post-EBM course PGY-1 residents on any of the questions.

PGY-3 Residents' Perceived Versus Actual Knowledge Assessment

Twenty-two of the 51 PGY-3 residents (43%) assigned to the outpatient rotation from November 2009 through April 2010 completed both the Berlin questionnaire and the electronic survey. The mean proportion of correct answers was 71.2%, which approached the 79% mean score of experts with formal EBM training.12 Residents scored less than 50% on average for 2 questions: the ability to calculate positive and negative predictive values (36.4% correct) and the ability to calculate relative risk (31.8% correct). There was no statistically significant correlation between perceived knowledge, as reported on the electronic confidence survey, and actual knowledge, as measured by performance on the Berlin questionnaire.
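For context, the 2 calculations residents found most difficult are straightforward arithmetic on a 2×2 table. The sketch below uses hypothetical cell counts (not data from this study) to illustrate how positive and negative predictive values and relative risk are derived:

```python
# Illustrative 2x2 table with hypothetical counts (not study data).
# For predictive values, rows are test result; for relative risk,
# read the same rows as exposed vs unexposed.
#                     Outcome+   Outcome-
# Test+ / Exposed       a=90       b=30
# Test- / Unexposed     c=10       d=70
a, b, c, d = 90, 30, 10, 70

# Predictive values (diagnosis articles): condition on the test result.
ppv = a / (a + b)  # P(disease | positive test) = 90/120 = 0.75
npv = d / (c + d)  # P(no disease | negative test) = 70/80 = 0.875

# Relative risk (therapy/harm articles): condition on exposure.
risk_exposed = a / (a + b)                     # 90/120 = 0.75
risk_unexposed = c / (c + d)                   # 10/80  = 0.125
relative_risk = risk_exposed / risk_unexposed  # 0.75/0.125 = 6.0

print(ppv, npv, relative_risk)
```

The arithmetic itself is trivial; the error residents typically make is conditioning on the wrong margin of the table (rows vs columns), which the comments above make explicit.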

Our study shows that PGY-3 residents' confidence in EBM knowledge exceeds that of interns who have not had formal EBM training but is less than that of interns who have just completed training on key elements of the EBM curriculum. These results imply that PGY-3 residents' personal confidence in EBM knowledge and skill may wane over 3 years of training.

The findings of our study reflect the method and content of the EBM curriculum and the CDMJC at our residency program. PGY-3 residents appear to be more confident in EBM topics that are presented regularly at our weekly journal club, such as PICOTT question formulation and identification of question and study type. During the academic year 2009–2010, an average of 90 journal articles were presented at CDMJC, of which 71 (79%) were therapy articles, 8 (9%) were meta-analyses, 6 (7%) were diagnosis articles, 3 (3%) were prognosis articles, and 2 (2%) were harm studies. Therapy articles, therefore, were the articles most commonly critically appraised during our journal club conferences. This likely explains why our PGY-3 residents expressed higher confidence in understanding EBM concepts pertaining to therapy articles than to prognosis articles or systematic reviews. Similarly, PGY-3 residents' confidence was significantly lower in EBM topics that were taught over 1 month during internship but not commonly discussed during journal club. These findings suggest that the frequency with which EBM content is taught and presented could influence residents' confidence in EBM knowledge.

Previous Berlin questionnaire surveys of medical students and residents have also demonstrated little or no correlation between perceived and actual knowledge of EBM.13,17 A recent study did find significant correlation between a new validated pediatric EBM assessment tool and comfort level evaluation.18 However, that study did not use the Berlin questionnaire due to its limitations in reflecting only applied knowledge, and the study contained only 6 items (versus 26 in our study). Despite the lack of correlation in our study, the findings of the Berlin questionnaire helped identify important EBM knowledge gaps among our graduating PGY-3 residents. Moreover, the perceived knowledge survey identified areas of our EBM curriculum that are either not emphasized (eg, study design) or not taught (eg, Cochrane reviews and American College of Physicians journal club as sources of evidence, hypothesis testing, and types of error, and critical appraisal of guidelines and systematic reviews). These findings helped identify important EBM topics to incorporate in our curriculum.

One possible explanation for residents' decreasing confidence in EBM knowledge over time is the absence of a practice-based, longitudinal curriculum. Previous studies have demonstrated post-EBM curriculum knowledge retention over 6-month to 9-month intervals.5,14 We are unaware of studies demonstrating knowledge retention over 3 years. Another possible explanation is that some EBM topics demand mathematical skills that require regular practice to maintain. Examples of such skills include constructing 2×2 tables, calculating predictive values, and determining outcomes for articles dealing with therapy and harm. Finally, PGY-3 residents may demonstrate decreased confidence in EBM as they have more experience and knowledge of subtleties of EBM concepts than PGY-1 residents, who may overestimate their understanding of EBM due to less clinical experience.
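As an illustration of the kind of calculation that decays without regular practice, number needed to treat and number needed to harm are simply reciprocals of absolute risk differences. The event rates below are hypothetical, not drawn from this study:

```python
# Hypothetical event rates (not from this study).
control_event_rate = 0.20    # 20% of control patients have the outcome
treatment_event_rate = 0.15  # 15% of treated patients have the outcome
adverse_rate_treated = 0.08  # 8% of treated patients have a side effect
adverse_rate_control = 0.03  # 3% of controls have the same side effect

# Number needed to treat: reciprocal of the absolute risk reduction.
arr = control_event_rate - treatment_event_rate  # 0.05
nnt = 1 / arr                                    # ~20 patients treated per outcome prevented

# Number needed to harm: reciprocal of the absolute risk increase.
ari = adverse_rate_treated - adverse_rate_control  # 0.05
nnh = 1 / ari                                      # ~20 patients treated per harm caused

print(round(nnt), round(nnh))
```

The formulas are one-liners, but retrieving them reliably at the bedside is exactly the skill that a 1-month internship course, without longitudinal reinforcement, may not sustain.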

Our study has several limitations. First, this was a cross-sectional study: we compared PGY-1 and PGY-3 residents as similar cohorts that differed chiefly in exposure to the residency EBM course, rather than following a single cohort of residents over time. Moreover, we did not account for baseline EBM exposure from medical school, and because our electronic survey was sent to PGY-1 residents 6 months into the academic year, any confidence in EBM knowledge gained during medical school may already have faded. Second, comparisons between PGY-1 and PGY-3 residents were based on self-reported confidence levels; administering the Berlin questionnaire across all 3 years of residency would have provided a more objective assessment of EBM knowledge and skill. Third, although the EBM curriculum has remained consistent over the years, it is taught by different chief medical residents each year. This may introduce a “teacher effect,” or bias due to variation in educational effectiveness from one teacher to the next, despite the uniform preparatory training that all Mayo chief medical residents receive. Last, this study involved a single internal medicine residency program, and our sample size may not have been large enough to detect small but possibly meaningful correlations between confidence in EBM topics and objective EBM knowledge. Multi-institutional studies are needed to establish the generalizability of these results. Nevertheless, EBM remains an important ACGME core competency.

To our knowledge, this is the first study to demonstrate diminished confidence in previously taught EBM knowledge among PGY-3 residents. Our findings support the need for longitudinal curricula for teaching EBM. Based on our findings, we are developing additional EBM curricula for PGY-2 and PGY-3 residents. Future studies will be needed to determine whether EBM confidence and knowledge remain stable or improve after the implementation of a longitudinal curriculum.

1. Accreditation Council for Graduate Medical Education. Common program requirements.
2. Shaneyfelt T, Baum KD, Bell D, et al. Instruments for evaluating education in evidence-based practice: a systematic review. JAMA. 2006;296(9):1116–1127.
3. Harewood GC, Hendrick LM. Prospective, controlled assessment of the impact of formal evidence-based medicine teaching workshop on ability to appraise the medical literature. Ir J Med Sci. 2010;179(1):91–94.
4. Green ML, Ellis PJ. Impact of an evidence-based medicine curriculum based on adult learning theory. J Gen Intern Med. 1997;12(12):742–750.
5. Smith CA, Ganschow PS, Reilly BM, et al. Teaching residents evidence-based medicine skills: a controlled trial of effectiveness and assessment of durability. J Gen Intern Med. 2000;15(10):710–715.
6. Akl EA, Izuchukwu IS, El-Dika S, et al. Integrating an evidence-based medicine rotation into an internal medicine residency program. Acad Med. 2004;79(9):897–904.
7. Green ML. Graduate medical education training in clinical epidemiology, critical appraisal, and evidence-based medicine: a critical review of curricula. Acad Med. 1999;74(6):686–694.
8. Feldstein DA, Mead S, Manwell LB. Feasibility of an evidence-based medicine educational prescription. Med Educ. 2009;43(11):1105–1106.
9. Tudiver F, Rose D, Banks B, et al. Reliability and validity testing of an evidence-based medicine OSCE station. Fam Med. 2009;41(2):89–91.
10. Ilic D. Assessing competency in evidence based practice: strengths and limitations of current tools in practice. BMC Med Educ. 2009;9:53.
11. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003;326(7384):319–321.
12. Fritsche L, Greenhalgh T, Falck-Ytter Y, et al. Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ. 2002;325(7376):1338–1341.
13. West CP, McDonald FS. Evaluation of a longitudinal medical school evidence-based medicine curriculum: a pilot study. J Gen Intern Med. 2008;23(7):1057–1059.
14. Lai NM, Teng CL. Competence in evidence-based medicine of senior medical students following a clinically integrated training programme. Hong Kong Med J. 2009;15(5):332–338.
15. Zaidi Z, Iqbal M, Hashim J, et al. Making evidence-based medicine (EBM) doable in developing countries: a locally-tailored workshop for EBM in a Pakistani institution. Educ Health (Abingdon). 2009;22(1):176.
16. Guyatt G, Rennie D, Meade OM, et al. Users' Guides to the Medical Literature: Essentials of Evidence-Based Clinical Practice. 2nd ed. Chicago, IL: American Medical Association; 2002:836.
17. Khan KS, Awonuga AO, Dwarakanath LS, et al. Assessments in evidence-based medicine workshops: loose connection between perception of knowledge and its objective assessment. Med Teach. 2001;23(1):92–94.
18. Chernick L, Pusic M, Liu H, et al. A pediatrics-based instrument for assessing resident education in evidence-based practice. Acad Pediatr. 2010;10(4):260–265.

Author notes

All authors are from the Department of Internal Medicine, Mayo Clinic. Mira T. Keddis, MD, is Instructor of Medicine; Thomas J. Beckman, MD, is Associate Professor of Medicine; Michael W. Cullen, MD, is Assistant Professor of Medicine; Darcy A. Reed, MD, MPH, is Assistant Professor of Medicine; Andrew J. Halvorsen, MS, is Biostatistician at the Department of Internal Medicine Residency Office of Educational Innovations; Christopher M. Wittich, MD, is Assistant Professor of Medicine; Colin P. West, MD, PhD, is Associate Professor of Medicine and Biostatistics; and Furman S. McDonald, MD, MPH, is Associate Professor of Medicine and Medical Education.

Funding: This study was supported in part by the Mayo Clinic Internal Medicine Residency Office of Educational Innovations as part of the ACGME Educational Innovations Project.
