Background

We developed a practice-based learning and improvement (PBLI) curriculum to address important gaps in curricular content and experiential learning. The curriculum combines didactics with participation in systems-level quality improvement projects that focus on making changes in health care processes.

Methods

We evaluated the impact of our curriculum on resident PBLI knowledge, self-efficacy, and application skills. A quasi-experimental design assessed the curriculum's impact (PBLI quality improvement systems versus non-PBLI) on internal medicine residents' learning during a 4-week ambulatory block. We measured application skills, self-efficacy, and knowledge using the Systems Quality Improvement Training and Assessment Tool. Exit evaluations assessed time invested in and experiences with the team projects, as well as suggestions for improving the curriculum.

Results

The 2 groups differed in change scores. Relative to the comparison group, residents in the PBLI curriculum demonstrated a significantly greater increase in belief in their ability to implement a continuous quality improvement project (P  =  .020), comfort level in developing data collection plans (P  =  .010), and total knowledge scores (P < .001), after adjusting for prior PBLI experience. Participants in the PBLI curriculum also demonstrated significant improvement in providing a more complete aim statement for a proposed project after adjusting for prior PBLI experience (P  =  .001). Exit evaluations were completed by 96% of PBLI curriculum participants, who reported high satisfaction with team performance.

Conclusion

Residents in our curriculum showed gains in areas fundamental for PBLI competency. The observed improvements were related to fundamental quality improvement knowledge, with limited gain in application skills. This suggests that, while we are heading in the right direction, PBLI training needs to be conceptualized and structured so that it is integrated throughout the residency program and fosters the application of this knowledge and these skills.

Medical practice includes the clinical management of an individual patient's health conditions in a health care system where the number and types of treatment and care options are continually changing. As a result, residents must develop continuous quality improvement (CQI) knowledge and systems thinking to succeed as practicing physicians. If a gap remains in physicians' knowledge of these essential domains, the impact of system interactions on patients and health care practices will go unrecognized, potentially resulting in substandard and inefficient health care delivery. Accrediting organizations have recognized the need for these CQI skills by establishing competencies, such as practice-based learning and improvement (PBLI), and have incorporated them into physicians' training and practice.1,2 

In response to the Accreditation Council for Graduate Medical Education's (ACGME's) PBLI competency requirement, several approaches to training PBLI have emerged, but the descriptions provided are not sufficiently detailed to permit replication.3 In the literature, descriptions of the frequency and duration of educational sessions' content and objectives are variable and occasionally difficult to ascertain.3 Most sessions occurred during an elective4,5 or ambulatory rotation/setting.6–17 Most PBLI curricula include didactic instruction (small groups/lectures,4–16,18,19 workshops,17 and web-based instruction20) and/or experiential learning, with chart audit,6,10,21 systematic analysis of morbidity and mortality conferences,22 root cause analysis,5,20 and proposals/projects,4,5,7–20 with many of the proposals/projects being team based.7,8,10,12–16,19 

There are important gaps in PBLI curricula content.3,23–25 For example, Windish et al3 highlight that only 5 of 13 resident-based quality improvement (QI) curricula reviewed had addressed the Institute for Healthcare Improvement's knowledge domains of (1) developing new, locally useful knowledge and (2) understanding health care as a process/system. Moreover, we found only 1 study describing the presentation of project proposals to institutional leadership and staff who were not involved in mentoring/facilitating or teaching the educational sessions.11 Another gap relates to limited integration of experiential learning activities within the organization.3,24 

To address these gaps, we developed a comprehensive PBLI QI systems curriculum that integrates experiential learning activities and highlights health care system interactions via participation in system-level QI projects. The curriculum emphasized making changes in health care processes, understanding core improvement knowledge and skills, and incorporating opportunities to share QI skills through project presentations.24 

We evaluated the impact of the PBLI QI systems curriculum on PBLI application skills, self-efficacy, and knowledge using the Systems Quality Improvement Training and Assessment Tool (SQI TAT), which consists of a questionnaire and a coding system for scoring open-ended responses.26 The study was approved by the Louis Stokes Cleveland Department of Veterans Affairs Medical Center's (LSCDVAMC) Institutional Review Board.

Participants were internal medicine (IM) residents, postgraduate year 1 through 3, training at University Hospitals of Cleveland and the LSCDVAMC. Each year, IM residents are assigned to complete a 4-week ambulatory block. During academic year 2005, a PBLI QI systems and a non-PBLI (systems-based practice and microteaching) curriculum were offered on alternating blocks.

In a quasi-experimental design, ambulatory block data were collected from residents during 6 intervention (PBLI QI systems curriculum) blocks (n  =  46) and 5 comparison blocks (n  =  40). Five additional residents participated in some teaching sessions but were not available to complete the pre- or post-test questionnaires, and 2 residents had missing information on the post-test. Questionnaires were completed by residents at the outset of the first session (pre-test), prior to any discussion, and at the conclusion of the last session of the block (post-test). When the questionnaires were administered, part A and part B were completed sequentially. An anonymous exit evaluation was also completed at the end of the last session of the block.

The systems-based PBLI curriculum occurred during a 4-week period on alternate resident ambulatory blocks (table 1).24 Residents attended one-half-day-a-week sessions involving didactics and small-group application exercises linked to the development of system-level QI projects aligned with organizational needs. Opportunities for these projects were identified by the clinical managers in the area of practice of the residents, for example, outpatient clinic and emergency department. Inclusion criteria for project themes were (1) alignment with the institution's QI goals, (2) relation to the residents' practice in these work areas, and (3) support by a clinician champion (ie, physician[s] interested in improving the problem identified and working with residents and project stakeholders). Further development of the project by the residents occurred outside the didactic sessions and included activities such as discussion with stakeholders and data collection and analysis (table 1). The clinical champion provided an interface between trainees and project stakeholders. Physicians in this role also facilitated the implementation of projects.

table 1 

Overview of the Curriculum Experiences for Intervention and Comparison Groups


Consistent with ACGME goals, the curriculum was structured to teach the core learning principles of PBLI and to challenge residents to share this knowledge with other trainees, clinical faculty, and clinical leadership through their project presentations at Morbidity and Mortality Conference (MMC). This high-visibility forum was sought so residents could present their newly acquired skills and respond to questions from professional colleagues (from medical students to institutional leadership).

The curriculum was delivered by 2 second-year fellows in the Veterans Affairs Quality Scholars Fellowship Program who had expertise in QI methods.27 They served as content experts and facilitated application of concepts through direct feedback during project development.

The comparison group consisted of residents who participated in a different 4-week curriculum (systems-based practice and microteaching) occurring on alternate ambulatory blocks to the PBLI QI systems curriculum (table 1). Experiential learning activities for this group included small-group exercises, for example, patient safety analysis, and individual activities emphasizing microteaching skills. These trainees also had tasks to complete outside of the formal learning sessions.

Measurement and Analysis

We evaluated the intervention by assessing application skills, self-efficacy, and knowledge. In addition, we collected input regarding time invested in and experiences with the team projects and suggestions for improving the curriculum. The first 3 outcomes were measured with the SQI TAT, which consists of closed- and open-ended questions and a coding system.26 This tool scores key variables for ideal responses to evaluate project proposals (14 variables identified) and short-answer definitions of core knowledge concepts (30 variables). In preliminary analyses of coder training and reliability for this sample, the SQI TAT and its coding system showed good intercoder reliability (percent agreement greater than 85%; the Lin concordance coefficient was 0.98 for total knowledge scores and 0.88 for total application scores). The tool also showed face and content validity (established via review by experts in the field) and discriminative validity. Specifically, at baseline, residents who had experienced a prior PBLI curriculum had higher mean scores than those who had not for total knowledge and for 3 of 4 self-efficacy items: project development, development of a data collection plan, and project implementation. Although mean scores for total application and for self-efficacy about teaching PBLI were higher in the group with prior experience, the differences were not statistically significant.26 The psychometric properties of this tool for this cohort are described in detail elsewhere.26 
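The intercoder agreement statistic cited above follows a standard formula. As a minimal sketch (the function name and the coder scores below are ours, purely illustrative, not the study's data), Lin's concordance correlation coefficient between two coders' total scores can be computed as:

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient between two raters.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2),
    using population (n-denominator) moments, as in Lin (1989).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()           # population variances
    cov = ((x - mx) * (y - my)).mean()  # population covariance
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical total scores from 2 coders for 5 residents:
coder_a = [10, 12, 9, 14, 11]
coder_b = [10, 13, 9, 14, 10]
print(round(lin_ccc(coder_a, coder_b), 3))
```

Unlike the Pearson correlation, the denominator penalizes both scale and location differences between the coders, so perfect agreement (identical scores) is required for a coefficient of 1.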

table 2 provides an outline of the SQI TAT questions. Part A was an open-ended format to assess application skills through a project proposal and was based on recall and understanding rather than recognition via prompting. Instructions for part A were as follows:

Based on your clinical experiences, develop a project that would help to improve any aspects of patient care. Please provide enough information so that someone unfamiliar with the context would know what to do, how to do it, and why.

table 2 

Baseline Information for Intervention (n  =  46) and Comparison (n  =  40) Groupsa


Coders evaluated responses by using a coding system in which each relevant unit of information (variable) was scored as present or absent (dichotomous), with some units weighted more according to their importance or complexity. Specifically, responses were coded for the presence or absence of variables in the following sections relevant to application skills: background, aim, intervention, measurement, impact, and next steps (table 3).26 

table 3 

Description of Scoring for the Application Sections and Knowledge Items


Part B is composed of 4 closed-ended self-efficacy (belief that one is capable) and comfort-level items, each rated on a 5-point Likert scale, and 6 open-ended knowledge items (table 2). The self-efficacy items correspond to core considerations that form a developmental trajectory in PBLI capabilities and build on previous instruments.4,9,18 The 4 items addressed (1) developing a plan, (2) developing a plan that takes contextual considerations and constraints into account, (3) implementing a plan, and (4) teaching.

Basic knowledge was assessed by 6 open-ended items that focus on core concepts that need to be understood to adequately develop a project, though the actual terms or concepts may not be defined in a project proposal (eg, defining common cause variation). Respondents were asked to define the following 6 knowledge items: change concept, cause-effect diagram, elements of the improvement model, common cause variation, special cause variation, and why the distinction is important. Responses to each of the knowledge items were coded for the presence or absence of multiple relevant units of information (variables) required to provide a comprehensive description of the concept (eg, to adequately define the concept of a cause-effect diagram, 9 units of information/variables were coded). Some of the individual variables are weighted more given their complexity and importance in demonstrating an understanding of the concept (table 3).

The pretest also assesses perceived knowledge and prior experience (table 2) by using items from Djuricich's CQI Curriculum for Residents.9,18,28 

Two coders (A.M.T. and R.H.L.) scored the open-ended application and knowledge components. Consensus was obtained on all discrepancies before summing to obtain scores for analysis. Baseline scores were compared for the 2 groups, and difference scores (post minus pre) were analyzed. Because the distributions of the difference scores were nonnormal, we transformed the data (added a constant of 10 points to eliminate negative and zero scores and then took the square root) to achieve approximately normal distributions.
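The normalizing transformation described above can be sketched in a few lines; the difference scores here are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical post-minus-pre difference scores (some negative, one zero):
diff = np.array([-3, 0, 2, 5, -1, 8])

# Shift by a constant of 10 so every value is strictly positive,
# then take the square root to compress the right tail toward normality.
transformed = np.sqrt(diff + 10)

print(transformed.round(3))
```

The shift is needed because the square root is undefined for negative values and would map zero differences to zero, distorting the low end of the scale.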

We evaluated the curriculum's impact with t tests, followed by multiple regression analysis to evaluate the intervention's impact after adjusting for prior experience. Analyses were performed on total scores and their subsections. Significance was determined by P < .05. For ease of presentation and discussion, we report raw, not transformed, scores in the tables.
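The analytic sequence above (unadjusted group comparison, then regression adjusting for prior PBLI experience) can be sketched with ordinary least squares on the transformed difference scores. All data below are simulated for illustration; the group means, spread, and prior-experience indicator are assumptions, not the study's values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated transformed difference scores for the two groups
intervention = rng.normal(3.5, 0.4, 46)
comparison = rng.normal(3.1, 0.4, 40)

# Step 1: unadjusted two-sample t test
t, p = stats.ttest_ind(intervention, comparison)

# Step 2: adjusted comparison, regressing score on a group indicator
# plus a prior-PBLI-experience indicator (0/1, invented covariate)
y = np.concatenate([intervention, comparison])
group = np.r_[np.ones(46), np.zeros(40)]
prior = rng.integers(0, 2, 86).astype(float)
X = np.column_stack([np.ones(86), group, prior])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"t = {t:.2f}, p = {p:.4f}, adjusted group effect = {beta[1]:.2f}")
```

The coefficient on the group indicator (`beta[1]`) is the intervention effect after adjustment, mirroring the article's regression step; in practice a statistics package would also report its standard error and P value.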

Team projects were evaluated via an anonymous exit evaluation completed by trainees at the end of the last PBLI session, which also collected data on resident time spent on team projects (individually and as a team) outside of the weekly structured didactic sessions. Responses were used to generate descriptive information. Team project experience was assessed by focusing on residents' satisfaction with team performance and sense of shared responsibility for the project. Both items were measured on a 6-point Likert scale. Responses were summarized for descriptive purposes. Suggestions for improving the curriculum and experiences related to PBLI were sought via an open-ended item. Responses were used to generate themes for guiding integration of feedback.

Pre-post data were available for 46 residents in the intervention group and 40 residents in the comparison group. Exit evaluations for the PBLI curriculum were completed by 96% of the 49 attendees. Baseline data are summarized in table 2. The groups did not differ significantly on any pretest items, and overall levels of PBLI knowledge and application skills were low. The groups also did not differ in the distribution of postgraduate year (P  =  .616) or previous exposure to a PBLI curriculum (P  =  .227) (χ2 analysis).

Core Domains of PBLI Curriculum

Difference scores for the intervention and comparison groups are shown in table 4.

table 4 

Difference Scoresa (Postscore Minus Prescore) for Intervention (n  =  46) and Comparison Groups (n  =  40)


Application (Proposed Projects)

Residents in the PBLI curriculum did not demonstrate significant improvement in total application scores relative to the comparison group (P  =  .06). When analyzing the individual sections, the groups differed only on the aim statement (P < .001), which remained significant after adjusting for prior experience (P  =  .001).

Self-efficacy and Comfort Level

Residents participating in the PBLI curriculum had a greater increase in belief in their ability to implement a CQI project and in comfort level in developing data collection plans. The groups did not differ on the other 2 items. These findings remained after adjusting for prior experience.

Knowledge

Residents in the PBLI curriculum demonstrated a greater increase in total knowledge and its component parts, except when defining common cause variation and the distinction between common and special cause variation. These findings remained after adjusting for prior experience.

Team Projects

There were 7 clinical projects: (1) medication refill; (2) hand-offs; (3) uncertain disposition; (4) code cart; (5) documentation for blood pressure; (6) community-acquired pneumonia; and (7) Papanicolaou smear.

Resident Time Invested

The median time spent working individually outside of didactic sessions was 2.25 hours (interquartile range [IQR], almost 2.75 hours; range, 0–28 hours), and the median time working with the team was 2 hours (IQR, 2.5 hours; range, 20 minutes to 23 hours). On average, more individual time was spent during week 2 and more team time during week 3.

Team Project Experience

Overall, residents were satisfied with team performance (93.9% were either “very satisfied [category 1]” or “mostly satisfied [category 2]” and the median response category was 2). They mostly agreed that everyone shared responsibility (82.1% reported either “strongly agreeing [category 1]” or “mostly agreeing [category 2],” with a median response category of “mostly agree”).

Suggestions for Improving the Curriculum

Twenty-four of 49 intervention residents left this item blank or drew a line, and 4 wrote "none." Three themes emerged from the responses of the remaining 21 residents: (1) more time and/or more support to reduce time; (2) a better road map (suggestions to make tasks and processes clearer, more consistent, and/or more structured); and (3) suggestions for defining projects more clearly and making them more achievable. Aside from improvement feedback, 8 residents provided unsolicited positive feedback in response to this question. These responses acknowledged the value and importance of the curriculum, including the contributions of the faculty. The data are summarized in table 5.

table 5 

Themes From Exit Evaluations: Suggestions for Improving the PBLI Curriculum Experience (n  =  21)a


The call for PBLI competency among graduates of medical education has led to a variety of approaches to establishing curricula. As we move toward a more standard approach for PBLI training that can be implemented across institutions, there is a need to identify certain curricular elements that should be an accepted part of the PBLI teaching methodology. Our curriculum informs these next steps by uniquely integrating (1) core knowledge concepts (improvement knowledge, performance measurement, and systems thinking) to ensure a solid foundation for QI implementation; (2) experiential learning into organizational improvement efforts; (3) team-based project development and implementation modeling QI learning that occurs in clinical practice; and (4) mandatory presentation of QI projects to a broad audience, including key stakeholders and institutional leadership. Several of these attributes have been acknowledged by leaders in health care improvement as being important knowledge systems, tools, and methods that are needed to transform the health care system.29 

Compared with the control group, residents participating in our PBLI curriculum demonstrated an increased understanding of key improvement concepts and tools, such as "change concept," "improvement model," "cause-effect diagram," and "special cause variation," and an increased ability to develop a complete aim statement. The findings for the knowledge component are consistent with the expected progression in learners' cognitive development from basic terminology (ie, describing a change concept) to more complex concepts, such as analyzing data (ie, special cause variation).26 The improvement in defining the more complex items, common cause variation and the distinction between common and special cause variation, was not statistically significant. This suggests that the curriculum needs to identify opportunities to reinforce these more challenging concepts, consistent with suggestions made by the residents (table 5).

Perhaps not surprisingly, the curriculum had a larger impact on learner cognitive development regarding basic knowledge than application skills. Among the application skills evaluated by the SQI TAT, significant improvement was found in developing an aim statement, a fundamental step in applying the Model for Improvement before successful project implementation.30 While it is surprising that there was no significant improvement in the background section, a more basic application skill, this skill is not unique to QI work and may not have been seen as a priority for faculty or residents. These findings may reflect the need, within this time-constrained curriculum, to tailor actual project advancement to meet the residents' evolving QI learning needs, consistent with appreciating a progression in the cognitive development that is needed to advance learners' skill acquisition.25,31 Given the limited exposure to PBLI in this curriculum and other aspects of the training program, it may have been difficult for residents to transfer application skills from the actual project to the application component of the SQI TAT.32 

Participants in the PBLI curriculum had greater belief in their ability to implement a CQI project and a greater comfort level in developing data collection plans, reinforcing the value of including projects. The high satisfaction reported for team performance further underscores the value of experiential learning grounded in team-based clinical projects. At the same time, participants did not demonstrate increased ability to develop a project or teach CQI principles. One explanation may be that, while the curriculum was limited in its exposure to PBLI (ie, 4 teaching sessions), it provided residents with a sense of the tremendous scope of CQI. Time limitations may have also hindered the progression of skill development, from implementing CQI to developing and teaching it.

While the findings from this preliminary study are promising, there are several limitations. Although the study used a comparison group, it involved only 1 institution and 1 residency program. However, the comparison group provided a stringent test, as its participants were receiving a systems-based practice curriculum with overlapping domains of knowledge and skill sets.33,34 Moreover, while limited to a single site and residency program, the study includes complete data on 92% of the residents (n  =  86) for an entire academic year. Generalizability is also limited because it is not clear exactly how much time the faculty and clinician champions needed to support the projects outside of curriculum time. To quantify this commitment, we solicited weekly estimates from faculty and champions of the time spent outside the teaching sessions and the method of contact. Inconsistent responses limited our conclusions but provided some estimates: for the 23 responses provided, the average time spent with residents was 11.7 minutes per week (range, 0–60 minutes), mostly in face-to-face interactions. Of note, our faculty had prior experience in QI; this factor may also limit the ability of other programs to estimate faculty time commitments.

The SQI TAT is still in the development and testing phase and may be limited in its ability to detect improvement in the 3 domains, namely, application, self-efficacy, and knowledge. Further evaluation, including in other settings, is needed to validate and refine it. Nevertheless, this preliminary assessment tool demonstrated good psychometric properties consistent with a developmental progression of PBLI knowledge and application skills.26 

While the residents demonstrated change and worked to develop and implement system-relevant projects, we do not know whether the curriculum has affected, or will affect, their practices. Our assessment for these analyses focused on application skills, self-efficacy, and knowledge; however, other analyses have assessed the added value and sustainability of these projects for the institution.24 Our vision includes integrating a PBLI curriculum that nurtures learners in developing PBLI knowledge and skills and in applying them in such a way that CQI becomes part of everyday practice. Our experience during this pilot study suggests that a longitudinal curriculum spanning the 3 years of training is more likely to enhance the progression of improvement in application skills, self-efficacy, and knowledge, beyond that afforded by the 4-week immersion during the ambulatory block. It has been suggested that transfer of knowledge and skills is difficult and requires repeated exposure.32 Thus, repeated exposure to PBLI, combined with active mentoring of these experiences, should facilitate the transfer of improvement knowledge and application skills to different scenarios. Additionally, faculty and organizational support for this work by trainees is critical, as it validates its importance in advancing health care outcomes.

Residents participating in our comprehensive curriculum showed overall greater self-efficacy, comfort level, knowledge, and application of CQI. This observed impact reflects increased understanding of some CQI concepts fundamental to the development of improvement in health care.29 At the same time, we recognize that these necessary building blocks do not guarantee implementation and/or changes in everyday practices and thinking. We are hopeful that, coupled with the changes in self-efficacy and comfort, there is a solid foundation for further development of PBLI. We hope that our findings will encourage residency programs to pursue a dialogue about challenges and opportunities for creating a developmental approach to teaching and assessing PBLI. Our aim is to help nurture professionals who value and integrate PBLI into their professional practice, not merely know about PBLI.35 

1. American Board of Medical Specialties. Maintenance of certification: competencies and criteria.
2. Accreditation Council for Graduate Medical Education. Common program requirements: general competencies.
3. Windish DM, Reed DA, Boonyasai RT, Chakraborti C, Bass EB. Methodological rigor of quality improvement curricula for physician trainees: a systematic review and recommendations for change. Acad Med. 2009;84(12):1677–1692.
4. Ogrinc G, Headrick LA, Morrison LJ, Foster T. Teaching and assessing resident competence in practice-based learning and improvement. J Gen Intern Med. 2004;19(5, pt 2):496–500.
5. Weingart SN, Tess A, Driver J, Aronson MD, Sands K. Creating a quality improvement elective for medical house officers. J Gen Intern Med. 2004;19(8):861–867.
6. Holmboe ES, Prince L, Green M. Teaching and improving quality of care in a primary care internal medicine residency clinic. Acad Med. 2005;80(6):571–577.
7. Coleman MT, Nasraty S, Ostapchuk M, Wheeler S, Looney S, Rhodes S. Introducing practice-based learning and improvement ACGME core competencies into a family medicine residency curriculum. Jt Comm J Qual Saf. 2003;29(5):238–247.
8. Shunk R, Dulay M, Julian K, et al. Using the American Board of Internal Medicine practice improvement modules to teach internal medicine residents practice improvement. J Grad Med Educ. 2010;2(1):90–95.
9. Djuricich AM, Ciccarelli M, Swigonski NL. A continuous quality improvement curriculum for residents: addressing core competency, improving systems. Acad Med. 2004;79(10):S65–S67.
10. Oyler J, Vince L, Arora V, Johnson J. Teaching internal medicine residents quality improvement techniques using the ABIM's practice improvement modules. J Gen Intern Med. 2008;23(7):927–930.
11. Leenstra JL, Beckman TJ, Reed DA, et al. Validation of a method for assessing resident physicians' quality improvement proposals. J Gen Intern Med. 2007;22(9):1330–1334.
12. Nuovo J, Balsbaugh T, Barton S, et al. Development of a diabetes care management curriculum in a family practice residency program. Dis Manag. 2004;7(4):314–324.
13. Varkey P, Reller M, Smith A, Ponto J, Osborn M. An experiential interdisciplinary quality improvement education initiative. Am J Med Qual. 2006;21(5):317–322.
14. Frey K, Edwards F, Altman K, Spahr N, Gorman R. The "Collaborative Care" curriculum: an educational model addressing key ACGME core competencies in primary care residency training. Med Educ. 2003;37(9):786–789.
15. Mohr J, Randolph G, Laughon M, Schaff E. Integrating improvement competencies into residency education: a pilot project from a pediatric continuity clinic. Ambul Pediatr. 2003;3(3):131–136.
16. Landis S, Schwarz M, Curran D. North Carolina family medicine residency programs' diabetes learning collaborative. Fam Med. 2006;38(3):190–195.
17. Morrison LJ, Headrick LA. Teaching residents about practice-based learning and improvement. Jt Comm J Qual Patient Saf. 2008;34(8):453–459.
18. Canal DF, Torbeck L, Djuricich AM. Practice-based learning and improvement: a curriculum in continuous quality improvement for surgery residents. Arch Surg. 2007;142(5):479–483.
19. Ellrodt AG. Introduction of total quality management (TQM) into an internal medicine residency. Acad Med. 1993;68(11):817–823.
20. Peters AS, Kimura J, Ladden JD, March E, Moore GT. A self-instructional model to teach systems-based practice and practice-based learning and improvement. J Gen Intern Med. 2008;23(7):931–936.
21. Hildebrand C, Trowbridge E, Roach MA, Sullivan AG, Broman AT, Vogelman B. Resident self-assessment and self-reflection: University of Wisconsin-Madison's five-year study. J Gen Intern Med. 2009;24(3):361–365.
22. Fussell JJ, Farrar HC, Blaszak RT, Sisterhen LL. Incorporating the ACGME educational competencies into morbidity and mortality review conferences. Teach Learn Med. 2009;21(3):233–239.
23. Batalden P, Berwick D, Bisognano M, Splaine M, Baker G, Headrick L. Knowledge Domains for Health Professional Students Seeking Competency in the Continual Improvement and Innovation of Health Care. Boston, MA: Institute for Healthcare Improvement; 1998.
24. Tomolo AM, Lawrence RH, Aron DC. A case study of translating ACGME practice-based learning and improvement requirements into reality: systems quality improvement projects as the key component to a comprehensive curriculum. Qual Saf Health Care. 2009;18:217–224.
25. Ogrinc G, Headrick LA, Mutha S, Coleman MT, O'Donnell J, Miles PV. A framework for teaching medical students and residents about practice-based learning and improvement, synthesized from a literature review. Acad Med. 2003;78(7):748–756.
26. Lawrence RH, Tomolo AM. Development and preliminary evaluation of a practice-based learning and improvement tool for assessing resident competence and guiding curriculum development. J Grad Med Educ. 2011;3(1):41–48.
27. Splaine ME, Ogrinc G, Gilman SC, et al. The Department of Veterans Affairs National Quality Scholars Fellowship Program: experience from 10 years of training quality scholars. Acad Med. 2009;84(12):1741–1748.
28. Djuricich AM. A continuous quality improvement curriculum for residents. MedEdPORTAL/AAMC Web site. Available at: http://www.aamc.org/mededportal (search ID = 468). Accessed July 21, 2010.
29. Batalden PB, Davidoff F. What is quality improvement and how can it transform healthcare? Qual Saf Health Care. 2007;16(1):2–3.
30. Langley GJ, Nolan KM, Nolan TW, et al. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. San Francisco, CA: Jossey-Bass; 1996.
31. Ogrinc G, West A, Eliassen MS, Liuw S, Schiffman J, Cochran N. Integrating practice-based learning and improvement into medical student learning: evaluating complex curricular innovations. Teach Learn Med. 2007;19(3):221–229.
32. Eva KW, Neville AJ, Norman GR. Exploring the etiology of content specificity: factors influencing analogic transfer and problem solving. Acad Med. 1998;73(10):S1–S5.
33. Accreditation Council for Graduate Medical Education. Advancing education in practice-based learning and improvement: an educational resource from the ACGME Outcome Project.
34. Varkey P, Karlapudi S, Rose S, Nelson R, Warner M. A systems approach for implementing practice-based learning and improvement and systems-based practice in graduate medical education.
.
Acad Med
.
2009
;
84
(
3
):
335
339
.
35.
Batalden
P
,
Davidoff
F
.
Teaching quality improvement: the devil is in the details
.
JAMA
.
2007
;
298
(
9
):
1059
1061
.

Author notes

At the time of this writing, Anne M. Tomolo, MD, MPH, was a staff physician in Medical Service and the Department of Veterans Affairs Health Services Research and Development's (VA HSR&D) Center for Implementation Practice and Research Support at the Louis Stokes Cleveland Department of Veterans Affairs Medical Center, and an Assistant Professor, Department of Medicine, at Case Western Reserve University School of Medicine. Dr. Tomolo is currently affiliated with the Atlanta Veterans Affairs Medical Center. Renée H. Lawrence, PhD, is a health research scientist at Medical Service and the VA HSR&D Center for Implementation Practice and Research Support at the Louis Stokes Cleveland Department of Veterans Affairs Medical Center. Brook Watts, MD, MS, is a staff physician at Medical Service and the VA HSR&D Center for Implementation Practice and Research Support at the Louis Stokes Cleveland Department of Veterans Affairs Medical Center, and an Assistant Professor, Department of Medicine, at Case Western Reserve University School of Medicine. Sarah Augustine, MD, is a staff physician at Medical Service at the Louis Stokes Cleveland Department of Veterans Affairs Medical Center, and an Assistant Professor, Department of Medicine, at Case Western Reserve University School of Medicine. David C. Aron, MD, MS, is a staff physician at Medical Service and the VA HSR&D Center for Implementation Practice and Research Support at the Louis Stokes Cleveland Department of Veterans Affairs Medical Center, and a Professor, Department of Medicine, at Case Western Reserve University School of Medicine. Mamta K. Singh, MD, MS, is a staff physician at Medical Service at the Louis Stokes Cleveland Department of Veterans Affairs Medical Center, and an Assistant Professor, Department of Medicine, at Case Western Reserve University School of Medicine.

This work was funded in part by a Department of Veterans Affairs Health Services Research and Development Service SHP grant (08-194, D.C.A. and A.M.T.). All statements and descriptions expressed herein do not necessarily reflect the opinions or positions of the Department of Veterans Affairs.