Study Objective

The purpose of this study was to develop an objective method of evaluating resident competency in systems-based practice.

Study Design

Faculty developed a 12-station examination, the Objective Structured System-Interaction Examination (OSSIE), patterned after the Objective Structured Clinical Examination (OSCE), to evaluate residents' ability to work effectively within the complex medical system of care. Scenarios consisted of multiple situations, such as patient hand-offs, consultations, complicated discharges, and family meetings, in which residents interacted with simulated professionals, simulated patients, and simulated family members to demonstrate systems-based skills. Twelve second-year residents participated in the OSSIE.

Findings

Along with the standardized professionals, a faculty member provided the resident with immediate feedback and completed an evaluation form designed specifically to assess systems-based practice. Residents, faculty, and staff evaluated the OSSIE and felt it provided a rich learning experience and was a beneficial means of formative assessment. The residents' third-year learning experiences were adapted to meet their needs, and suggestions were offered for curriculum revision.

Discussion

The OSSIE is unique in that it uses standardized professionals, involves scenarios in a variety of settings, and incorporates current technology, including an electronic health record and a state-of-the-art simulation laboratory, into the examination. Challenges to implementation include faculty time, scheduling of residents, and availability of resources.

Conclusion

By using the OSSIE, faculty are able to assess, provide constructive feedback, and tailor training opportunities to improve resident competence in systems-based practice. Reliability and validity of an instrument developed for use with the OSSIE are currently being determined.

The definition of a competent physician is evolving from “the doctor who possesses the right attributes” to “the doctor who does the right thing.” The public is concerned about the delivery of competent care, which can be assessed by concrete measures such as quality, safety, and cost. Competent care is provided through collaboration with the other members of the health care team and is proven by good outcomes.1 In light of this new definition of competence, the Accreditation Council for Graduate Medical Education (ACGME) has identified systems-based practice as 1 of the 6 core competencies needed by residents to practice effectively in today's health care milieu. To be competent in systems-based practice, “residents must demonstrate an awareness of and responsiveness to the larger context and system of health care, as well as the ability to call effectively on other resources in the system to provide optimal health care.”2 Good patient care no longer resides in the hands of an individual physician. Rather, residents are challenged to work effectively with a team of professionals, coordinate care among them, consider costs when weighing risks and benefits, and help the patient navigate the complex health care system.3–6 

Competency in systems-based practice has posed many challenges, related both to how to teach it and to how to assess resident competency in this complex area. In a 2005 survey, program directors ranked systems-based practice lowest in importance among the competencies. At the time of the survey, 70.4% of the programs had no method to evaluate systems-based practice.7 A taxonomy of observable behaviors and a method of evaluation that allows observation of these behaviors are needed.8 The purpose of this article is to explore the difficulties with current methods used to measure systems-based practice and to describe the development of the Objective Structured System-Interaction Examination (OSSIE). The OSSIE is patterned after the format of the Objective Structured Clinical Examination (OSCE), which is commonly used in medical schools. As an evaluation method, the OSSIE addresses some of the difficulties of current methods of evaluating systems-based practice, including subjectivity, inconsistency, and lack of structure.

Typically, systems-based practice has been evaluated by using subjective measures.9–11 The 360-degree global rating is commonly recommended to measure residents' performance in the larger system of care and the resident's ability to advocate for the patient. In the 360-degree evaluation, physician supervisors, other residents, nurses, patients, and families complete surveys about the resident's teamwork, communication, management skills, and decision making. However, it is difficult to construct a single survey that can be used by all the various evaluators, which makes 360-degree evaluations less feasible in many clinical environments. Additionally, a large number of evaluations must be completed for each resident to obtain reliable results.9 To obtain a high degree of reliability, at least 40 surveys are needed for each resident from each type of evaluator.12–14 This is not a reasonable target, given the current construct of most residency programs. In 1 study, no significant correlation was found between the 360-degree evaluation during the first year of residency and the same subject's 4th-year objective measures, including OSCE performance, class rank, and United States Medical Licensing Examination (USMLE) scores.12 The 360-degree evaluation may also be limited by language and literacy problems for patients and families. Another limiting factor for the use of 360-degree evaluations is that a fair amount of resources in staff and time must be allotted to collect, aggregate, and report survey responses. Finally, little information can be retrieved from these surveys that can be used for constructive feedback.13,14 

Checklists are another approach for evaluating systems-based practice. By using this method, a checklist of all desired activities that reflect systems-based practice is developed. This method is limited in that the resident has to be observed in multiple situations in which the issues occur. Additionally, for consistent scores to be obtained, observers must be well trained to focus on the skills of systems-based practice.13 

The multiple-choice examination is the only objective measure that has been used in assessment of systems-based practice. The multiple-choice examination has been used to evaluate knowledge gained from innovative programs that explored the complex health care system.15–17 While the multiple-choice examination can effectively test the resident's knowledge of systems-based practice, it does not lend itself to evaluation of the resident's ability to put this knowledge into action.13 Outcome-based methods are needed to evaluate other components of systems-based practice, such as safety errors of omission that involve the health care system. Additionally, methods are needed that facilitate provision of constructive feedback to residents to improve their competency in systems-based practice.18 

The ACGME has proposed the use of simulations, OSCEs, and standardized patients (SPs) for assessing resident performance and for providing formative evaluations. These approaches place residents in circumstances that simulate real-life situations. Objective Structured Clinical Examinations standardize the evaluation by controlling the variance that is inherent in the real-life situations.13,19 In a 2005 survey, only 11.1% of programs reported using any type of simulation for evaluation of residents.7 

Petrusa20 suggested that simulations portraying multiple persons in complex interactions should be developed for resident evaluations. Some of the shortcomings of the 360-degree evaluation, checklists, and patient surveys might be diminished by using a modified OSCE in which situations that reflect systems-based practice are simulated with SPs, standardized families, and other health care professionals. Video recordings of the evaluations would allow residents to view their own performance along with faculty. This method could be a useful tool for formative assessment.

Numerous studies support the psychometric properties of OSCEs. Twelve to 18 stations are recommended to obtain reliable measurements of performance. A separate performance score is determined for each station. Then scores are combined to determine a pass/fail score.18 Both construct and concurrent validity have been established.21 

With the use of OSCEs, SPs as well as physician observers can evaluate the participants. Standardized patients supply the patient-oriented or team-member perspective. McLaughlin22 suggested that the SP evaluation within an OSCE is a valuable tool for formative evaluations. Scores given by SPs correlate to those given by physicians; however, the SP score generally is higher than the physician score. McLaughlin speculated that SPs may not be able to differentiate students with superficial medical knowledge from those with extensive grasp of the medical problem. By contrast, SPs do not overrate students' performance in communication and effectiveness of the physician/patient interaction.22 

A potential drawback of development of the OSCE for resident evaluations is the faculty time required to create appropriate cases. Objective Structured Clinical Examinations may also be an expensive way to assess residents in terms of faculty time and SP training. Fortunately, however, most medical schools already have structure in place for recruiting and training SPs, which might be shared by residency programs. The undergraduate staff is skilled in developing and managing OSCEs. These resources could facilitate use of OSCEs for other purposes such as resident evaluation.13 

In response to the challenge of measuring resident competency in systems-based practice, the faculty of the Division of General Internal Medicine at Southern Illinois University School of Medicine developed an innovative method of evaluating systems-based practice. Using recommendations from the literature, the faculty developed an objective form of measurement patterned after the OSCE. We now describe the steps in the development of the OSSIE for evaluating the resident's ability to interact with the health care team and coordinate care across settings.

We invited all general internal medicine faculty to develop an OSSIE case. Initially, 4 faculty members volunteered to write the cases. Since systems-based practice encompasses many skills, the OSSIE cases took a variety of forms. In the diverse scenarios, the physician needed to consider cost, risk versus benefits, safety issues, and problems created by inefficient functioning of the health care system. We planned 1 pilot evaluation session that would include the 4 OSSIEs developed by faculty volunteers.

Throughout our initial effort to develop cases, we targeted 1 of the ACGME's skills for systems-based practice for each OSSIE case. During our pilot OSSIE, residents were required to contact and arrange for various resources, work with the health care team, and coordinate care across settings. One case focused on utilization of resources, the second on cost-effective care, the third on working with a health care team, and the fourth on coordination of care.

Development and implementation of the OSSIE was a collaborative effort among the faculty of General Internal Medicine, the Office of Education and Curriculum, and the Internal Medicine Residency Program. The Office of Education and Curriculum provided expertise and resources. A curriculum specialist with experience in developing cases and conducting standardized patient encounters and OSCEs with medical students helped to detail the cases, write training notes, organize the evaluation sessions, and train the SPs. The professional development laboratory was made available for our use. The laboratory is equipped with 8 rooms surrounding the observation room, which has 1-way windows through which faculty can observe the encounters.

Approximately 6 months were allotted to the development and planning of the OSSIE evaluation. Training of the SPs was done 1 week before the OSSIE evaluation. Faculty who wrote a case participated in the training session for that case. The afternoon before the OSSIE was devoted to preparing the rooms in the laboratory. During the first evaluation, 13 second-year internal medicine residents participated. The faculty who created the cases served as the observers. Each case allowed 30 minutes for the clinical encounter, with 5 minutes reserved for evaluation and 5 minutes for travel to the next OSSIE. Immediately after completing each OSSIE, the resident received feedback from the faculty and SPs. Each resident was videotaped performing the OSSIE for the pilot. The OSSIE was considered a method of formative evaluation; therefore, no “pass” or “fail” ratings were given. Faculty and SPs helped residents examine their own performance for strengths and weaknesses and identify areas for improvement to target during their third year of residency.

During the next 4 months, faculty reviewed all of the videotapes, focusing on the ACGME skills for systems-based practice. Faculty found that although each case was intended to target only 1 of the systems-based skills, it was difficult to isolate a single skill because of the interrelatedness of the skills necessary to function within a system. With small revisions, each case now addresses most, if not all, of the skills. During this time, we also began developing an instrument to measure systems-based practice for use with the OSSIE. We are currently determining the psychometric properties of the instrument and will modify it accordingly.

Following the pilot-year administration of the 4-case OSSIE, 8 additional cases were developed during the next 6 months. Faculty began the development process with the list of skills required to meet the competency of systems-based practice and then developed cases with elements that could be used to evaluate all the skills. The set of 12 cases will be used in the evaluation of the second-year internal medicine residents during the next academic year.

Brief descriptions of the OSSIE cases are included in table 1.

The OSSIE allowed for direct observation of the residents, during which specific deficiencies or concerns were identified. These were shared with the program director and chief residents so that adaptations to the individual resident's schedule could be made to include learning experiences that would help address those concerns.

Findings

All residents, faculty, and staff involved felt that this was a successful, useful endeavor. Structured scenarios and the opportunity for feedback from multiple observers created a rich learning environment for the residents. For example, residents were appreciative of the end-of-life scenario about a patient who was not responding to treatment. The resident led a family meeting with the patient's wife, daughter, nurse, and ethicist. After the scenario was completed, the ethicist, as well as the faculty, provided feedback to the resident. The OSSIE provides opportunities for residents to learn from this and other difficult situations in a controlled environment.

Residents were asked for feedback and to rate the cases for learning potential. As can be seen from table 2, the residents rated the code blue scenario, chronic disease management of a patient with narcotic addiction, and the family meeting discussion as the top 3 cases. These are 3 of the most complex cases, and all involve challenging and stressful situations.

Finally, faculty were asked to give curricular feedback based on their observations. We were able to identify several deficiencies in our curriculum, such as lack of understanding of how to determine patient competency for informed consent and how to obtain informed consent, how to communicate with patients about decision making, how to encourage team communication, and how to communicate empathetically with family members during stressful times. Curricular changes are underway to address these deficiencies.

Unique Aspects of the OSSIE

Although patterned after the OSCE, the OSSIE has some unique aspects.

Standardized Professionals

In addition to simulated patients, the OSSIE utilizes simulated members of the health care team. Actors who were accustomed to portraying patients were trained to act as standardized professionals, such as nurses, radiology technicians, other physicians, or pharmacists. Many of the encounters required multiple participants, such as a patient, a family member, and various members of the health care team. In 1 scenario, which centered on a family meeting, an actual clinical ethicist participated. Because of the unique aspects of the case, we found it beneficial to have someone very knowledgeable in ethics involved in the case and able to offer feedback after the OSSIE was completed.

Training standardized patients for professional roles necessitated unique approaches. In training a standardized patient, the trainer can focus on teaching the actor the history and symptoms of the patient being portrayed. The standardized patient has detailed knowledge of the case and can be fully prepared to respond to questions. The format is predictable: the patient knows that the student will obtain a history and then perform an examination. Portraying a professional is quite different. The standardized professional can be provided with some knowledge of the case. However, there is more than 1 way a resident might approach issues involving systems-based practice; thus, how the encounter will flow is much less predictable. In training the standardized professional, the trainer focuses heavily on the mood and countenance of the responses to the resident's communications. The standardized professional can be briefed on anticipated actions, but as the OSSIE unfolds, the resident may choose a different course. This higher degree of unpredictability requires extemporaneous responses from the actor. For this reason, in choosing standardized patients to play professionals, the curriculum specialist considered many aspects of their backgrounds and talents to ensure a good fit.

Variety of Settings

In contrast to most standardized patient encounters, which are set in a simulated clinical office, the OSSIE cases used a variety of simulated settings. Objective Structured System-Interaction Examination cases were situated in simulated hospital rooms, physician offices, nurses' stations, and resident lounges. These settings required desks and other furniture instead of examination tables. In addition, charts, x-rays, and other props had to be prepared for accurate simulation. Some charts had to replicate hospital records, creating the need for simulated history and physical documents, progress notes, laboratory reports, and radiology reports. Many hours were devoted to ensuring that these documents looked as authentic as possible.

Use of Technology

To realistically simulate systems-based scenarios, technology including telephones, pagers, and computers with electronic health records was used. In some cases, the resident needed to telephone another health professional. A phone had to be connected in the examination room and a number provided to a nearby room where the standardized professional would be waiting to receive the call. An integral component of systems-based practice is being able to use the electronic health records efficiently and accurately. Thus, for many of the cases, simulated patient information had to be entered into the test electronic health record, and residents were required to use or enter information appropriately.

Challenges in Implementation

Lack of Familiarity of Residents with Standardized Encounters

Medical students educated in the United States are quite familiar and comfortable with testing situations that involve standardized patient encounters. However, many internal medicine residents in the United States are graduates of medical schools in other countries, and most international medical graduates have rarely participated in simulated encounters. Thus, we anticipated a higher level of anxiety among the international residents. To lessen the residents' apprehension, the formative nature of the evaluation was emphasized before the examination. Residents were told that there were no scores, ratings, or rankings and that the OSSIE was a method of individualizing their third year of residency.

Faculty Time for Planning and Development

A large amount of time was required for the development of the OSSIE cases. Faculty who wrote the cases estimated that they spent 4 hours composing each case. Much thought had to go into each scenario, and special attention was given to the setting and the props to make the situation as realistic as possible. For some OSSIE cases, documents such as hospital records with complete sets of progress notes had to be created, and time was needed to create the electronic records. Training notes had to be developed for the standardized professionals; because the portrayal of a professional is quite different from that of a patient, more time was required for this process. On the day of the OSSIE, faculty members were needed to observe the 4 cases over an 8-hour period as each of the residents rotated through each encounter.

Scheduling of Residents

One of the greatest challenges was scheduling the residents. The simulation laboratory had to be reserved for a time when it was not in use by the medical school. The Residency Office staff and the chief residents, who were familiar with the schedule of each resident, were integral to the efficient coordination of the OSSIE. Planning the schedule began 6 months in advance to enable divisions to plan for residents' absences. Each OSSIE case required about 40 minutes to complete, so each resident had to be present in the laboratory for approximately 4½ hours to complete 4 OSSIEs. The residents' previously scheduled rotations were interrupted, and the faculty in charge of those rotations were made aware of the OSSIE evaluation. To keep residents on schedule, they were paged 1 hour before their scheduled time.

Resource Requirements

In addition to the commitment of faculty time to the development and implementation of the OSSIEs, a significant level of resources is needed to hire and train the standardized personnel required for each scenario as well as to equip the physical setting for each case. The simulation laboratory with its videotaping capability, as well as the staff of the Office of Education and Curriculum, already part of our institution's infrastructure, provided both intellectual and physical resources for the development of the OSSIE.

The OSSIE is a new method for evaluating the skills of systems-based practice. Although we do not yet have outcomes data to share, we received positive comments from the residents and other participants. As well as being a method of evaluation, the OSSIE stimulated self-reflection and further learning as residents responded to feedback following the sessions. The first 4 OSSIE cases were videotaped and qualitatively examined to develop an instrument to measure systems-based practice. With the 8 cases developed after the pilot, 12 cases now exist that together form a full evaluation session for assessing systems-based practice. The instrument to measure systems-based practice will then be applied by the observing faculty and the standardized professionals in the 12 OSSIE sessions. We are currently determining the reliability of the instrument when used by faculty observers and also by lay observers. Reliability and validity, as well as the evaluation tool, will be reported in a future article. From our pilot study, we can conclude that the OSSIE (1) was well received by residents, (2) was evidence-based as an objective measurement, (3) provided a rich learning experience for the residents, and (4) is a potential measurement activity for systems-based practice.

References

1. Klass D. A performance-based conception of competence is changing the regulation of physicians' behavior. Acad Med. 2007;82(6):529–534.

2. Accreditation Council for Graduate Medical Education. Common Program Requirements: General Competencies. February 13, 2007.

3. Nadolski GJ, Bell MA, Brewer BB, Frankel RM, Cushing HE, Brokaw JJ. Evaluating the quality of interaction between medical students and nurses in a large teaching hospital. BMC Med Educ. 2006;6:23.

4. Stokes T, Tarrant C, Mainous AG, Schers H, Freeman G, Baker R. Continuity of care: is the personal doctor still important? A survey of general practitioners and family physicians in England and Wales, the United States, and the Netherlands. Ann Fam Med. 2005;3(4):353–359.

5. David RA, Reich LM. The creation and evaluation of a systems-based practice/managed care curriculum in a primary care internal medicine residency program. Mt Sinai J Med. 2005;72(5):296–299.

6. Povar GJ, Blumen H, Daniel J. Ethics in practice: managed care and the changing health care environment. Ann Intern Med. 2004;141:131–136.

7. Delzell JE, Ringdahl EN, Kruse RL. The ACGME core competencies: a national survey of family medicine program directors. Fam Med. 2005;37(8):576–580.

8. Graham MJ, Naqvi Z, Encandela JA. What indicates competency in systems based practice? An analysis of perspective consistency among healthcare team members. Adv Health Sci Educ Theory Pract. 2009;14(2):187–203.

9. Englander R, Agostinucci W, Zalneraiti E, Carraccio CL. Teaching residents systems-based practice through a hospital cost-reduction program: a “win-win” situation. Teach Learn Med. 2006;18(2):150–152.

10. Rivo MS, Keller DR, Teherani A, O'Connell MT, Weiss BA, Rubenstein SA. Practicing effectively in today's health system: teaching systems-based care. Fam Med. 2004;36(suppl):S63–S70.

11. Ziegelstein RC, Fiebach NH. “The mirror” and “the village”: a new method for teaching practice-based learning and improvement and systems-based practice. Acad Med. 2004;79(1):83–88.

12. Kahn MJ, Merrill WW, Anderson DS, Szerlip HM. Residency program director evaluations do not correlate with performance on a required 4th-year objective structured clinical examination. Teach Learn Med. 2001;13(1):9–12.

13. Accreditation Council for Graduate Medical Education, American Board of Medical Specialties. Toolbox of Assessment Methods. September 2000.

14. Silber CG, Nasca TJ, Paskin DL, Eiger G, Robeson M, Veloski J. Do global rating forms enable program directors to assess the ACGME competencies? Acad Med. 2004;79(6):549–556.

15. Peters AS, Kimura J, Ladden MD, March E, Moore G. A self-instructional model to teach systems-based practice and practice-based learning and improvement. J Gen Intern Med. 2008;23(7):931–936.

16. Turley CB, Roach R, Marx M. Systems survivor: a program for house staff in systems-based practice. Teach Learn Med. 2007;19(2):128–138.

17. Kerfoot P, Conlin P, Travison T, McMahon T. Web-based education in systems-based practice. Arch Intern Med. 2007;167:361–366.

18. Battles JB, Wilkinson SL, Lee SJ. Using standardised patients in an objective structured clinical examination as a patient safety tool. Qual Saf Health Care. 2004;13(suppl 1):i46–i50.

19. Simpson D, Helm R, Drewniak T. Objective structured video examinations (OSVEs) for geriatrics education. Gerontol Geriatr Educ. 2006;26(4):7–24.

20. Petrusa ER. Taking standardized patient-based examinations to the next level. Teach Learn Med. 2004;16(1):98–110.

21. Kissela B, Harris S, Kleindorfer D. The use of standardized patients for mock oral board exams in neurology: a pilot study. BMC Med Educ. 2006;6:22.

22. McLaughlin K, Gregor L, Jones A, Coderre S. Can standardized patients replace physicians as OSCE examiners? BMC Med Educ. 2006;6:12.

Author notes

Richard B. Rosher, MD, is Professor of Internal Medicine at Southern Illinois University School of Medicine; Susan Hingle, MD, is Associate Professor of Internal Medicine at Southern Illinois University School of Medicine; Sherry Robinson, PhD, RNCS, is Assistant Professor of Internal Medicine and Clinical Nurse Specialist at Southern Illinois University School of Medicine; Nancy McCann-Stone, MA, is Curriculum Development Specialist at Southern Illinois University School of Medicine; Christine Todd, MD, is Associate Professor of Internal Medicine at Southern Illinois University School of Medicine; and Michael Clark, MA, is Educational Innovations and Practice Coordinator at Southern Illinois University School of Medicine.

Financial support for this project was provided through the Stemmler Medical Education Research Fund, National Board of Medical Examiners.