Background

Physical exam skills of medical trainees are declining, but most residencies do not offer systematic clinical skills teaching or assessment.

Objective

To assess knowledge of clinical signs and physical exam performance among incoming internal medicine residents.

Method

For this study, 45 incoming residents completed a multiple choice question test to assess knowledge of clinical signs. A random selection of 20 underwent a faculty-observed objective structured clinical examination (OSCE) using patients with abnormal physical findings. Mean percentage scores were computed for the multiple choice question test, overall OSCE, and the 5 individual OSCE systems.

Results

The mean scores were 58.4% (14.6 of 25; SD 11.5) for the multiple choice question test and 54.7% (31.7 of 58; SD 11.0) for the overall OSCE. Mean OSCE scores by system were cardiovascular 30.0%, pulmonary 69.2%, abdominal 61.6%, neurologic 67.0%, and musculoskeletal 41.7%. Analysis of variance showed a difference in OSCE system scores (P < .001), with cardiovascular and musculoskeletal scores significantly lower than those of the other systems.

Conclusion

Overall, physical exam knowledge and performance of new residents were unsatisfactory. There appears to be a pressing need for additional clinical skills training during medical school and residency, and we are planning a new clinical skills curriculum to address this deficiency.

Editor's Note: The online version of this article includes the sample questions for the multiple choice test and the faculty scoring sheet for the cardiac exam OSCE station.

Physical examination skills traditionally have been viewed as among the most valuable skills taught during medical education,1-4 contributing to more cost-effective use of diagnostic services while rewarding physicians with the excitement and satisfaction of making a diagnosis using their knowledge and skills.1,2 These skills also increase direct contact with patients, and the therapeutic value of the human touch is impossible to quantify.2

Several investigators have reported an overall decline in the clinical skills of medical students and residents,5-10 with residents less well prepared to take an adequate medical history, perform a reliable physical examination, and communicate effectively with patients,11 and more reliant on ordering tests they do not always know how to interpret.1,11 In an era of increasing health care costs, we need to reconsider the importance of physical examination skills.1,2

Despite documented deficiencies in clinical skills, medical school and residency curricula do not emphasize clinical skills teaching or assessment.3,7,12 Reported barriers to teaching clinical exam skills include a scarcity of good teaching patients, lack of time for teaching at the bedside, an over-reliance on technology, and a shortage of skilled faculty to impart this knowledge.11,13 

Before developing a new clinical skills curriculum for our internal medicine residency program, we wished to explore the physical exam skills of our incoming residents as a needs assessment.

Our study objectives were

  1. To investigate the knowledge of clinical signs as well as the physical exam skills of new postgraduate year 1 (PGY-1) residents using volunteer patients during an objective structured clinical examination (OSCE)

  2. To explore system-specific strengths and weaknesses in their physical exam skills

Setting and Participants

Incoming internal medicine PGY-1 residents at Boston University School of Medicine completed a written multiple choice question (MCQ) test on clinical signs, and a randomly selected subsample completed a physical exam assessment, during their residency orientation in June 2006. These tests were designed as a pretest prior to implementation of a new clinical skills curriculum. The protocol for this study was approved by the Institutional Review Board at Boston University School of Medicine.

A planning committee consisting of generalist and subspecialist faculty from the Department of Medicine discussed and finalized the MCQ test questions and the OSCE scoring sheets after review of questions from the Membership of the Royal College of Physicians and other examinations as well as detailed discussions of essential elements of system-specific physical examination. This core group of faculty consistently taught residents bedside clinical skills, had a reputation of being skilled clinical diagnosticians, and served as preceptors for the OSCE.

The written test consisted of 25 MCQs designed to evaluate the ability of the residents to interpret and diagnose physical exam findings. The 5 major systems (cardiovascular, neurologic, pulmonary, gastrointestinal, and musculoskeletal) were represented.

The OSCE used volunteer patients with abnormal physical findings recruited from the medical wards and clinics. The 5 stations covered the cardiac, pulmonary, abdominal, neurologic, and musculoskeletal systems. At each station a faculty examiner instructed the resident to perform a focused physical exam, completed a scoring sheet, and provided feedback, all within 10 minutes. All faculty preceptors underwent an orientation to the procedures for observation, feedback, and scoring during the OSCE. A sample score sheet is included in Appendix B. Each item on the scoring sheet was graded on a 3-point rating scale from 0 to 2 points: 2 points were awarded if the element was performed correctly, 1 point if the element was performed with room for improvement, and no points if the element was omitted. The OSCE encompassed 5 to 6 elements on exam technique and 1 to 2 elements on interpretation and diagnosis per station, for a maximum total score of 58 points.
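For illustration, a minimal sketch (in Python, not part of the study) of how a station percentage could be computed from 0 to 2 element ratings under this scheme; the element count and ratings below are hypothetical, not items from the actual scoring sheets:

```python
# Illustrative sketch only: the real scoring-sheet items are defined in
# Appendix B; the ratings here are hypothetical.
def station_percent(ratings):
    """Convert a list of 0-2 element ratings into a percent score."""
    if any(r not in (0, 1, 2) for r in ratings):
        raise ValueError("each element must be rated 0, 1, or 2")
    max_points = 2 * len(ratings)  # 2 points possible per element
    return 100 * sum(ratings) / max_points

# Example: a hypothetical 7-element station
# (5 technique elements plus 2 interpretation/diagnosis elements).
cardiac_ratings = [2, 1, 0, 1, 2, 1, 0]
print(round(station_percent(cardiac_ratings), 1))  # 50.0
```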

Statistical Analysis

Descriptive analysis was performed on the data collected for the PGY-1 cohort, and we calculated mean percentage scores for the MCQ test and the OSCE. The OSCE scores were then analyzed by individual organ system to examine whether physical exam performance differed across systems. Scores for each individual system were ranked and then compared using analysis of variance. If a difference was detected, we explored further where the differences lay by pair-wise comparison using Tukey minimum significant differences. All analyses were run at the α = .05 level using SAS version 9.0 (SAS Inc, Cary, NC) and Excel XP (Microsoft, Redmond, WA).
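As a rough, non-authoritative sketch of these steps, the snippet below runs a one-way analysis of variance followed by pairwise Tukey comparisons at α = .05 in Python. The authors used SAS 9.0; the scores here are simulated around the reported system means (not the study data), and for simplicity the ANOVA is run on the raw simulated scores rather than on ranks as described above.

```python
# Illustrative sketch only: simulated scores, not the study data.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
systems = ["cardiovascular", "pulmonary", "abdominal", "neurologic", "musculoskeletal"]
reported_means = [30.0, 69.2, 61.6, 67.0, 41.7]  # percent means from the table

# Simulate 20 residents' percent scores per system (hypothetical spread of 10).
scores = {s: rng.normal(m, 10, size=20) for s, m in zip(systems, reported_means)}

# One-way ANOVA across the 5 systems.
f_stat, p_value = f_oneway(*scores.values())
print(f"ANOVA: F = {f_stat:.2f}, P = {p_value:.4f}")

# Pairwise Tukey comparisons to locate where the differences lie.
all_scores = np.concatenate(list(scores.values()))
groups = np.repeat(systems, 20)
print(pairwise_tukeyhsd(all_scores, groups, alpha=0.05))
```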

Results

A total of 45 internal medicine PGY-1 residents at Boston University School of Medicine completed the MCQ test and 20 completed the 5-station physical exam OSCE. Most were US medical graduates from several different medical schools; 2 were international medical graduates.

The overall mean score for the written test was 58.4%, with a standard deviation of 11.5 and a range of 36.0 to 80.0 (n = 45). There was no statistically significant difference between the scores of the residents chosen to undergo the OSCE and those not selected (P = .261).
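The article does not state which test produced this P value; a two-sample t test comparing the MCQ scores of the 20 residents selected for the OSCE with those of the 25 who were not selected would be one plausible approach, sketched below with simulated scores rather than the study data.

```python
# Hypothetical sketch: the specific test behind P = .261 is not reported,
# and these scores are simulated, not the study data.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
osce_group = rng.normal(58.4, 11.5, size=20)      # residents selected for the OSCE
non_osce_group = rng.normal(58.4, 11.5, size=25)  # residents not selected

t_stat, p_value = ttest_ind(osce_group, non_osce_group)
print(f"t = {t_stat:.2f}, P = {p_value:.3f}")
```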

The mean overall OSCE score was 54.7%, with a standard deviation of 11.0 and a range of 39.7 to 84.5 (n = 20). Analysis of variance showed significant differences in the OSCE scores for individual systems (P < .05), with the cardiovascular and musculoskeletal examination scores being significantly lower than the pulmonary, neurologic, and abdominal examination scores. The overall MCQ, OSCE, and individual system scores are shown in the table.

Table. Mean Percent Scores

Multiple choice question test: 58.4 (SD 11.5)
Overall OSCE: 54.7 (SD 11.0)
OSCE cardiovascular station: 30.0
OSCE pulmonary station: 69.2
OSCE abdominal station: 61.6
OSCE neurologic station: 67.0
OSCE musculoskeletal station: 41.7

Examples of errors observed in the residents' physical exams included

  1. Faulty exam technique

    a. Not using both the bell and diaphragm of the stethoscope

    b. Not eliciting shifting dullness correctly

  2. Lack of a systematic exam

    a. Skipping inspection or palpation completely

    b. Not following a stepwise exam (eg, motor strength, tone, reflexes, gait)

  3. Failure to identify findings

    a. Not identifying a diastolic murmur

    b. Not identifying bronchial breath sounds

  4. Failure to interpret findings and make a diagnosis

    a. Not differentiating between upper and lower motor neuron signs

  5. Difficulty formulating a differential diagnosis for a given finding

    a. Causes of ascites

    b. Causes of knee effusions

Discussion

The newly graduated medical students entering internal medicine residency in our study scored less than 60% on average on both a knowledge test of clinical signs and a physical exam OSCE. Errors were noted in physical exam technique as well as in diagnosis. Many studies have reported less than satisfactory physical exam skills among trainees. Dupras and Li14 found mean scores of 50 ± 11 in the physical examination and diagnosis stations of an OSCE assessing postgraduate year 2 internal medicine residents. Vukanovic-Criley and colleagues9 reported that cardiac exam skills improved between years 1 and 2 of medical school and reached a plateau thereafter.

The variability in exam performance between systems may reflect different exposures during medical school clinical rotations and differing levels of interest in the given field. However, given how far the cardiovascular and musculoskeletal scores fell below those of the other systems, we surmise that there are deficiencies in the teaching of both.

The strengths of this study include the use of trained faculty members to observe and score the OSCE, allowing for more consistent scoring, and the use of real patients with abnormal physical examination findings rather than standardized patients with scripted histories and answers to questions. This is a more accurate representation of the kinds of encounters that medical trainees will face in clinical practice. The OSCE also served as a learning tool, as participants received immediate feedback from faculty on their physical examination skills.

Our study has some limitations. We tried to limit variability in scoring through faculty orientation and the use of objective, easily observable physical exam behaviors, but different faculty observers may exhibit variable leniency in scoring. There also may have been variability in the degree of difficulty between patients at different stations. We present cross-sectional data that may reflect many different factors, including prior medical school training and interest. We hope to ascertain in subsequent studies the trajectory of knowledge and skills following implementation of a clinical skills curriculum. Finally, we did not set out to test the psychometric properties of our assessment instrument; we addressed content validity to ensure that the instrument measured what we intended to measure, but future studies need to test instruments that can assess physical diagnosis skills with precision and reliability.

There is a strong indication that additional clinical skills training and assessment are needed during medical school as well as residency training. Standardized patient examinations should supplement, but not replace, direct observation of trainees, as only real patient interactions can help educators document longitudinal growth of trainees' clinical skills.14 Questions have also been raised about the validity of standardized patient encounters alone in assessing higher-level trainees, as they cannot assess diagnostic skills.15-17 Faculty development is crucial as well, since many clinical faculty do not feel confident in their own clinical skills.13

Although many educators espouse the value of clinical skills,13,18 some skeptics have questioned whether these skills are anachronistic.19 Future studies will need to investigate whether a formal curriculum will improve and sustain the clinical skills of residents. The predictive values of many physical examination techniques and findings have been evaluated,2,18 and the JAMA Rational Clinical Examination series is the foremost example of evidence-based physical diagnosis.20 It is vital that clinical educators continue to study the clinical utility of physical findings and discard signs or maneuvers of little value; modern technology can be very helpful in achieving these goals.

Our pilot study examined the status of physical exam skills in a sample of newly graduated medical students. It is one step in a comprehensive attempt to characterize the physical examination deficiencies among our residents and to advance our understanding of the reasons underlying them. Ideally, before the findings are used to design a clinical skills curriculum, a larger group of residents should be studied with precurriculum and postcurriculum assessments, and the reliability and validity of instruments to assess physical diagnosis skills should be tested, to determine whether previously validated OSCE assessments would be useful. To reinforce and maintain the improvement in clinical skills that could be achieved through a systematic curriculum, educators also need to reinforce these skills in the clinical setting, role-model their use, and demonstrate their value in quality patient care. Finally, research is needed to assess whether improvement in clinical skills is essential for patient care in modern-day medicine, namely, whether improved clinical skills lead to more timely diagnosis, reductions in inappropriate use of resources, and improvements in patient satisfaction.

References

1. Bordage G. Where are the history and the physical? CMAJ. 1995;152:1595-1598.
2. Mangione S, Peitzman SJ. Physical diagnosis in the 1990s: art or artifact? J Gen Intern Med. 1996;11:490-493.
3. Kern DC, Parrino TA, Korst DR. The lasting value of clinical skills. JAMA. 1985;254:70-76.
4. Holmboe ES. Faculty and the observation of trainees' clinical skills: problems and opportunities. Acad Med. 2004;79:16-22.
5. Wiener S, Nathanson M. Physical examination: frequently observed errors. JAMA. 1976;236:852-855.
6. Mangione S, Burdick WP, Peitzman SJ. Physical diagnosis skills of physicians in training: a focused assessment. Acad Emerg Med. 1995;2:622-629.
7. Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees: a comparison of diagnostic proficiency. JAMA. 1997;278:717-722.
8. Mangione S, Nieman LZ. Pulmonary auscultatory skills during training in internal medicine and family practice. Am J Respir Crit Care Med. 1999;159:1119-1124.
9. Vukanovic-Criley JM, Criley S, Warde CM, et al. Competency in cardiac examination skills in medical students, trainees, physicians, and faculty: a multicenter study. Arch Intern Med. 2006;166:610-616.
10. Johnson JE, Carpenter JL. Medical house staff performance in physical examination. Arch Intern Med. 1986;146:937-941.
11. Fred HL. Hyposkillia: deficiency of clinical skills. Tex Heart Inst J. 2005;32:255-257.
12. Mangione S, Duffy FD. The teaching of chest auscultation during primary care training: has anything changed in the 1990s? Chest. 2003;124:1430-1436.
13. Ramani S, Orlander JD, Strunin L, Barber TW. Whither bedside teaching? A focus-group study of clinical teachers. Acad Med. 2003;78:384-390.
14. Dupras DM, Li JT. Use of an objective structured clinical examination to determine clinical competence. Acad Med. 1995;70:1029-1034.
15. Kopelow ML, Schnabl GK, Hassard TH, et al. Assessing practicing physicians in two settings using standardized patients. Acad Med. 1992;67:S19-S21.
16. Ram P, van der Vleuten C, Rethans JJ, Grol R, Aretz K. Assessment of practicing family physicians: comparison of observation in a multiple-station examination using standardized patients with observation of consultations in daily practice. Acad Med. 1999;74:62-69.
17. Rethans JJ, Sturmans F, Drop R, van der Vleuten C, Hobus P. Does competence of general practitioners predict their performance? Comparison between examination setting and actual practice. BMJ. 1991;303:1377-1380.
18. Feddock CA. The lost art of clinical skills. Am J Med. 2007;120:374-378.
19. Jauhar S. The demise of the physical exam. N Engl J Med. 2006;354:548-551.
20. Sackett DL. The rational clinical examination: a primer on the precision and accuracy of the clinical examination. JAMA. 1992;267:2638-2644.

Author notes

Subha Ramani, MBBS, MMEd, MPH, is Associate Program Director, Internal Medicine Residency Program, and Associate Professor of Medicine, Boston University School of Medicine; Brandi N. Ring, MA, is a Medical Student, Boston University School of Medicine; Robert Lowe, MD, is Education Director, Division of Gastroenterology, Department of Medicine, Evans Educator for Professionalism, and Associate Professor of Medicine, Boston University School of Medicine; and David Hunter, MBBS, MSc, PhD, is Professor and ARC Future Fellow, School of Medicine, Northern Clinical School, University of Sydney, Australia.

We gratefully acknowledge the support received from the General Clinical Research Center Grant M01 RR00533 for allowing us to use their facility for the OSCE testing.

We are indebted to the residents for participating in the study, the Internal Medicine Residency Program Office staff for helping to organize the OSCE, Department of Medicine Faculty who served as the teachers and examiners, and, most importantly, our patients who enthusiastically participated in the study for the sole purpose of educating our trainees.
