Abstract
Physical exam skills of medical trainees are declining, but most residencies do not offer systematic clinical skills teaching or assessment.
Our objective was to assess knowledge of clinical signs and physical exam performance among incoming internal medicine residents.
For this study, 45 incoming residents completed a multiple choice question test to assess knowledge of clinical signs. A random selection of 20 underwent a faculty-observed objective structured clinical examination (OSCE) using patients with abnormal physical findings. Mean percentage scores were computed for the multiple choice question test, overall OSCE, and the 5 individual OSCE systems.
The mean scores were 58.4% (14.6 of 25; SD 11.5) for the multiple choice question test and 54.7% (31.7 of 58; SD 11.0) for the overall OSCE. Mean OSCE scores by system were cardiovascular 30.0%, pulmonary 69.2%, abdominal 61.6%, neurologic 67.0%, and musculoskeletal 41.7%. Analysis of variance showed a difference in OSCE system scores (P < .001), with cardiovascular and musculoskeletal scores significantly lower than those of the other systems.
Overall, the physical exam knowledge and performance of new residents were unsatisfactory. There appears to be a pressing need for additional clinical skills training during medical school and residency, and we are planning a new clinical skills curriculum to address this deficiency.
Editor's Note: The online version of this article includes the sample questions for the multiple choice test and the faculty scoring sheet for the OSCE for the cardiac exam station.
Background
Physical examination skills have traditionally been viewed as among the most valuable skills taught during medical education,1–4 contributing to more cost-effective use of diagnostic services while rewarding physicians with the excitement and satisfaction of making a diagnosis using their own knowledge and skills.1,2 These skills also increase direct contact with patients, and the therapeutic value of the human touch is impossible to quantify.2
Several investigators have reported an overall decline in the clinical skills of medical students and residents.5–10 Residents are less well prepared to take an adequate medical history, perform a reliable physical examination, and communicate effectively with patients,11 and they rely on ordering tests without always knowing how to interpret them.1,11 In an era of increasing health care costs, we need to reconsider the importance of physical examination skills.1,2
Despite documented deficiencies in clinical skills, medical school and residency curricula do not emphasize clinical skills teaching or assessment.3,7,12 Reported barriers to teaching clinical exam skills include a scarcity of good teaching patients, lack of time for teaching at the bedside, an over-reliance on technology, and a shortage of skilled faculty to impart this knowledge.11,13
Before developing a new clinical skills curriculum for our internal medicine residency program, we wished to explore the physical exam skills of our incoming residents as a needs assessment.
Our study objectives were:
1. To investigate the knowledge of clinical signs as well as the physical exam skills of new postgraduate year 1 (PGY-1) residents using volunteer patients during an objective structured clinical examination (OSCE)
2. To explore system-specific strengths and weaknesses in their physical exam skills
Methods
Setting and Participants
Incoming internal medicine PGY-1 residents at Boston University School of Medicine completed a written multiple choice question (MCQ) test on clinical signs and a randomly selected subsample completed a physical exam assessment during their residency orientation in June 2006. These tests were designed as a pretest prior to implementation of a new clinical skills curriculum. The protocol for this study was approved by the Institutional Review Board at Boston University School of Medicine.
A planning committee consisting of generalist and subspecialist faculty from the Department of Medicine discussed and finalized the MCQ test questions and the OSCE scoring sheets after reviewing questions from the Membership of the Royal College of Physicians and other examinations, and after detailed discussion of the essential elements of system-specific physical examination. This core group of faculty consistently taught residents bedside clinical skills, had a reputation as skilled clinical diagnosticians, and served as preceptors for the OSCE.
The written test consisted of 25 MCQs designed to evaluate the residents' ability to interpret and diagnose physical exam findings. All 5 major systems (cardiovascular, neurologic, pulmonary, gastrointestinal, and musculoskeletal) were represented.
The OSCE used volunteer patients with abnormal physical findings recruited from the medical wards and clinics. The 5 stations covered the cardiac, pulmonary, abdominal, neurologic, and musculoskeletal systems. At each 10-minute station, a faculty examiner instructed the resident to perform a focused physical exam, completed a scoring sheet, and provided feedback. All faculty preceptors underwent an orientation to the procedures for observation, feedback, and scoring during the OSCE. A sample score sheet is included in Appendix B. Each item on the scoring sheet was graded on a 3-point rating scale from 0 to 2 points: 2 points were awarded if the element was performed correctly, 1 point if the element was performed with room for improvement, and no points if the exam was omitted. The OSCE encompassed 5 to 6 elements on exam technique and 1 to 2 elements on interpretation and diagnosis, for a maximum total score of 58 points across the 5 stations.
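As an arithmetic illustration of this rubric, the sketch below shows how element-level ratings of 0 to 2 roll up into station and overall percentage scores. The element counts and ratings are hypothetical placeholders (chosen so the overall maximum comes to 58 points), not data from the study.

```python
# Illustrative sketch of the OSCE scoring arithmetic described above.
# All element counts and ratings are hypothetical placeholders, not study data.
# Each element is rated 0 (omitted), 1 (performed but needs improvement),
# or 2 (performed correctly).

elements_per_station = {
    "cardiac": 6,
    "pulmonary": 6,
    "abdominal": 6,
    "neurologic": 6,
    "musculoskeletal": 5,
}
max_total = 2 * sum(elements_per_station.values())  # 58 points overall

# Hypothetical element ratings for one resident
ratings = {
    "cardiac": [2, 1, 0, 1, 1, 0],
    "pulmonary": [2, 2, 1, 2, 1, 2],
    "abdominal": [2, 1, 1, 2, 1, 1],
    "neurologic": [2, 2, 1, 1, 2, 1],
    "musculoskeletal": [1, 0, 1, 1, 1],
}

# Per-station percentage = points earned / points possible at that station
for station, element_scores in ratings.items():
    pct = 100 * sum(element_scores) / (2 * len(element_scores))
    print(f"{station}: {pct:.1f}%")

# Overall percentage = total points earned / 58-point maximum
overall_pct = 100 * sum(sum(s) for s in ratings.values()) / max_total
print(f"overall: {overall_pct:.1f}% of a possible {max_total} points")
```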
Statistical Analysis
Descriptive analysis was performed on the data collected for the PGY-1 cohort, and we calculated mean percentage scores for the MCQ test and the OSCE. The OSCE scores were then analyzed by individual organ system to examine whether physical exam performance differed across systems. Scores for each system were ranked and then compared using analysis of variance; if a difference was detected, we explored where the differences lay by pairwise comparison using Tukey minimum significant differences. All analyses were run at the α = .05 level using SAS version 9.0 (SAS Inc, Cary, NC) and Excel XP (Microsoft, Redmond, WA).
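For readers who wish to reproduce a similar comparison, the following is a minimal sketch of a one-way analysis of variance followed by Tukey pairwise comparisons at α = .05, written in Python with scipy and statsmodels rather than the SAS used in our study. The per-resident scores are randomly generated placeholders (centered on the observed system means), and the sketch compares raw rather than ranked scores for simplicity.

```python
# Minimal sketch of the ANOVA-plus-Tukey comparison described above.
# The scores are randomly generated placeholders, not study data; the
# actual analysis was run in SAS 9.0.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)

systems = ["cardiovascular", "pulmonary", "abdominal", "neurologic", "musculoskeletal"]
illustrative_means = [30.0, 69.2, 61.6, 67.0, 41.7]  # observed system mean percentages

# Simulated percentage scores for 20 residents on each system
scores = {s: rng.normal(m, 11.0, size=20) for s, m in zip(systems, illustrative_means)}

# One-way ANOVA: do mean scores differ across the 5 systems?
f_stat, p_value = f_oneway(*scores.values())
print(f"ANOVA: F = {f_stat:.2f}, P = {p_value:.4f}")

# If a difference is detected, locate it with Tukey pairwise comparisons at alpha = .05
if p_value < 0.05:
    all_scores = np.concatenate(list(scores.values()))
    labels = np.repeat(systems, 20)
    print(pairwise_tukeyhsd(all_scores, labels, alpha=0.05))
```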
Results
A total of 45 internal medicine PGY-1 residents at Boston University School of Medicine completed the MCQ test and 20 completed the 5-station physical exam OSCE. Most were US medical graduates from several different medical schools; 2 were international medical graduates.
The overall mean score for the written test was 58.4%, with a standard deviation of 11.5 and a range of 36.0 to 80.0 (n = 45). There was no statistically significant difference between the MCQ scores of residents selected to undergo the OSCE and those not selected (P = .261).
The mean overall OSCE score was 54.7%, with a standard deviation of 11.0 and a range of 39.7 to 84.5 (n = 20). Analysis of variance showed significant differences in the OSCE scores for individual systems (P < .05), with the cardiovascular and musculoskeletal examination scores significantly lower than the pulmonary, neurologic, and abdominal examination scores. The overall MCQ, OSCE, and individual system scores are shown in the table.
Examples of errors observed in the residents' physical exams include:
1. Faulty exam technique
a. Not using the bell and diaphragm of the stethoscope
b. Not eliciting shifting dullness correctly
2. Lack of a systematic exam
a. Skipping inspection or palpation completely
b. Not following a stepwise exam (eg, motor strength, tone, reflexes, gait)
3. Failure to identify findings
a. Not identifying a diastolic murmur
b. Not identifying bronchial breath sounds
4. Failure to interpret findings and make a diagnosis
a. Not differentiating between upper and lower motor neuron signs
5. Difficulty formulating a differential diagnosis for a given finding
a. Causes of ascites
b. Causes of knee effusions
Discussion
The newly graduated medical students entering internal medicine residency in our study scored less than 60% on average on a knowledge test of clinical signs and on a physical exam OSCE. Errors were noted in physical exam technique as well as in diagnosis. Many studies have reported less than satisfactory physical exam skills among trainees. Dupras and Li14 found mean scores of 50 ± 11 in the physical examination and diagnosis stations of an OSCE assessing postgraduate year 2 internal medicine residents. Vukanovic-Criley and colleagues9 reported that cardiac exam skills improved between years 1 and 2 of medical school and reached a plateau thereafter.
The variability of exam performance between systems may reflect differing exposure during medical school clinical rotations as well as differing interest in a given field. However, given how far the cardiovascular and musculoskeletal scores fell below those of the other systems, we surmise that there are deficiencies in the teaching of both.
The strengths of this study include the use of trained faculty members to observe and score the OSCE, allowing for more consistent scoring, and the use of real patients with abnormal physical examination findings rather than standardized patients with scripted histories and answers. This is a more accurate representation of the kinds of encounters that medical trainees will face in clinical practice. The OSCE also served as a learning tool, as participants received immediate feedback from faculty on their physical examination skills.
Limitations
Our study has some limitations. We tried to limit variability in scoring through faculty orientation and the use of objective, easily observable physical exam behaviors, but different faculty observers may exhibit variable leniency in scoring. There also may have been variability in the degree of difficulty between patients at different stations. We present cross-sectional data that may reflect many different factors, including prior medical school training and interest. We hope to ascertain in subsequent studies the trajectory of knowledge and skills following implementation of a clinical skills curriculum. We did not set out to test the psychometric properties of our assessment instrument, although we tried to address content validity to ensure that the instrument measures what we intended it to measure; future studies need to test instruments that can assess physical diagnosis skills both precisely and reliably.
Conclusions
There is a strong indication that additional clinical skills training and assessment are needed during medical school as well as residency training. Standardized patient examinations should supplement, but not replace, direct observation of trainees, as only real patient interactions can help educators document longitudinal growth of trainees' clinical skills.14 Faculty development is also crucial, as many clinical faculty do not feel confident in their own clinical skills.13 Questions have been raised about the validity of standardized patient encounters alone in assessing higher-level trainees, as they cannot assess diagnostic skills.15–17
Although many educators espouse the value of clinical skills,1–3,18 some skeptics have questioned whether these skills are anachronistic.19 Future studies will need to investigate whether a formal curriculum will improve and sustain the clinical skills of residents. The predictive values of many physical examination techniques and findings have been evaluated,2,18 and the JAMA Rational Clinical Examination series is the foremost example of evidence-based physical diagnosis.20 It is vital that clinical educators continue to study the clinical utility of physical findings and discard signs or maneuvers of little value; modern technology can be very helpful in achieving these goals.
Our pilot study examined the status of physical exam skills in a sample of newly graduated medical students. It is one step in a comprehensive effort to characterize the physical examination deficiencies of our residents and to understand their underlying causes. Ideally, before the findings are used in the design of a clinical skills curriculum, a larger group of residents should be studied with precurriculum and postcurriculum assessments, and the reliability and validity of instruments to assess physical diagnosis skills should be tested to determine whether previously validated OSCE assessments would be useful. To reinforce and maintain the improvement in clinical skills that could be achieved through a systematic curriculum, educators also need to reinforce these skills in the clinical setting, role-model their use, and demonstrate their value in quality patient care. Finally, research is needed to assess whether improvement in clinical skills is essential for patient care in modern-day medicine, namely, whether improved clinical skills lead to more timely diagnosis, reductions in inappropriate use of resources, and improvements in patient satisfaction.
References
Author notes
Subha Ramani, MBBS, MMEd, MPH, is Associate Program Director, Internal Medicine Residency Program, and Associate Professor of Medicine, Boston University School of Medicine; Brandi N. Ring, MA, is Medical Student, Boston University School of Medicine; Robert Lowe, MD, is Education Director, Division of Gastroenterology, Department of Medicine, Evans Educator for Professionalism, and Associate Professor of Medicine, Boston University School of Medicine; and David Hunter, MBBS, MSc, PhD, is Professor and ARC Future Fellow, School of Medicine, Northern Clinical School, University of Sydney, Australia.
We gratefully acknowledge the support received from the General Clinical Research Center Grant M01 RR00533 for allowing us to use their facility for the OSCE testing.
We are indebted to the residents for participating in the study, the Internal Medicine Residency Program Office staff for helping to organize the OSCE, Department of Medicine Faculty who served as the teachers and examiners, and, most importantly, our patients who enthusiastically participated in the study for the sole purpose of educating our trainees.