Abstract
Background
Resident engagement in quality improvement is a requirement for graduate medical education, but the optimal means of instruction and evaluation of resident progress remain unknown.
Objective
To determine the accuracy of self-reported chart audits in measuring resident adherence to primary care clinical practice guidelines.
Methods
During the 2010–2011 academic year, second- and third-year internal medicine residents at a single, university hospital–based program performed chart audits on 10 patients from their primary care clinic to determine adherence to 16 US Preventive Services Task Force primary care guidelines. We compared residents' responses to independent audits of randomly selected patient charts by a single external reviewer.
Results
Self-reported data were collected by 18 second-year and 15 third-year residents for 330 patients. Independently, 70 patient charts were randomly selected for review by an external auditor. Overall guideline compliance was significantly higher on self-reported audits than on external audits (82% versus 68%, P < .001). Of the 16 guidelines, external audits found significantly lower rates of adherence for 5 (tetanus vaccination, osteoporosis screening, colon cancer screening, cholesterol screening, and obesity screening). Chlamydia screening was more common in externally audited charts than in self-reported data. Although third-year residents self-reported higher guideline adherence than second-year residents (86% versus 78%, P < .001), external audits found lower overall adherence for third-year residents (64% versus 72%, P = .040).
Conclusions
Residents' self-reported chart audits may significantly overestimate guideline adherence. Increased supervision and independent review appear necessary to accurately evaluate resident performance.
What was known
Residents' adherence to prevention and screening guidelines for ambulatory care is an important area for ambulatory quality improvement (QI).
What is new
Resident self-review and reporting overstated adherence to guidelines, compared to external chart review.
Limitations
The single-institution, single-specialty design and the potential for sampling bias limit the generalizability of the findings.
Bottom line
Ongoing external review and improving resident education about chart review could enhance the reliability and utility of chart self-audits as a QI tool.
Editor's Note: The online version of this article contains preventive care guidelines used for the study.
Introduction
Adherence to national health care guidelines may be as low as 50%,1 and initiatives to increase compliance could save $150 billion in health care spending annually.2 The Institute of Medicine recommends that quality improvement (QI) instruction begin in residency and continue throughout practice,3 and the Accreditation Council for Graduate Medical Education identified a QI curriculum as a residency program requirement across all specialties.4
How best to incorporate QI instruction remains undetermined. Chart audits have proven effective at improving resident adherence to primary care guidelines5,6; however, recruiting an appropriate auditor can be difficult and resource intensive.7 Previous investigators have advocated having residents audit their own charts to increase investment in quality patient care, but the impact on audit validity is undefined.8 Self-reported questionnaires and face-to-face interviews have demonstrated that both residents and attending physicians overstate clinical performance compared to independent chart review.9–11 Although our program hoped to minimize bias by anchoring reported performance to specific patients, the validity of these self-audits remains unknown. Residents could bias their results by preferentially selecting (either consciously or unconsciously) patients who received guideline-adherent care.
The purpose of our study was to determine whether resident-selected chart audits are a reliable means of assessing patient care performance. We compared adherence to primary care guidelines determined from charts that residents selected and reviewed from their own patient panels (self-audits) with adherence determined from charts randomly selected and reviewed by an external reviewer (external audits).
Methods
Study Design
The Penn State Hershey Medical Center Internal Medicine Residency Training Program recruits 20 categorical residents per year, with each resident attending primary care clinic 1 half-day per week. During the 2010–2011 academic year, clinic records consisted of electronic physician notes as well as a paper record documenting patient problems, medications, and vaccinations. Our curriculum includes periodic audits by resident physicians of their own primary care patient charts. Over a 6-month period, residents were instructed to select 10 patient charts to review and to report guideline adherence based on a subset of 16 US Preventive Services Task Force (USPSTF) guidelines (provided as online supplemental material).12 There were no specific instructions on how charts should be selected. Residents' chart audit results were submitted to attending preceptors, and residents were charged with creating a QI plan to improve their performance on subsequent reviews. In presenting this assignment, emphasis was placed on the creation of a QI plan rather than on strict adherence to guidelines. Although residents received instruction in the USPSTF guidelines, there was no specific training in their application to this initiative.
After receiving approval from the Institutional Review Board, we randomly selected patient charts of second- and third-year residents for independent audit during the 2010–2011 academic year. We excluded first-year residents, who had not yet established care with enough patients to assess adherence. The external auditor (K.T.) was a medical student instructed in the USPSTF guidelines and in the use of the electronic health record.
Data Analysis
Self-audits were available from 18 second-year and 15 third-year residents (n = 330 charts). Patient panel lists were available from 7 second-year and 7 third-year residents. Five patient charts were randomly selected from each list for external audit (n = 70 charts). Externally audited charts were not matched to self-audited charts. Based on sample size estimations, this provided sufficient power to detect an approximately 15% difference in individual guideline adherence. For both self-reported and external audits, patients were classified as receiving guideline-adherent care, not receiving guideline-adherent care, or not meeting inclusion criteria. We used Microsoft Excel 2010 for our analysis. Because of the number of comparisons made between groups, as well as the small number of anticipated charts for several guidelines, the Fisher exact test was used as a conservative test of statistical significance.
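As a minimal sketch of this comparison (assuming Python with SciPy rather than the Excel workflow actually used in the study), the Fisher exact test can be applied to a 2 × 2 table of adherent versus nonadherent counts; the counts below are the second- and third-year self-audit totals reported in the Results.

```python
# Minimal sketch: Fisher exact test on guideline-adherence counts.
# Assumes Python with SciPy; the study's own analysis was performed in Excel 2010.
from scipy.stats import fisher_exact

# 2 x 2 table: rows are resident classes, columns are [adherent, nonadherent],
# using the self-audit totals reported in the Results.
second_year = [1502, 1921 - 1502]  # 78% self-reported adherence
third_year = [1261, 1466 - 1261]   # 86% self-reported adherence

odds_ratio, p_value = fisher_exact([second_year, third_year])
print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.2g}")  # reproduces P < .001
```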
Results
Overall compliance by self-audit was 82% (2763 of 3387). Third-year residents reported significantly higher guideline compliance than second-year residents (86% versus 78%, P < .001). Self-audited compliance with individual guidelines ranged from 16% (15 of 96) for abdominal aortic aneurysm screening to 98% (316 of 322) for screening for tobacco use. Detailed results are shown in table 1.
In contrast, overall guideline compliance by external audit was 68%. Compliance with individual guidelines ranged from 0% (0 of 18) for herpes zoster vaccination to 100% (70 of 70) for blood pressure measurement (table 1). Second-year residents' externally audited guideline compliance was significantly higher than that of third-year residents (72% versus 64%, P = .040).
Residents' self-reported guideline adherence was significantly higher than adherence from external audit (82% versus 68%, P < .001). Guideline-specific results are listed in table 1. External audits found significantly lower rates of adherence for 5 of 16 guidelines. Only chlamydia screening was found to be significantly higher in external audits than in self-reported data.
The difference between self-reported and externally audited performance was smaller for second-year residents than for third-year residents (table 2). Overall, second-year residents reported adherence to 78% (1502 of 1921) of guidelines, whereas third-year residents' self-reported adherence was 86% (1261 of 1466, P < .001). In contrast, externally audited charts showed significantly lower overall adherence for third-year residents (64% versus 72%, P = .040), and we did not find a significant difference in adherence between second- and third-year residents for any individual guideline.
Discussion
We found higher guideline adherence reported by self-audit than by external audit. In addition, third-year residents' self-audits reported higher guideline adherence than second-year residents' self-audits, despite externally audited adherence that was lower for third-year residents. Previous studies have shown that physicians significantly overestimate guideline compliance on self-reported questionnaires and in interviews when compared with chart review.13 However, to our knowledge, this is the first study comparing chart audits that residents selected and performed themselves with randomly selected external audits. Overall, resident adherence to preventive care guidelines, estimated at 68% by the external reviewer, was comparable to, or slightly higher than, previously published US patient data.2,14–16
These differences between self-audit and external audit could result from differences in audit methodology or from selection bias. Although residents and the external reviewer had equal access to the USPSTF guidelines, interpretation of these guidelines may have varied because of the lack of initiative-specific instruction. It is also possible that residents did not rely solely on recorded data but also drew on personal recollections of office visits not captured in the chart. Specific training in chart auditing and in the USPSTF guidelines should be provided to residents in validation studies. Standardization of chart selection could also limit future selection bias.
Adherence to individual guidelines varied widely and may correlate with clinic design. Guidelines that were part of routine screening at every visit, such as hypertension screening, were areas of high compliance in both self-reported and externally audited data. One-time or widely spaced interventions (eg, tetanus vaccination) were performed at lower frequencies, suggesting that changes in clinic workflow could affect resident compliance with guidelines. No externally audited charts were compliant for herpes zoster vaccination (0 of 18), likely due to vaccine shortage.17 These findings reflect low vaccination rates nationwide.18
Only chlamydia screening had higher guideline adherence in external audits than in self-audits. This may be due to differences between residents and the external auditor in determining which patients should be screened. Only 16 of 70 externally audited charts (23%) were identified as meeting criteria for screening, compared with 112 of 330 self-audited charts (34%). The 2007 USPSTF guideline recommends screening for sexually active women aged 24 years or younger and recommends against screening for older women not at increased risk.19 Increased risk is defined by criteria validated in a 1996 study by Scholes et al20 and may have been overestimated by resident physicians.
Third-year residents may have been more likely to select patients with higher rates of guideline-concordant care. This could reflect increased pressure to perform as residents approach the end of their training and prepare to assume the responsibilities of unsupervised care delivery. Alternatively, third-year residents, managing higher patient volumes, may be less vigilant in documenting patient care, marking preventive services as up to date in their self-audits that the external review then identified as missing. Previous investigations have found that second-year residents are more likely to self-report suboptimal patient care practices,21 yet there have been few objective comparisons of primary care performance between second- and third-year residents. The discrepancies could also reflect differences between specific resident cohorts and may not be generalizable; a longitudinal study following residents over time could address this question. Previous investigators22 also found the second year of internal medicine residency an ideal time to initiate QI projects, owing to the time, energy, and potential for follow-through. These findings are mirrored in our study, as the higher accuracy of second-year residents may represent greater investment in the project.
Limitations of our study include the small sample size and single-center design. Selection bias may also exist because of the limited availability of resident patient panel lists and incomplete sampling. The findings of the external audit were also limited by the available documentation in the written and electronic chart; incomplete or absent documentation could yield false reports of guideline noncompliance. Although this study was not designed to distinguish resident overstatement through biased chart selection from inaccurate auditing, further investigation of these sources of bias could improve audit reliability and curriculum design.
Conclusion
Self-audits may significantly overestimate guideline adherence in resident QI projects when compared with external audits. These differences may increase with further residency training and could hinder accurate identification of problem areas for future improvement efforts. Ongoing external review of a random sample of charts, or increased training in chart review techniques, could enhance the utility of resident chart audits as a curricular tool.
References
Author notes
Ethan F. Kuperman, MD, MSc, is Clinical Assistant Professor, Department of Internal Medicine, University of Iowa; Kristen Tobin, MD, is Resident Physician, Department of Internal Medicine, Loyola University Chicago; and Jennifer L. Kraschnewski, MD, is Assistant Professor, Department of Internal Medicine, Penn State College of Medicine.
Funding: The authors report no external funding source for this study.
Conflict of interest: The authors declare they have no competing interests.