Abstract
Background
Approaches for teaching neurology documentation include didactic lectures, workshops, and face-to-face meetings. Few studies have assessed their effectiveness.
Objective
To improve the quality of neurology resident documentation through payroll simulation.
Methods
A documentation checklist was created based on Medicare and Medicaid evaluation and management (E/M) guidelines. In the preintervention phase, neurology follow-up clinic charts were reviewed over a 16-week period by evaluators blinded to the notes' authors. The current E/M level, ideal E/M level, and financial loss were calculated by the evaluators. The ideal E/M level was defined as the highest billable level based on the documented problems, with a supporting history and examination. We implemented an educational intervention that consisted of a 1-hour didactic lecture, followed by e-mail feedback “paystubs” every 2 weeks detailing the number of patients seen, income generated, income lost, and areas for improvement. Follow-up charts were assessed in a similar fashion over a 16-week postintervention period.
Results
Ten of 11 residents (91%) participated. Of 214 charts that were reviewed preintervention, 114 (53%) had insufficient documentation to support the ideal E/M level, leading to a financial loss of 24% ($5,800). Inadequate documentation was seen in all 3 components: history (47%), examination (27%), and medical decision making (37%). Underdocumentation did not differ across residency years. Postintervention, underdocumentation was reduced to 14% of 273 visits (P < .001), with a reduction in the financial loss to 6% ($1,880).
Conclusions
Improved documentation and increased potential reimbursement were attained following a didactic lecture and a 16-week period in which individual, specific feedback was provided to neurology residents.
What was known and gap
Good documentation is instrumental for quality of care and appropriate reimbursement, yet residents do not appear to receive adequate education in this area.
What is new
An educational intervention using a lecture and “paystub” feedback that quantifies the financial impact of neurology residents' documentation practices.
Limitations
A single-institution, single-specialty intervention limits generalizability.
Bottom line
A 1-hour lecture and individual feedback were effective in improving residents' documentation and increasing potential billing.
Introduction
Documentation serves as the primary communication tool among clinicians and is also the basis for health insurance reimbursement. Good documentation leads to higher-quality patient care and is essential to a practice's financial health.1–3 Furthermore, studies from multiple specialties report inadequate resident documentation and suggest a need for additional education.4–7 A survey of 6 academic programs reported that only 31% of trainees received education on billing level differences, and 21% were informed of the financial consequences of poor documentation.8 A national, multispecialty panel of residents and fellows felt that the current training system fell short of adequately preparing them for practice and that additional training was needed in the billing and legal aspects of practice.9
Coding and billing are complex. New and follow-up outpatient visits are reimbursed based on evaluation and management (E/M) codes 99201 to 99205 and 99211 to 99215, respectively. Neurology practices often follow the 1997 E/M guidelines, which acknowledge subspecialty examinations.10 Underbilling for services provided results in lost revenue; overbilling can lead to costly audits,2 termination of insurance participation, and investigation under the False Claims Act, with civil penalties of $5,000 to $10,000 per incident.3,11,12
Evaluation and management checklists have previously been used to assess financial losses from underbilling.4 We incorporated E/M checklists into an educational intervention that simulated a real-life payroll, with the goals of improving resident documentation and minimizing financial losses.
Methods
This study was conducted in a neurology resident continuity clinic utilizing the Epic electronic health record (EHR) system (Epic Systems, Verona, WI). Participants were all residents in the program except 1 senior resident, the study investigator (n = 10). Residents attended clinic 1 half-day per week, with patient load increasing by training year. Only follow-up visits were included in the study. To maintain blinding and anonymity, the program coordinator assigned each participant a secret identifier, deidentified charts before distributing them to raters, and served as the communication link between raters and participants.
Checklist
A single-page checklist (provided as online supplemental material) was created using Centers for Medicare & Medicaid Services (CMS) documentation guidelines.10,13 To establish interrater reliability, 3 reviewers independently evaluated 10 random notes, as previously described.4
The level of service was determined by 3 main components: history, examination, and medical decision making (MDM). History had 3 subcomponents: history of present illness (HPI), review of systems (ROS), and past history (PHx). Examination had a maximum of 23 points. MDM had 3 subcomponents: number of diagnoses and management options (DxMgt), complexity of data, and risk of morbidity and mortality. The MDM level set the highest billable E/M level and had to be matched by either the history or the examination level.
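This rule can be read as a simple computation. The following minimal Python sketch is an illustration only, not the study's instrument: it assumes each component's level is already encoded as an integer from 1 to 5 corresponding to codes 99211 to 99215, and it omits the CMS tables that map checklist items onto those component levels.

```python
def supported_em_level(history: int, exam: int, mdm: int) -> int:
    """E/M level (1-5) a follow-up note can support under the rule above:
    MDM caps the billable level, and either the history or the examination
    level must meet or exceed it."""
    return min(mdm, max(history, exam))


def em_code(level: int) -> str:
    """Map a 1-5 level to the established-patient CPT code (99211-99215)."""
    return f"9921{level}"


# Example: moderate-complexity MDM (level 4) supported by a detailed history
# (level 4) but only a brief examination (level 2) still supports 99214.
print(em_code(supported_em_level(history=4, exam=2, mdm=4)))  # -> 99214
```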
Preintervention
All preintervention follow-up notes (n = 214) were evaluated over a 16-week period to determine the current E/M level supportable by the documentation, the ideal E/M level, and the financial loss (the difference in reimbursement between the ideal and current E/M levels). No feedback was given during this period. Monetary values were assigned based on the 2014 CMS physician fee schedule.14
The ideal E/M level was defined as the highest billable level for the presenting set of problems in the absence of documentation deficiencies. To determine the ideal E/M level, raters reviewed the MDM, which was considered optimized if (1) all visit problems were assessed with a management plan; (2) all unstable or improved problems were stated as such; (3) all medication adjustments or refills were documented; and (4) all problems that posed a threat to life or body function had the concern clearly stated or implied. After the MDM was optimized, raters checked whether the history or examination level met or exceeded the MDM level: if so, the note was at the ideal E/M level; if not, raters assessed the history subcomponents and the examination for areas of improvement.
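To make the financial-loss calculation concrete, the sketch below compares reimbursement for the current and ideal E/M codes across a set of charts. The dollar figures in the FEES table are hypothetical placeholders, not the actual 2014 CMS physician fee schedule amounts.

```python
# Fee values are hypothetical placeholders, not the 2014 CMS fee schedule.
FEES = {"99212": 45.00, "99213": 75.00, "99214": 110.00, "99215": 145.00}


def chart_loss(current_code: str, ideal_code: str) -> float:
    """Reimbursement lost on one chart: ideal minus currently supportable."""
    return max(FEES[ideal_code] - FEES[current_code], 0.0)


def cohort_loss(charts: list) -> tuple:
    """Total dollars lost and loss as a fraction of ideal revenue.

    Each chart is a (current_code, ideal_code) pair."""
    lost = sum(chart_loss(cur, ideal) for cur, ideal in charts)
    ideal_total = sum(FEES[ideal] for _, ideal in charts)
    return lost, (lost / ideal_total if ideal_total else 0.0)


# Example: one chart underdocumented by one level, one already at its ideal level.
dollars, fraction = cohort_loss([("99213", "99214"), ("99214", "99214")])
print(f"${dollars:.2f} lost ({fraction:.0%} of ideal revenue)")
```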
Intervention
The aggregate 16-week preintervention data for all participants were presented (figure 1), followed by a 1-hour lecture on documentation requirements, the financial implications of underbilling, and the legal aspects of overbilling.2,4,12
Over the following 16 weeks, all postintervention follow-up notes (n = 273) were assessed in a similar fashion. To simulate a real-life payroll model, feedback reports (“paystubs”) were generated every 2 weeks by the raters and distributed via e-mail by the program coordinator, showing the number of patients seen, the income generated, the potential income lost, and the areas of deficiency. An updated performance summary was displayed every 2 weeks for 5 minutes before the start of a scheduled lecture (provided as online supplemental material). Residents were not required to read the e-mails or to attend the performance summary viewing. At the conclusion of the study, all participants were given an anonymous survey to rate their documentation knowledge at baseline, immediately postlecture, and at the conclusion of the training program (provided as online supplemental material).
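As one illustration of how such a report could be assembled from the chart audits, the Python sketch below aggregates the 4 paystub fields for a single resident. The field names, resident identifier, and dollar figures are hypothetical; the study's actual report format appears in the online supplemental material.

```python
from dataclasses import dataclass


@dataclass
class ChartAudit:
    current_fee: float   # reimbursement supported by the note as written
    ideal_fee: float     # reimbursement at the ideal E/M level
    deficiency: str      # e.g., "history", "examination", "MDM", or "none"


def paystub(resident_id: str, audits: list) -> str:
    """Assemble a biweekly feedback summary from a resident's audited charts."""
    earned = sum(a.current_fee for a in audits)
    lost = sum(a.ideal_fee - a.current_fee for a in audits)
    gaps = sorted({a.deficiency for a in audits if a.deficiency != "none"})
    return (
        f"Paystub for {resident_id}\n"
        f"  Patients seen:    {len(audits)}\n"
        f"  Income generated: ${earned:.2f}\n"
        f"  Potential loss:   ${lost:.2f}\n"
        f"  Areas to improve: {', '.join(gaps) if gaps else 'none'}"
    )


# Example with 2 audited charts (placeholder dollar figures).
print(paystub("Resident 07", [
    ChartAudit(current_fee=75.00, ideal_fee=110.00, deficiency="history"),
    ChartAudit(current_fee=110.00, ideal_fee=110.00, deficiency="none"),
]))
```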
This study was part of a departmental quality improvement initiative with Institutional Review Board exemption.
Statistical Analysis
Statistical analysis was performed with Stata version 13.2 software (StataCorp LP, College Station, TX). All first-year residents were grouped together for analysis because they examined fewer patients per week than upper-level residents. For nominal data, chi-square tests were used to evaluate pretest versus posttest differences. For ordinal, nonnormally distributed data, Mann-Whitney tests were used to assess pretest versus posttest differences. For multiple, correlated response judgments, chi-square tests with Sidak corrections for multiple comparisons were used.
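For illustration, the sketch below reproduces the 2 main comparison types in Python with scipy rather than Stata. It is a minimal demonstration only: the contingency table uses the pre/post underdocumentation counts reported under Results (114 of 214 charts versus 37 of 273 charts), while the ordinal E/M level lists are invented for the example.

```python
from scipy.stats import chi2_contingency, mannwhitneyu

# Chi-square on underdocumentation rates, pre vs post.
table = [[114, 37],      # underdocumented charts (pre, post)
         [100, 236]]     # adequately documented charts (pre, post)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, P = {p:.3g}")

# Mann-Whitney U on ordinal E/M levels per visit (invented demonstration data).
pre_levels = [2, 3, 3, 3, 4, 2, 3]
post_levels = [3, 4, 4, 3, 4, 4, 5]
u, p = mannwhitneyu(pre_levels, post_levels, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, P = {p:.3g}")
```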
Results
This program required 3 hours of combined rater time per week to audit and generate feedback for 13 to 17 notes. The program coordinator spent 1 hour every 2 weeks printing and deidentifying notes. There was good interrater reliability for determining E/M levels using the checklist (κ = 0.93).
Preintervention
A total of 114 charts (53%) did not meet the ideal E/M level, leading to a 24% ($5,800) financial loss (table). Inadequate documentation was evident in all major components: 47% history, 27% examination, and 37% MDM. The average number of problems addressed was 1.5. Documentation did not differ among training levels (P = .81). Compared with national CMS neurology data, our documentation supported an excess of levels 2 and 3, a shortage of level 4, and an absence of level 5 (figure 1).15
Postintervention
There was improvement in all main components of documentation, with 37 charts (14%) not meeting the ideal E/M level, resulting in a 6% financial loss ($1,880, P < .001). Documentation improved immediately after the didactic lecture, continued to improve with feedback, and plateaued after 1 month (figure 2). The average number of problems addressed per note increased (P < .001), and our documentation supported E/M levels that mirrored national norms (figure 1).
Program Evaluation
The majority of residents reported low baseline documentation knowledge and improved knowledge immediately postlecture, and felt “well-versed” at project conclusion (provided as online supplemental material). All reported voluntarily reading the feedback e-mails.
Discussion
In this single-center study, we provided neurology residents with feedback designed to resemble the periodic feedback physicians receive from internal auditors in practice. We improved note documentation to support appropriate billing levels for the residents' work and the patients' diagnoses. Residents reported poor baseline knowledge of this subject and felt competent at the conclusion of training. Estimated financial losses dropped significantly, and billing levels mirrored national neurology norms.
Prior studies have demonstrated similarly positive results but required significant time commitments. One neurology department developed a 6-lecture curriculum, dictation templates, standardized review of systems forms, and face-to-face feedback meetings.16 One medicine department ran a 3-year program of monthly 1-hour workshops.17 Another department simulated “virtual practice” by creating twelve 30-minute didactic modules and incorporated continuous feedback with individual and group productivity reports.18
Our training program required time allocation for the raters and the program coordinator, but it was less time consuming for trainees than the interventions reported in other studies. After the lecture and 2 feedback reports, resident documentation significantly improved and then plateaued, suggesting that a shorter intervention period would be adequate. High resident engagement was evident from improved documentation and self-reported voluntary review of the e-mails. Although not formally assessed, the intervention was viewed as favorable and beneficial.
There are several limitations to this study. We are a small residency program in a single specialty, which may limit generalizability. Because of the small size, there was no control group. It is possible that improvement occurred because of ongoing learning during residency; however, no such improvement was noted during the preintervention period. Notably, patient encounters were not observed by the raters, so it is possible that residents documented more work than they performed to garner a higher E/M level. We educated participants on the negative aspects of overdocumentation during the didactic lecture, but raters did not formally assess for it. Anecdotally, elements of the residents' notes suggested a conscious effort to document clinically relevant information while satisfying financial requirements.
A 1-year follow-up study to assess sustainability of the intervention when feedback ceases would be informative. Future studies could include replication in a larger training program and adaptation to other specialties.
Conclusion
A 1-hour lecture followed by individualized feedback reports every 2 weeks improved neurology residents' documentation significantly and increased potential billing. This educational intervention is 1 option to improve neurology residents' documentation.
References
Author notes
Funding: The authors report no external funding source for this study.
Competing Interests
Conflict of interest: The authors declare they have no competing interests.
This project was presented as a platform speech during the 67th American Academy of Neurology Annual Meeting, Washington, DC, April 18–25, 2015.
The authors would like to thank David Lucido, PhD, for his statistical assistance; Weiyi Gao, MD, for her checklist testing; and Jean Peng, MPH, for project coordination.
Editor's Note: The online version of this article contains the documentation checklist, the percentage of financial loss among trainees, and the trainee self-reported knowledge list.