ABSTRACT
Background
Secure messages exchanged between patients and family medicine residents via an electronic health record (EHR) could be used to assess residents' clinical and communication skills, but a mechanism for doing so has not been well described.
Objective
To design and test a secure messaging competency assessment for family medicine residents in a patient-centered medical home (PCMH).
Methods
Using the existing literature and evidence-based guidelines, we designed an assessment tool to evaluate secure messaging competency for family medicine residents training in a PCMH. Core faculty performed 2-stage validity and reliability testing (n = 2 and n = 9, respectively). A series of randomly selected EHR secure messages (n = 45) was assessed from a sample of 10 residents across all years of training.
Results
The secure message assessment tool provided data on a set of competencies and a framework for resident feedback. Assessment rated 10% (n = 2) of messages at the novice level, 50% (n = 10) as progressing, and 40% (n = 8) as proficient. The most common deficiencies in residents' secure messages related to communication rather than clinical competencies (n = 37 [90%] versus n = 4 [10%]). Interrater reliability testing ranged from 60% to 78% agreement and 20% to 44% disagreement, with disagreement centered on interpersonal communication factors. After 2 stages of testing, the assessment using residents' secure messages was incorporated into our existing evaluation process.
Conclusions
Assessing family medicine residents' secure messaging for patient encounters closed an evaluation gap in our family medicine program and offered residents feedback on their clinical and communication skills in a PCMH.
Introduction
Electronic communication (e-communication) between patients and providers has become a standard part of primary care.1–4 An expanding literature describes the prevalence, nature, and guidelines involving e-communication between patients and providers.4–9
Secure messaging through the electronic health record (EHR) and nonsecure e-mail are the primary methods of e-communication. Although physician informatics training has been reported since the 1990s, there is sparse literature on the assessment of secure messaging in family medicine residencies.10–14 Moreover, no such assessment exists for residents training in a patient-centered medical home (PCMH).
Our sponsoring institution, Group Health (GH), an integrated care and health plan system, began to use EHRs in 2003 and developed an EHR secure message portal between patients and providers in 2008 as part of its PCMH rollout.2,15,16 The GH primary care clinic visit provider schedule includes time for e-communication, and by 2010, 60% of primary care patient care encounters occurred via electronic visits.17 The experience in the family medicine residency program was similar; by 2012, 50% of our residents' clinic encounters were electronic visits.17 Yet we lacked a tool to systematically assess these encounters. Consequently, we identified a critical need for a tool to assess, and provide feedback on, this type of patient care. Here we describe our experience with the design, testing, and implementation of a novel secure messaging assessment for faculty training residents in a PCMH.
Methods
Setting and Participants
The GH Family Medicine Residency is an 18-resident (6-6-6), community-based, urban program affiliated with the University of Washington School of Medicine. Our residency clinic visit schedule is identical to that of GH nonresident physicians and includes scheduled time for face-to-face, telephone, and secure-message visits. Residents provide care for their patient panel throughout residency and utilize the EpicCare EHR for secure messaging (Epic Systems Corp). Core faculty (n = 9) are medical and behavioral health providers in the GH system.
Intervention
Utilizing the existing literature, we designed an assessment for secure message encounters that evaluates residents against clinical and communication criteria. The criteria are a synthesis of the American Medical Informatics Association guidelines, the guidelines of Prady et al,7 and the GH Clinical Guidelines (Carl G. Morris, MD, MPH, unpublished data, March 2010). The figure describes the assessment's content. We considered secure messaging to be the exchange of e-communication between patients and physicians through an EHR secure portal.
Our initial tool asked faculty to indicate the presence (yes, no, or not applicable [N/A]) of clinical and communication components for each message, and an open-ended field was provided for comments and a qualitative summary of all messages. The tool was subsequently modified to improve feedback and to reduce completion time. In the final version, faculty evaluators provided formative feedback for each message, a summative statement, and a rating of overall competency. Competency for each secure message was rated on a 3-level modified Dreyfus scale: novice, progressing, and proficient. The summative competency assessment scale for all reviewed messages is described in the figure.
Outcome Measures
To measure the validity and reliability of the tool, core faculty participated in 2 testing stages. Stage 1 involved 2 faculty investigators who independently evaluated 10 randomly selected resident messages from 2 third-year residents using the initial iteration. The 2 investigators then compared assessments for content validity and interrater reliability, discussed their reviews, and modified the tool. This was repeated with 15 and 20 additional randomly selected messages from 10 residents across all years of training (using the second and final iterations of the tool, respectively).
Stage 2 evaluated interrater reliability among all core faculty physicians during existing faculty development sessions. Using the tool and assessment criteria, faculty assessed messages that were scored as either proficient (model message) or novice (needs improvement) from stage 1 testing. Agreement between scores was used to measure scale reliability and to create a norm for the process of resident feedback.
An exemption for the use of human subjects was granted by the Group Health Human Subjects Division.
Analysis
Content validity and interrater reliability were assessed by comparing the message components identified and the competency scores.18 Data are presented as frequencies and proportions. Descriptive analysis was performed using SAS Enterprise Guide software version 6.1 (SAS Institute Inc).
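The descriptive analysis reduces to simple frequencies, proportions, and percent agreement between raters. A minimal sketch of these calculations, written in Python rather than the SAS Enterprise Guide workflow used in the study and using hypothetical rater data for illustration, might look like this:

```python
from collections import Counter

# Hypothetical paired competency ratings for the same set of secure messages,
# one list per faculty rater (levels: novice / progressing / proficient).
rater_a = ["proficient", "progressing", "novice", "progressing", "proficient"]
rater_b = ["proficient", "novice", "novice", "progressing", "progressing"]

# Frequency and proportion of each competency level (rater A shown).
counts = Counter(rater_a)
total = len(rater_a)
for level, n in counts.items():
    print(f"{level}: n = {n} ({n / total:.0%})")

# Simple percent agreement between the two raters, as reported here for
# interrater reliability (not a chance-corrected statistic such as kappa).
agreed = sum(a == b for a, b in zip(rater_a, rater_b))
print(f"Agreement: {agreed / total:.0%}; disagreement: {(total - agreed) / total:.0%}")
```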
Results
During stage 1, the tool's content matched the assessment criteria shown in the figure, supporting content validity. After competency scales were inserted into the final iteration of the tool, we scored 10% of messages (n = 2) as novice, 50% (n = 10) as progressing, and 40% (n = 8) as proficient.
The table describes the most common deficiencies in resident secure messages and interrater reliability testing for 41 valid secure messages sent by 10 residents. The most common deficiencies were communication errors (90%) rather than clinical errors (10%). Interrater agreement in both stages ranged from 60% to 78% of messages reviewed, and disagreement ranged from 20% to 44% of messages, with greater disagreement for messages in the needs improvement category. Differences in faculty members' competency ratings were due to disagreement about appropriate message length and style rather than clinical content.
Feedback from the secure messaging assessment tool was integrated into residents' semiannual evaluation. We built the assessment in our residency management software and administered it prior to the residents' semiannual evaluation. The resident's faculty advisor was given access to the resident's EHR inbox and instructed to assess 10 secure messages. The advisor reported the assessment results as part of the resident's summative review and shared them with the resident during the advisor-advisee meeting. Core faculty reported that the tool was easily integrated into the resident review process, took 15 to 20 minutes to complete, and provided assessment of an important aspect of patient care that was not previously evaluated.
Discussion
We believe this is the first tool to assess family medicine residents' e-communication in a PCMH; it also highlights the importance of using new modes of assessment as primary care practice continues to transform and new technologies are implemented.4,19,20 Assessing residents' clinical encounters conducted through the EHR secure messaging function allowed us to close an evaluation gap in a PCMH training model. Faculty time commitment was feasible because we used both existing faculty development time and resident evaluation processes. Since incorporating the tool, we have achieved a 100% completion rate.
Our experience with faculty discordance when evaluating “needs improvement” secure messages highlights the preliminary nature of our assessments and the need for additional reliability testing. Comparing our tool with other interpersonal communication assessments in our program, and with future publications on e-communication, will help us continue to refine it.
We acknowledge the limitations of a tool designed and tested in a single residency program, but we anticipate that our assessment will help residencies begin to evaluate resident competency in e-communication as it becomes a standard part of clinical practice. Other limitations include the challenge of defining secure messaging competency and of achieving agreement on communication styles. Future iterations of this tool will help create improved definitions of secure messaging competency.
Conclusion
We designed a novel assessment tool for residency programs to monitor residents' electronic patient encounters. This tool can be incorporated into existing assessment systems with little impact on faculty effort and other resources.
References
Author notes
Funding: The authors report no external funding source for this study.
Competing Interests
Conflict of interest: The authors declare they have no competing interests.
This research was presented at the American Academy of Family Physicians Workshop for Directors of Family Medicine Residencies, Kansas City, Missouri, April 5–7, 2013, and at the Society of Teachers of Family Medicine 46th Annual Spring Conference, Baltimore, Maryland, May 1–5, 2013.
The authors would like to thank Laurel Woods, MD, and Paul Ford, MA, for their review and feedback.