Background

A portable electronic method of providing instructional feedback and recording an evaluation of resident competency immediately following surgical procedures has not previously been documented in obstetrics and gynecology.

Objective

This report presents a unique electronic format that documents resident competency and encourages verbal communication between faculty and residents immediately following operative procedures.

Methods

The Microsoft Tag system and the SurveyMonkey platform were linked by a 2-D QR code created with the Microsoft QR code generator. Each resident was given a unique code (TAG) embedded on an ID card. An evaluation form was attached to each resident's file in SurveyMonkey. Postoperatively, the supervising faculty member scanned the resident's TAG with a smartphone and completed the brief evaluation on the phone's screen. The evaluation was reviewed with the resident and automatically submitted to the resident's educational file.

Results

The evaluation system was quickly accepted by residents and faculty. Of 43 residents and faculty in the study, 38 (88%) responded to a survey 8 weeks after institution of the electronic evaluation system. Thirty (79%) of the 38 indicated it was superior to the previously used handwritten format. The electronic system also showed greater utilization than the paper evaluations, with a mean of 23 electronic evaluations submitted per resident during a 6-month period versus 14 paper assessments per resident during the same 6-month period the previous year.

Conclusions

This streamlined portable electronic evaluation is an effective tool for direct, formative feedback for residents, and it creates a longitudinal record of resident progress. Satisfaction with, and use of, this evaluation system was high.

What was known

Educators have sought efficient approaches for busy faculty to assess and offer feedback on residents' operative skills.

What is new

A streamlined portable electronic tool assessed operative skills and facilitated feedback to obstetrics-gynecology residents.

Limitations

Single-program sample limits generalizability; some faculty experienced problems using the electronic tool; the satisfaction survey was not validated.

Bottom line

Portable electronic evaluation is an effective tool for formative feedback and creates a longitudinal record of residents' progress.

Editor's Note: The online version of this article contains the CREOG-based Focused Assessment of Competency, a resident's evaluation on SurveyMonkey, a photo of a smartphone scanning a resident's TAG, and the faculty survey to be completed by the resident.

Finding the optimal method for assessing the surgical skills of residents and offering them feedback has posed a challenge for decades. Early work focused on simply transferring adult learning principles to the operating room.1 Later, more comprehensive studies emphasized establishing objective criteria to accurately and reliably document the full range of activities occurring immediately before, during, and after a surgical procedure.2,3 Recent initiatives have added simulation laboratories and videos of operations to supplement intraoperative teaching and the evaluation of operative skills.4,5

In June 2003, the Council on Resident Education in Obstetrics and Gynecology (CREOG) Competency Task Force posted recommendations for evaluating the surgical skills of obstetrics and gynecology residents via a surgery-focused assessment of competency (S-FAC).6 The recommendations consisted of evaluating the resident's performance during a surgical procedure on specific points related to positioning the patient, sterile technique, preparing for the procedure, interacting with all members of the operative team, and technical performance (provided as online supplemental material).

Although the metrics were well selected, the methods of recording performance information, adequately reviewing the assessment with the resident, and properly filing the report in the resident's educational folder vary from institution to institution. Beyond the instruction that occurs during the operation, the resident may receive a thorough face-to-face review of the competencies from the faculty member after the procedure, or he or she may receive no postoperative discussion at all. A portable electronic format that requires direct interaction between faculty and resident immediately after the procedure could provide a more effective assessment of surgical competencies than an evaluation performed well after the operation has concluded. The aim of this study was to judge the feasibility and acceptability of a novel electronic system for evaluating surgical skills.

Setting and Participants

This study was conducted at Louisiana State University Health Sciences Center (LSUHSC) in New Orleans, a major public medical teaching facility consisting of the Schools of Medicine, Dentistry, Public Health, Nursing, and Allied Health. The obstetrics-gynecology residents rotate to a public hospital and a private hospital in New Orleans and to a public hospital in Lafayette, Louisiana. Full-time obstetrics-gynecology faculty members supervise residents' surgeries at the 3 inpatient locations. General obstetrics and gynecology, maternal-fetal medicine, urogynecology, reproductive endocrinology, and gynecologic oncology sections are represented.

During the 6-month assessment of the portable electronic surgical evaluation system, the Department of Obstetrics and Gynecology at LSUHSC comprised 25 residents and 18 full-time faculty members. For the 6-month evaluation of the paper format the previous year, there were 26 residents and 18 full-time faculty members.

Intervention

The Microsoft Tag system and the SurveyMonkey platform were linked by a 2-D QR code designed with the Microsoft QR code generator. Each of the 25 obstetrics-gynecology residents was given a unique code (TAG) embedded on an ID card to be worn with the resident's institutional ID badge.7 A streamlined evaluation based on the CREOG Focused Assessment of Competency Task Force recommendations was developed and linked to a file unique to each resident on SurveyMonkey (provided as online supplemental material). The abbreviated electronic form was agreed on by the principal investigator, the program director, the department chair, the department's research coordinator, and resident leaders from the postgraduate year–2 and postgraduate year–4 levels. The group decided that the final version of the electronic evaluation form would contain elements from the major CREOG competencies.
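For illustration, the badge-generation step can be sketched in a few lines of Python. This is a minimal sketch, not the authors' implementation: it substitutes the open-source qrcode package for the Microsoft QR code generator named above, and the resident identifiers and SurveyMonkey URLs are hypothetical placeholders.

```python
# Minimal sketch of badge generation, assuming the open-source "qrcode"
# package in place of the Microsoft QR code generator used in the study.
import qrcode

# Hypothetical mapping of residents to their unique SurveyMonkey
# evaluation URLs (one file per resident).
resident_survey_urls = {
    "resident_01": "https://www.surveymonkey.com/r/EXAMPLE01",
    "resident_02": "https://www.surveymonkey.com/r/EXAMPLE02",
}

for resident_id, url in resident_survey_urls.items():
    # Each badge encodes that resident's personal evaluation URL, so
    # scanning the TAG opens the correct form directly on the phone.
    qrcode.make(url).save(f"{resident_id}_badge_tag.png")
```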

Surgeries included all procedures performed by obstetrics-gynecology residents in an operating room setting. At the conclusion of the surgery, the supervising faculty member scanned the resident's TAG with a smartphone (photo provided as online supplemental material). The resident's unique evaluation page in SurveyMonkey appeared on the smartphone screen. The faculty member then completed the brief evaluation by touching or typing on the screen, immediately reviewing each point of the evaluation with the resident. Once the face-to-face encounter was complete, the evaluation was closed and automatically submitted electronically to the resident's educational file in SurveyMonkey. Every full-time faculty member was also given a unique TAG on an ID card. After the face-to-face feedback session, the resident scanned the faculty member's TAG and anonymously answered 5 yes or no questions about the faculty member's interaction with the resident during the surgical procedure (provided as online supplemental material). The questionnaire was then closed and submitted electronically to the individual faculty member's file. The purpose of the faculty questionnaire was to provide helpful feedback to the faculty member and to gather system use statistics; there was no formal resident evaluation of faculty performance. Each month, residents and faculty received a summary printout of their assessments and questionnaires for review.
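To make the data flow concrete, the sketch below models the two records produced after each case. It is a hedged illustration only: the field names are assumptions inferred from the CREOG domains described earlier, and the actual form items appear in the online supplemental material.

```python
# Sketch of the two per-case records; field names are assumptions
# inferred from the CREOG domains, not the actual SurveyMonkey items.
from dataclasses import dataclass
from datetime import date
from typing import Tuple

@dataclass
class ResidentEvaluation:
    """Completed by the supervising faculty member on the smartphone."""
    resident_id: str
    faculty_id: str
    procedure: str
    case_date: date
    # Hypothetical ratings for the CREOG-derived competency domains.
    patient_positioning: int
    sterile_technique: int
    procedure_preparation: int
    team_interaction: int
    technical_performance: int
    comments: str = ""

@dataclass
class FacultyQuestionnaire:
    """Completed anonymously by the resident after the feedback session."""
    faculty_id: str
    case_date: date
    # Five yes/no items about the faculty member's intraoperative
    # interaction; the exact wording is in the supplemental material.
    answers: Tuple[bool, bool, bool, bool, bool]
```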

Analysis

An anonymous, nonvalidated satisfaction survey was administered 8 weeks after instituting the electronic S-FAC system. All obstetrics-gynecology residents and full-time faculty were included. Questions addressed how the electronic format compared with the previous handwritten forms, satisfaction with the electronic format, the value of continuing the system, and the time required to complete the electronic evaluation. To determine the acceptance and functionality of the new electronic evaluation system, utilization rates during its first 6 months (August 2012 through January 2013) were compared with a retrospective analysis of utilization rates for the paper evaluation format during the same 6-month period the previous year (August 2011 through January 2012).

The LSUHSC–New Orleans Institutional Review Board reviewed this study and granted it exempt status.

Results

This evaluation system was quickly accepted by the residents and faculty. Of the 38 participating residents and faculty who responded to the initial satisfaction survey, 30 (79%) indicated this evaluation tool was more satisfactory than the previously used handwritten S-FAC format; 32 (83%) stated it provided improved educational benefit; and 33 (86%) saw value in continuing this form of resident evaluation. Overall, 33 respondents (86%) were satisfied or very satisfied with the format. Although the length of the face-to-face evaluation varied with the circumstances of the operation, the time to complete the SurveyMonkey portion on the smartphone was reported to be less than 1 minute by 6 of the 14 responding faculty (43%) and less than 2 minutes by 11 (78%).

During the initial 6-month period of electronic evaluation use, a total of 583 focused surgical assessments of competency were performed for 25 obstetrics-gynecology residents; during the same 6-month period the previous year, 361 paper evaluations were filed for 26 residents. That represents a mean of 23 electronic evaluations per resident versus 14 paper assessments per resident, a 64% increase.
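Those figures can be reproduced directly from the reported totals; the short Python check below uses only the numbers in this paragraph.

```python
# Recompute the per-resident means and percent increase from the totals.
electronic_total, electronic_residents = 583, 25
paper_total, paper_residents = 361, 26

electronic_mean = electronic_total / electronic_residents  # 23.3, reported as 23
paper_mean = paper_total / paper_residents                 # 13.9, reported as 14

# The 64% figure follows from the rounded means (23 vs 14); the
# unrounded means give an increase of about 68%.
print(f"rounded: {(23 - 14) / 14:.0%}")                               # 64%
print(f"exact:   {(electronic_mean - paper_mean) / paper_mean:.0%}")  # 68%
```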

The total cost of instituting the project was $108, spent to commercially laminate the resident and faculty ID badges. The LSUHSC Department of Obstetrics and Gynecology had previously paid a subscription fee for SurveyMonkey, so there was no additional cost for this project. The Microsoft Tag smartphone app is free. All work was performed by residents and faculty in their off time, and there was no paid administrative assistance. The general concept of this electronic evaluation was approved by the design team on July 1, 2012. All aspects of the electronic evaluation format were finalized, residents and faculty were briefed on system usage, and the first electronic evaluation of resident surgical performance was logged on August 1, 2012, only 1 month after the initial discussion of the concept. Summary results were downloaded from SurveyMonkey and e-mailed to each resident and faculty member monthly by a member of the residency support staff. That activity required approximately 30 minutes each month.

Discussion

The most important finding from this study was the satisfaction with, and increased use of, the portable electronic evaluation system, with its immediate face-to-face educational component, compared with the less utilized paper evaluation format. The earlier focused assessment of competency for resident-performed surgical procedures consisted of a single typed page listing 20 CREOG-designated competencies, as well as an opinion concerning the resident's ability to perform the procedure with minimal or no supervision (provided as online supplemental material). Residents were instructed to hand the supervising faculty member a blank paper evaluation form at the conclusion of an operation. At a later date, the faculty member would complete the evaluation, with no requirement for any discussion with the resident concerning his or her performance during the surgery. When convenient, the faculty member returned the form to the resident for review, after which the resident was to take the form to the residency director for placement in his or her educational file. A breakdown could occur at several steps in this process, preventing the paper assessment form from reaching the resident's educational file. Furthermore, there was no requirement for a formal, instructional face-to-face discussion of the resident's performance at any time following the operation.

Other published methods of evaluating residents' surgical performance include end-of-rotation summary paper evaluations of broad parameters ranging from operative performance to conference attendance,8 6-month summary assessments of general technical abilities,9 extensive paper evaluations tailored to each type of operative procedure,10 and Internet-based evaluation logs, either on a local website where a form can be accessed and kept in residents' departmental files,11 or uploaded directly into New Innovations,12 a company specializing in secure, centralized Internet databases for medical education programs. Unlike the format in the present study, none of these evaluation formats is portable, is completed immediately following the operation, or requires a face-to-face, 2-way interaction between faculty and resident that combines a focused assessment with an opportunity to enhance the educational experience.

Using a smartphone to scan the TAGs on the ID badges postoperatively and to complete the electronic evaluation form took less than 2 minutes for 78% (11 of 14) of faculty. It was not necessary for every member of the surgical team to have his or her own smartphone; because any smartphone could be used, one was borrowed if the individual faculty member or resident did not have one. At least 1 smartphone was available in each operating room throughout the evaluation period.

The increase in the quantity of evaluations is likely due in part to the convenience of the electronic format, but the quality of the feedback may have been affected both by that convenience and by the use of questions that differed from those on the paper form.

Use of the tool resulted in a substantial increase in the rate of completed evaluations compared with the previously used paper format. The LSUHSC–New Orleans School of Medicine has chosen to study this evaluation method on an institution-wide basis, and it is being offered to all surgical departments on a voluntary basis.

This study has several limitations. Some faculty members expressed discomfort with using the electronic format because it represented unfamiliar technology, and their personal evaluation rates dropped during the study period. Although our electronic assessment tool was based on the well-vetted CREOG evaluation format, we did not examine the interrater reliability of faculty assessments using the tool. Finally, the survey to assess satisfaction with the tool was not validated.

The portability and accessibility of the QR reader, used in conjunction with a streamlined electronic survey, provide a valuable tool for direct, formative feedback at the time of a surgical procedure, as well as an electronic record for longitudinal comparison of resident progress. Satisfaction with, and utilization of, the new evaluation system were high among residents and faculty.

References

1. Reznick RK. Teaching and testing technical skills. Am J Surg. 1993;165(3):358–361.
2. Mandel LP, Lentz GM, Goff B. Teaching and evaluating surgical skills. Obstet Gynecol. 2000;95(5):783–785.
3. Lentz GM, Mandel LS, Lee D, Gardella C, Melville J, Goff BA. Testing surgical skills of obstetric and gynecologic residents in a bench laboratory setting: validity and reliability. Am J Obstet Gynecol. 2001;184(7):1462–1467.
4. Moorthy K, Munz Y, Sarker S, Darzi A. Objective assessment of technical skills in surgery. BMJ. 2003;327(7422):1032–1037.
5. Diwadkar GB, van den Bogert A, Barber MD, Jelovsek JE. Assessing vaginal surgical skills using video motion analysis. Obstet Gynecol. 2009;114(2):244–251.
6. American Congress of Obstetricians and Gynecologists. CREOG Competency Presentations.
7. Resident Evaluation X-press: Portable Resident Evaluation.
8. Texas Tech University Health Sciences Center. http://www.ttuhsc.edu. Accessed August 4, 2014.
9. Weill Cornell Medical College, Department of Obstetrics & Gynecology. http://www.cornellobgyn.org. Accessed August 4, 2014.
10. The American Board of Surgery. http://www.absurgery.org. Accessed August 4, 2014.
11. Larson JL, Williams RG, Ketchum J, Boehler ML, Dunnington GL. Feasibility, reliability and validity of an operative performance rating system for evaluating surgery residents. Surgery. 2005;138(4):640–649.
12. New Innovations Inc. http://www.new-innov.com/pub/. Accessed August 25, 2014.

Author notes

All authors are with the Department of Obstetrics and Gynecology, Louisiana State University Health Sciences Center. Kellin Reynolds, MD, is a Resident; Danny Barnhill, MD, is Chief, Gynecologic Oncology; Jamie Sias, MD, is a Resident; Amy Young, MD, is Chair; and Florencia Greer Polite, MD, is Residency Program Director.

Funding: The authors report no external funding source for this study.

Conflict of interest: The authors declare they have no competing interests.
