ABSTRACT
The Family Medicine (FM) Milestones are competency-based assessments of residents in key dimensions relevant to practice in the specialty. Residency programs use the milestones in semiannual reviews of resident performance from the time of entry into the program to graduation.
Using a national sample, we investigated the relationship of FM competency-based assessments to resident progress and the complementarity of milestones with knowledge-based assessments in FM residencies.
We used midyear and end-of-year milestone ratings for all FM residents in Accreditation Council for Graduate Medical Education–accredited programs during academic years 2014–2015 and 2015–2016. The milestones contain 22 items across 6 competencies. We created a summative index across the milestones. The American Board of Family Medicine database provided resident demographics and in-training examination (ITE) scores. We linked this information to the milestone data.
The sample encompassed 6630 FM residents. The summative milestone index increased, on average, for each cohort (postgraduate year 1 [PGY-1] to PGY-2 and PGY-2 to PGY-3) at each assessment. The correlation between the milestone index that excluded the medical knowledge milestone and ITE scores was r = .195 (P < .001) for the PGY-1 to PGY-2 cohort and r = .254 (P < .001) for the PGY-2 to PGY-3 cohort. For both cohorts, ITE scores and composite milestone assessments were higher for residents who advanced than for those who did not.
Competency-based assessment using the milestones for FM residents seems to be a viable multidimensional tool to assess the successful progression of residents.
There is interest in correlating milestone ratings with other objective assessments of performance such as the in-training examination (ITE).
A national study of milestone and ITE data showed that ITE performance correlated only weakly with milestone ratings. Milestone ratings and ITE scores were higher for residents who progressed in their program compared with those who did not.
The single-specialty design limits generalizability.
The Family Medicine Milestones, excluding medical knowledge, measure other key dimensions of physician competence.
Introduction
A number of health professions have adopted a focus on competency-based education.1,2 Many undergraduate and postgraduate medical education programs are embracing competency-based medical education,3–5 as it is envisioned to better equip physicians with the skills for practice in a changing health care landscape, compared with time- or task-based education. This shift in educational philosophy, framework, and expectations has led to considerable innovation in curricula, along with challenges for competency-based assessment.6
In 1999, the Accreditation Council for Graduate Medical Education (ACGME) began its Outcome Project.1 The initial 6 competencies ultimately led to the creation of specialty-specific milestones for each competency that describe expectations for entry-level, intermediate, and graduating residents.7 According to the ACGME Advisory Committee on Educational Outcome Assessment, milestones “describe, in behavioral terms, learning and performance levels residents are expected to demonstrate for specific competencies by a particular point in residency education.”8,9 The Family Medicine (FM) Milestones provide a framework for assessing the development of resident physicians in key dimensions of physician competence in the specialty.10 Each group of related milestones includes an introductory statement that describes the specific emphasis of FM within that competency.
Studies of internal medicine and pediatrics residents have found that milestone ratings differ by residents' training year.11,12 Clinical knowledge assessments using the Internal Medicine Milestones tend to positively correlate with scores on the board certification examination for postgraduate year 3 (PGY-3) residents.13 The American Board of Family Medicine (ABFM) in-training examination (ITE) is designed to measure medical knowledge and clinical decision-making ability in FM.
The purpose of this study was to investigate the relation of FM competency-based milestone assessments to resident progress and to investigate the relation of milestones to a knowledge-based assessment (the FM ITE) in a national sample.
Methods
We used midyear and end-of-year milestone ratings for all FM residents in ACGME-accredited programs in academic years 2014–2015 and 2015–2016. The milestones contain 22 items across 6 core competencies: patient care (5 items), medical knowledge (2 items), systems-based practice (4 items), practice-based learning and improvement (3 items), professionalism (4 items), and interpersonal and communication skills (4 items). Each resident is assigned a value from 0 to 5 for each item at each evaluation.14 A score of 0 indicates “not achieved level 1,” and half-point ratings are possible if a resident's performance is a mixture of behaviors from 2 adjacent levels. For this study, we used resident demographics and ITE scores provided by the ABFM administrative database. The ITE is administered in the fall of each year, is based on the blueprint of the ABFM certification examination, and is reported on a scaled score of 200 to 800. There is no passing score, as the ITE is intended to be a formative evaluation of a resident's current medical knowledge.
The American Academy of Family Physicians Institutional Review Board approved this study.
We restricted our analysis to residents who could be matched with ABFM administrative data, had ITE scores in 2014, and had milestone ratings available for all 4 assessment periods.
At each milestone rating, resident training year is documented by the program. We created a variable indicating advancement if, at the first rating in 2015–2016, the resident had moved from PGY-1 to PGY-2 or from PGY-2 to PGY-3. Residents remaining at PGY-1 or PGY-2 were classified as not advancing.
Rather than analyzing each milestone rating separately, we summed the scores of all milestones at each time point to create an overall score. We opted for this approach because all milestones are considered important for a resident's development, and the assessor is not instructed to weight any milestone more heavily than another. Thus, an overall milestone index was deemed to represent the summed assessment of the resident's progress. Milestone MK-1 (medical knowledge) contains a built-in dependence on ITE scores, which restricts the range of scores for this milestone compared with other milestones. To complete our correlational analysis between the summative index and ITE scores, we computed a restricted summative index that excluded the score for the medical knowledge milestone. This ensured that we would not correlate the medical knowledge milestone assessment with the medical knowledge represented by the ITE score.
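All analyses were performed in SAS; purely as an illustration, the advancement flag and the 2 indices could be derived as in the following sketch, which assumes a hypothetical table with one row per resident per assessment and illustrative column names (the actual ACGME/ABFM data layout is not described here).

```python
import pandas as pd

# Illustrative column names for the 22 milestone items (not the actual data layout).
MILESTONE_ITEMS = (
    ["PC1", "PC2", "PC3", "PC4", "PC5"]        # patient care (5 items)
    + ["MK1", "MK2"]                           # medical knowledge (2 items)
    + ["SBP1", "SBP2", "SBP3", "SBP4"]         # systems-based practice (4 items)
    + ["PBLI1", "PBLI2", "PBLI3"]              # practice-based learning and improvement (3 items)
    + ["PROF1", "PROF2", "PROF3", "PROF4"]     # professionalism (4 items)
    + ["ICS1", "ICS2", "ICS3", "ICS4"]         # interpersonal and communication skills (4 items)
)

def add_indices(ratings: pd.DataFrame) -> pd.DataFrame:
    """Add the summative milestone index, the restricted index (MK-1 excluded),
    and the advancement flag to a per-resident, per-assessment table."""
    out = ratings.copy()
    out["milestone_index"] = out[MILESTONE_ITEMS].sum(axis=1)
    out["restricted_index"] = out[[c for c in MILESTONE_ITEMS if c != "MK1"]].sum(axis=1)
    # Advancement: the PGY recorded at the first 2015-2016 rating exceeds the PGY
    # recorded during 2014-2015 (hypothetical columns "pgy_2015_first" and "pgy_2014").
    out["advanced"] = out["pgy_2015_first"] > out["pgy_2014"]
    return out
```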
We used descriptive statistics to characterize the cohorts. For categorical variables, we used chi-square tests. Age was not normally distributed, so we used nonparametric Wilcoxon rank sum 2-sample tests to determine differences between cohorts. We then used t tests to determine differences in milestone index and ITE scores between advancers and nonadvancers in each cohort and at each assessment period. To assess differences in milestone index scores between assessment periods within each cohort, we used paired t tests between 2 consecutive assessments. Because of the disparity in sample sizes between advancers and nonadvancers, we evaluated the equality of variance in the t tests. In the PGY-1 to PGY-2 cohort, the equality of variance assumption was not fulfilled (P = .009), and we report the t test results obtained with the Satterthwaite method rather than the pooled method. In the PGY-2 to PGY-3 cohort, the equality of variance assumption was fulfilled (P = .81), and we report the t test results obtained with the pooled method.
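As a sketch only (the study used SAS), the choice between the pooled and Satterthwaite (Welch) methods could be expressed as follows; Levene's test stands in here for the folded F test that SAS reports, and the array names are hypothetical.

```python
import numpy as np
from scipy import stats

def compare_advancers(advanced: np.ndarray, not_advanced: np.ndarray, alpha: float = 0.05) -> dict:
    """Two-sample t test on milestone index or ITE scores: pooled when variances
    appear equal, Satterthwaite (Welch) otherwise."""
    _, p_equal_var = stats.levene(advanced, not_advanced)  # stand-in for the SAS folded F test
    pooled = p_equal_var > alpha
    t, p = stats.ttest_ind(advanced, not_advanced, equal_var=pooled)
    return {"p_equal_var": p_equal_var, "pooled": pooled, "t": t, "p": p}

# Within-cohort change between 2 consecutive assessments (paired t test):
# t_paired, p_paired = stats.ttest_rel(index_first_assessment, index_next_assessment)
```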
Finally, we calculated Pearson correlations between the restricted summative milestone index from the second 2014–2015 academic year rating (June 2015) and ITE scores from October 2015. SAS version 9.4 (SAS Institute Inc, Cary, NC) was used for all analyses.
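This final step could be sketched as follows, assuming 2 arrays aligned by resident (names and function hypothetical): the restricted index from the June 2015 rating and the October 2015 ITE scaled score.

```python
import numpy as np
from scipy import stats

def milestone_ite_correlation(restricted_index: np.ndarray, ite_scores: np.ndarray):
    """Pearson correlation between the restricted milestone index (June 2015 rating)
    and ITE scaled scores (October 2015), computed separately for each cohort."""
    return stats.pearsonr(restricted_index, ite_scores)
```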
Results
We started with 6966 residents with milestone ratings and were able to match 6964 (99.9%) with corresponding ABFM data. We excluded 334 residents with incomplete milestone data or missing ITE scores, for a final sample of 6630 (95.2%). In the PGY-1 to PGY-2 cohort, 57 of 3353 residents (1.7%) did not advance; in the PGY-2 to PGY-3 cohort, 78 of 3277 residents (2.4%) did not advance. We found no significant differences in age, sex, degree type (doctor of medicine versus doctor of osteopathy), or international medical graduate status between residents who advanced and those who did not advance (table 1).
The summative milestone index increased significantly on average for each cohort (PGY-1 to PGY-2 and PGY-2 to PGY-3) at each assessment (figure). For example, between December 2014 and June 2016, the PGY-2 advancing cohort increased from 58.1 to 85.7 (P < .0001), while the nonadvancing cohort increased from 53.4 to 81.4 (P < .0001). At each assessment, the scores of the group who did not advance were significantly lower than those of the group who did: differences ranged from 5.8 to 8.0 for the PGY-1 to PGY-2 cohort and from 3.2 to 4.2 for the PGY-2 to PGY-3 cohort (P < .0001 for all comparisons). Compared with the group who did not advance, the ITE scores of the group who did advance were 30 and 19 points higher, respectively, for the 2 cohorts (table 2).
The correlation between the restricted milestone index and ITE scores was r = .195 (P < .001) for the PGY-1 to PGY-2 cohort and r = .254 (P < .001) for the PGY-2 to PGY-3 cohort.
Discussion
This national study of FM residents demonstrates that milestone ratings increase with resident year and that the non–knowledge-based competencies have a relatively low correlation with FM ITE scores. Thus, as intended, the FM Milestones appear to measure competencies other than medical knowledge, as indicated by the small correlations between milestone ratings and ITE scores.
Competency-based assessment expands the scope of assessment beyond knowledge-based tests and is used to inform resident advancement. Our results are similar to research in pediatrics and internal medicine, in which milestone ratings increased with resident progression in training.11–13 In contrast to our findings, in internal medicine, medical knowledge measured via ITE scores correlated with milestone ratings.
There are several limitations to this study. First, although this study is nationally representative, it is based on a single medical specialty. Second, the milestones were not created to provide a summative rating. Our summated index may have inappropriately weighted each milestone equally, whereas residency directors may assign greater weight to some milestones than others when making advancement decisions. Third, only a small number of residents did not advance, leaving a limited number of individuals in whom to examine the relationship between milestones and advancement. Finally, the actual use of milestones in decisions about progress may vary by program. Some programs may have richer formative evaluations that provide objective evidence to inform milestone ratings, while others may rate residents' performance based on overall gestalt.
Future studies in this area should examine the impact of individual competency categories for advancement as well as the impact of these competency assessments on future practice quality.
Conclusion
In conclusion, competency-based assessment using the milestones for FM residents appears to be a viable multidimensional tool for assessing the successful progression of residents, with milestone scores higher for residents who progressed in their program than for those who did not.
References
Author notes
Funding: The authors report no external funding source for this study.
Competing Interests
Conflict of interest: The authors declare they have no competing interests.
These data were presented as a poster at the ACGME Annual Educational Conference, National Harbor, Maryland, February 25–28, 2016.