The American Medical Association Accelerating Change in Medical Education (AMA-ACE) consortium proposes that medical schools adopt a new 3-pillar model that incorporates health systems science (HSS) alongside the basic and clinical sciences. One of the goals of AMA-ACE was to support HSS curricular innovation to improve residency preparation.
This study evaluates the effectiveness of HSS curricula by using a large dataset to link medical school graduates to internship Milestones through collaboration with the Accreditation Council for Graduate Medical Education (ACGME).
ACGME subcompetencies related to the schools' HSS curricula were identified for internal medicine, emergency medicine, family medicine, obstetrics and gynecology (OB/GYN), pediatrics, and surgery. Analysis compared Milestone ratings of ACE school graduates to non-ACE graduates at 6 and 12 months using generalized estimating equation models.
At 6 months both groups demonstrated similar HSS-related levels of Milestone performance on the selected ACGME competencies. At 1 year, ACE graduates in OB/GYN scored minimally higher on 2 systems-based practice (SBP) subcompetencies compared to non-ACE school graduates: SBP01 (1.96 vs 1.82, 95% CI 0.03-0.24) and SBP02 (1.87 vs 1.79, 95% CI 0.01-0.16). In internal medicine, ACE graduates scored minimally higher on 3 HSS-related subcompetencies: SBP01 (2.19 vs 2.05, 95% CI 0.04-0.26), PBLI01 (2.13 vs 2.01; 95% CI 0.01-0.24), and PBLI04 (2.05 vs 1.93; 95% CI 0.03-0.21). For the other specialties examined, there were no significant differences between groups.
Graduates from schools with training in HSS had similar Milestone ratings for most subcompetencies and very small differences in Milestone ratings for only 5 subcompetencies across 6 specialties at 1 year, compared to graduates from non-ACE schools. These differences are likely not educationally meaningful.
This study evaluates the effectiveness of medical school health systems science (HSS) curricula by linking medical school graduates to internship Milestones.
Graduates from schools with training in HSS had similar Milestone ratings for most subcompetencies compared to other schools' graduates.
As each school addresses HSS content with different focus, intensity, and pedagogy, it may be difficult to measure effectiveness in residency.
This study did not find meaningful differences in ACGME Milestones. Further investigation is needed to measure the outcomes of HSS curricula taught to students in undergraduate medical education (UME).
In recent years, many US medical schools have embraced a new 3-pillar model of medical education, one that integrates the basic and clinical sciences with health systems science (HSS).1–5 The HSS pillar includes areas such as population health, health care policy, high-value care, interprofessional teamwork, quality improvement, and systems thinking.6
To determine program effectiveness, medical schools will need methods to assess the performance of learners who experience new HSS curricula. However, demonstrating long-term educational and patient outcomes from such new curricula is challenging.7–10 Most medical schools use a set of core competencies that is not fully aligned with residency programs and the Accreditation Council for Graduate Medical Education (ACGME) Milestones.11,12 This disjointed continuum creates challenges in assessing the impact of undergraduate medical curricula on learner outcomes following medical school.13
One potential assessment method may be to compare ACGME Milestone ratings addressing HSS concepts (eg, transition of care, culture of safety, quality measures, and teamwork)3 for interns who have experienced a new medical HSS curriculum vs those who have not. To date, these comparisons have not been studied. This process may provide opportunities to improve HSS teaching overall, especially for the transition from undergraduate medical education (UME) to graduate medical education (GME).
In 2013, the American Medical Association (AMA) awarded 11 medical schools approximately $1 million each in a 5-year Accelerating Change in Medical Education (ACE) grant.14,15 The goals were to promote competency-based education and assessment, develop better understanding and improve health care systems, and enhance the learning environment. Through this project, the AMA supported the consortium of 11 schools to encourage innovation and dissemination in UME. Many of the innovative curricular changes proposed by consortium medical schools focus on elements within the broad area of HSS (provided as online supplementary data).
The study objective was to evaluate the ACE consortium schools' curricular innovations through their graduates' performance on ACGME 6- and 12-month Milestone ratings for HSS subcompetencies and to compare these ratings to those of graduates from non-ACE schools. We hypothesized that graduates from ACE schools would be more advanced on the HSS-related Milestones at both 6 and 12 months into residency training.
Study Setting and Context
Seven of the AMA ACE schools (Brody School of Medicine at East Carolina University, Oregon Health & Science University School of Medicine, Penn State College of Medicine, Vanderbilt University School of Medicine, Indiana University School of Medicine, Warren Alpert Medical School of Brown University, and New York University School of Medicine) implemented specific HSS-related curricula that were hypothesized to influence the learning and skills for medical student graduates. The online supplementary data describes the HSS-related curricula of the 7 schools.
This study followed a cohort of medical school graduates from ACE consortium schools into residency (Figure). Starting in 2014, students at the 7 participating ACE schools experienced innovative HSS curricula. The cohort graduated in 2018, started residency, and were assessed on specialty-specific milestones. Graduates from HSS-ACE schools were compared to controls from non-ACE schools nested in each residency program.
Inclusion and Exclusion Criteria
Inclusion criteria for the ACE intervention group were graduates from 1 of the 7 ACE schools in 2018 who entered internal medicine, emergency medicine, obstetrics and gynecology (OB/GYN), surgery, family medicine, or pediatrics. These specialties were included because sufficient numbers of graduates entered each. Exclusion criteria for ACE graduates included taking extra time to graduate, whether because of academic, personal, or medical difficulty or for academic enhancement (eg, pursuing additional degrees). Because of such interruptions, these students may not have had full exposure to the new HSS curriculum. Non-ACE graduates who took extra time to graduate from medical school were not excluded from the control group.
Inclusion criteria for the control group were students who graduated from a non-ACE school in the same year and entered one of the 6 targeted residencies.
Outcomes and Mapping of HSS Content to ACGME Milestones
To map the HSS curricular objectives to the ACGME Milestones (Table 1; online supplementary data), the research team developed a process to determine alignment. The investigator and curricular leader from each school, content experts with first-hand knowledge of the learning objectives, mapped their school's curricula to each subcompetency and indicated (yes or no) whether the objectives/curricula were hypothesized to positively influence that subcompetency, that is, whether graduates experiencing the curriculum were expected to perform better on it than other schools' graduates (provided as online supplementary data). Thus, for each school, curricular changes were mapped to the ACGME subcompetencies for internal medicine, emergency medicine, OB/GYN, surgery, family medicine, and pediatrics. Subcompetencies were included in the analysis if at least 6 of the 7 schools mapped their HSS-related curricula to that ACGME subcompetency. The HSS-related subcompetencies are listed in the online supplementary data.
Data Analysis and Statistical Considerations
The data analysis was performed in aggregate by the ACGME using the 6-month (December 2018) and 12-month (June 2019) Milestone ratings that each residency program reports to the ACGME. Statistical analysis determined whether ACE and non-ACE (control) graduates, nested within their residency programs, differed in performance on the HSS-related Milestones. For each ACGME subcompetency, the group-mean difference in rating was compared between the ACE consortium residents and the control group. To account for potential dependency of Milestone ratings within each program, a generalized estimating equation (GEE) model,16,17 using an exchangeable covariance structure, was employed to test for differences in Milestone ratings between the ACE consortium residents and the control group. The intraclass correlation and design effect17,18 were substantial for the subcompetencies included in the current study (provided as online supplementary data), indicating the necessity of accounting for correlations among observations in the analysis. Statistical tests were conducted based on 2-sided P values. The particular list of subcompetencies selected for statistical analysis was based on a priori theoretical considerations; therefore, a correction for multiple comparisons was not indicated. For example, in emergency medicine, this allowed us to eliminate 19 of the 23 subcompetencies from further analysis, reducing the risk of over-interpretation arising from multiple comparisons.
This study was approved by the American Institutes for Research Institutional Review Board.
Table 2 shows the comparison of graduates of ACE schools vs non-ACE schools, for each specialty, on the 6- and 12-month HSS-related Milestone subcompetencies. The analysis of the 6-month HSS-related subcompetencies, across the 6 specialties, demonstrated a statistically significant difference between ACE and non-ACE graduates on only 1 HSS subcompetency, in OB/GYN. There was minimal variation in Milestone ratings at 6 months, which may account for the inability to detect differences.
In the 12-month analysis, differences emerged in only 2 specialties, OB/GYN and internal medicine, where graduates of ACE schools scored higher than non-ACE graduates on some HSS-related subcompetencies (2 of 4 OB/GYN and 3 of 5 internal medicine HSS-related subcompetencies). For these specialties, the difference in mean Milestone ratings between ACE and non-ACE cohorts ranged from 0.08 to 0.14 per subcompetency. There was no difference between ACE and non-ACE graduates in the other specialties.
This first study to compare 6- and 12-month intern Milestone ratings of graduates from schools with new HSS curricula to those of graduates without such curricula found no differences in subcompetency ratings at 6 months and trivial differences in some subcompetencies at 12 months in just 2 of the specialties. This strategy, which combines a consortium of medical schools, large ACGME datasets, and longitudinal follow-up, may be a model for examining curricular changes across the continuum of UME and GME.
There are several possible explanations for these largely null findings, and this study provides opportunities for further investigation and development of HSS across the UME-GME continuum. The findings may represent a true lack of effect from the new curricula, but they may also reflect other considerations, such as variability in curricular content, faculty, and other contextual issues.
Variability in Curricula
Each school addresses HSS content with its own focus, intensity, and pedagogy (provided as online supplementary data), which limits uniformity of HSS education across all studied medical schools. This study shows how difficult it is to implement and measure the effect of the HSS cross-institutional curriculum reform at scale. Further, many of the schools have only begun to evaluate the curriculum and for the most part have not focused their evaluations on assessment of achieving the HSS learning objectives or the impact on students, patients, or the health system. As HSS assessment and instruction develops further, including the second edition of the Health Systems Science textbook,6 the National Board of Medical Examination HSS examination,19 and faculty development sessions,20 there is opportunity for HSS content alignment between schools and into residency. As schools mature their HSS curriculum and assessments, future work might directly link performance on HSS assessments to residency subcompetencies.
The Problem of Transfer
Transfer of knowledge and skills is an important issue in higher education. What is learned in one setting or context, such as medical school, may not transfer to the next phase of training. For example, if students are taught cost-conscious care using case-based learning or standardized patients, there is no guarantee that the knowledge and skill will transfer to clinical practice as an intern. Further, the complex nature of HSS and the importance of differences in context may make transfer additionally problematic. Therefore, the limited differences found in this study may reflect the difficulty of transferring what was learned in medical school to internship. This problem is compounded by the rapid learning of internship and the shift to navigating new systems and roles, such that interns may not have the time or opportunity to apply, much less excel in, the HSS knowledge and skills. The residency context may also affect transfer; it is not known to what extent an environment with different priorities and higher stakes alters the application of HSS learned in medical school. Perhaps the OB/GYN and internal medicine contexts are more similar to the medical school setting, facilitating transfer of UME HSS learning to GME.
It is important to remember that the medical schools' curricula have their own objectives and competencies. While ideally these skills, knowledge, and attitudes transfer with the graduates to residency, there are not direct connections to residency practice and specialty-specific Milestones.
Assessment of HSS
When considering the overall findings of this study, it is possible that the Milestones are not sensitive indicators of educational interventions in medical school because the content and skills were not aligned with the ACGME subcompetencies.
Within GME, there are limitations to the current Milestones assessment system,15 including concerns about effectively measuring some competencies, whether because faculty poorly understand the domain or because residents may not be observed performing that competency. The ACGME has noted lower ratings21 and decreased variability on the 6-month intern subcompetencies: interns may not have sufficient experience, and faculty may not have adequate opportunity to observe them. For the HSS-related subcompetencies, the ACGME is encouraging harmonized (similar) Milestones across all specialties for some competency domains, which may create better alignment between UME and GME as well as between specialties.
The issues discussed above pose threats to the validity of HSS assessment in the transition from UME to GME. This study illustrates how difficult it is to implement and measure the effect of a carefully conceived HSS intervention across institutions. Unless implementation is more consistent and outcome measures are more uniformly assessed, it is hard to determine effectiveness. This work highlights the need for a more uniform HSS curriculum across medical schools and into residency, along with robust HSS assessment.
A limitation of this study is that the HSS curricula of non-ACE schools are not known. For the control group, not excluding students who took extra time to complete medical school could have affected their Milestone ratings. Some specialties showed limited variability of subcompetency ratings for the resident cohorts; for example, the vast majority of emergency medicine interns were rated level 2 on many patient care Milestones. Finally, it is possible that Milestone ratings across the subcompetencies chosen for analysis were not independent of each other; in that case, statistical correction for multiple comparisons would have been warranted, and some of the statistically significant findings observed here would have been eliminated.
While this study did not find meaningful differences in ACGME Milestones, future work will explore application of UME HSS experiences to residency through qualitative interviews of graduates from ACE schools. More investigation is needed to assess the relationship between HSS curricula taught to students in UME and their subsequent competencies as GME residents.
Graduates from AMA ACE schools with training in HSS received slightly higher, but not educationally meaningful, Milestone ratings for only 5 subcompetencies across 6 specialties at 1 year compared to graduates from non-ACE schools.
The authors would like to thank Karen Hauer, Tonya Fancher, Michelle Daniel, Stephanie Starr, Judee Richardson, Michael Dekhtyar, and Nick Yaghmour for conceptualization of the study, and Heather Brickley for manuscript preparation.
Editor's Note: The online version of this article contains each school's health systems science curricular changes, a map of curricula to each Milestone for each of the schools, and a table of intraclass correlation and design effect for focal subcompetencies.
Funding: This study was funded by an American Medical Association Accelerating Change in Medical Education grant.
Conflict of interest: The authors declare they have no competing interests.