Physicians require the expertise to care for an increasingly aging population. A robust understanding of geriatric educational interventions is needed to improve geriatric training for physicians.
To map the breadth of geriatric educational interventions for residents (in non-geriatric specialties).
We used a scoping review methodology. We searched MEDLINE, Embase, EMCare, CENTRAL, ERIC, and Scopus from 2004 to September 2019 for search terms related to “educational approaches” AND “geriatric” AND “residents.” Two authors independently selected eligible studies, extracted data (categorized by educational approaches and Kirkpatrick level outcomes), and critically appraised studies using the Mixed Methods Appraisal Tool.
There were 63 included studies, with a total of 6976 participants. Twelve studies had comparators, including 5 randomized controlled trials. Fifty-three studies (84%) described multicomponent interventions, incorporating combinations of didactic or self-directed approaches with interactive, simulation, experiential, and/or group-based learning. Use of a curricular process was explicitly reported in 37 studies (59%). Most studies met at least 4 of 5 Mixed Methods Appraisal Tool criteria. Studies commonly measured outcomes at Kirkpatrick levels 1 and 2 (reaction and learning), with 15 studies measuring performance outcomes (Kirkpatrick levels 3 and 4b). All included studies had at least one positive result.
All educational interventions had positive outcomes, with curriculum-informed multicomponent interventions being the most common. This scoping review demonstrates that robust methodology with comparators, longer-term designs, and use of higher-level Kirkpatrick outcome measures is possible but not commonly used. Clear direction for future research is provided.
The increasing population of older persons, with disproportionately more complex health needs,1 accentuates the need for physicians to become competent in the care of the older person.2 While medical schools have increasingly targeted geriatric education (United States,3 Europe,4 and Australia5), research suggests that gaps still exist upon entry into residency training programs.6
These gaps prompted the 2010 publication of consensus minimum geriatric competencies aligned to the Accreditation Council for Graduate Medical Education (ACGME) Program Requirements for Internal Medicine.7 This led to more extensive development of geriatric education interventions and the incorporation of geriatric training into other specialty programs.6 These competencies have been more recently encapsulated by the Geriatrics 5Ms: mind (eg, dementia, delirium, depression); medication (eg, polypharmacy, aging adverse effects); mobility (eg, falls, function); multi-morbidity (eg, complexity, chronic, age-related illness); and matters most (eg, person-specific care, goals and transitions of care).8
Two reviews of geriatric educational interventions specific to residents were published in the past 5 years in internal medicine9 (14 studies) and emergency medicine10 (9 studies) settings. They found a range of mostly multicomponent interventions, consisting of teaching approaches such as lectures, readings, web modules, simulations, clinical experiences, feedback, and group discussion. Outcomes were generally positive, with more studies measuring learner reactions (eg, satisfaction, usefulness) and learning (eg, attitudes, knowledge) than performance (eg, physician behavior, patient outcomes). Given the relatively narrow inclusion criteria of the 2 reviews (internal medicine or emergency medicine only and exclusion of noncomparative studies or those with no baseline assessments, respectively), it is difficult to determine if their findings are transferable to other settings.
As the body of research in this field continues to grow, a comprehensive analysis of current educational approaches in geriatrics is needed. This scoping review therefore aims to describe the range of educational approaches and outcomes for geriatric education for residents in non-geriatric specialties to guide educational practice and future research directions.
This study used a scoping review approach where the authors worked iteratively as a team,11 using the descriptive analytical method (reinterpreting the retrieved literature under a constructivist framework) proposed by Arksey and O'Malley.12 We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses for Scoping Reviews (PRISMA-ScR).13
Inclusion and Exclusion Criteria
All study designs published in English, with residents as participants, were included. Residents were defined as those who had completed their medical degree but were still undergoing postgraduate training (not in geriatric training). Acknowledging international variation in terminology, search terms broadly included interns, registrars, foundation/junior/prevocational physicians, and basic and advanced trainees. Mixed learner studies were only included when a minimum of 50% of study participants were residents. In addition, we included studies where single or multicomponent educational interventions targeted geriatric topics with educational approaches that could be classified based on a combination of the Johns Hopkins Continuing Medical Education conceptual model14 and the Interactive Constructive Active Passive (ICAP) framework (illustrated in Table 1).15 Furthermore, studies had to report outcomes classifiable by Kirkpatrick levels (Table 2).16,17
Opinion letters and conference abstracts were excluded due to limited information. We also excluded interventions targeting topics that were not largely exclusive to geriatrics and those that were not primarily educational in nature.
The following databases were searched from 2004 to September 2019: MEDLINE, Embase, EMCare, CENTRAL (Cochrane Controlled Register of Trials), ERIC, and Scopus. Related terms (including MeSH) were used to express “educational approaches” and “geriatrics” and “residents” modified for each database (full search strategies provided as online supplementary data). Two grey literature databases were searched: OpenGrey (http://www.opengrey.eu) and APAIS-Health. We also hand searched reference lists.
Two reviewers (E.Y.O., L.N. or K.J.B.) independently screened titles and abstracts for inclusion based on the selection criteria. Full-text articles were then retrieved and reviewed by 2 reviewers independently (E.Y.O., L.N. or K.J.B.) to identify studies for inclusion. Any disagreements between the 2 reviewers were resolved by discussion with the third reviewer. If multiple publications were based on the same study data, they were collated and described as a single study.
Data Charting and Synthesis
One reviewer (E.Y.O.) extracted data from all included studies, and a second reviewer (L.N. or K.J.B.) reviewed the extracted data. An iteratively optimized11 data extraction form based on a Best Evidence Medical Education (BEME) coding sheet18 was used to collect uniform datasets.
Critical appraisal of each study was independently performed by 2 reviewers (E.Y.O., L.N. or K.J.B.) using the Mixed Methods Appraisal Tool.21 This tool can be used to appraise all study designs with criteria tailored to the type of study design (online supplementary data). Discrepancies were resolved through discussion with a third reviewer (L.N. or K.J.B.).
Study Characteristics and Critical Appraisal
A total of 9154 citations were identified. Following the selection process (illustrated in the Figure), we included 63 studies8,22–84 (detailed in online supplementary data) with a total of 6976 participants. Most studies were from the United States (54 studies, 86%). More than half of the studies (60%) reported receiving funding, with 23 studies supported by either the Hartford or Reynolds Foundations, both US-based. There were 16 studies (25%) with > 100 participants (range 6 to 876). The majority of residents were within 3 years postgraduation. Residents came from several disciplines, including internal medicine (30 studies, 48%), family medicine (15 studies, 24%), surgery (12 studies, 19%), and emergency medicine (9 studies, 14%).
Studies were critically appraised using the Mixed Methods Appraisal Tool (online supplementary data); 58 studies (92%) met at least 4 of the 5 criteria. The most common study design was quantitative (52 studies, 83%). There were 12 studies with controls or comparators,8,31,34,41,46,53,54,65,67,71,79,82 including 5 randomized controlled trials (RCTs).34,46,53,79,82 For outcome measurements, 22 studies (35%) had response rates of > 80%, and 20 studies (32%) had both immediate and longer-term measurements. Validated outcome tool use (see Table 2 and online supplementary data) was reported in 21 studies (33%).
Interventions with a single educational approach were uncommon (10 studies, 16%). These were a lecture,75 video,43 readings,36,73 e-learning,34,55,79 role-play,84 a hiking experience,78 and group case discussion.63 The rest (53 studies, 84%) had multicomponent interventions with different combinations of didactic (39 studies, 62%), self-directed (22 studies, 35%), interactive (16 studies, 25%), simulation (10 studies, 16%), experiential (31 studies, 49%), and group-based (40 studies, 63%) educational approaches (provided as online supplementary data; percentages of all 63 studies). The most common combination included passive (didactic or self-directed), practical constructive (simulation or experiential), and group collaborative learning (14 studies, 23%).23,24,28,31,32,37,38,51,57,59,62,65,67,69 One example combined a perioperative ward experience with didactic teaching and group debriefing.67 The next most common combinations were passive approaches combined with either group learning (9 studies, 15%)22,27,40,42,45,49,50,58,82 or with practical constructive approaches (8 studies, 13%).25,26,35,56,66,70,72,81 Use of a curricular process was explicitly reported in 37 studies (59%). Four studies included a system approach: 1 utilized point-of-care prompts,68 2 had faculty development,24,52 and 1 had both.28 Of all the educational approaches used, 78% were live (face-to-face), 11% were digital (e-learning, online, mobile app, computer-based recordings), and 11% were print media.
Interventions were delivered as a block (one-off or clustered), periodically (intermittent engagement spread out weekly or monthly), or at the learner's discretion (eg, readings,36,73 videos43). Block or one-off interventions (37 studies, 59%) included a 30-minute lecture,33 2-day workshops,42,50 a 30-hour role-play with feedback,84 and a 4-week inpatient experience.47,80 Periodic interventions (22 studies, 35%) ranged from 24 hours of web-based learning over 6 weeks,61 to outpatient exposure for 1 hour per month over a year,26,27 to a whole day per week for 12 weeks.41
Forty-four studies (70%) covered more than one geriatric topic (range 1–14), with the most common topics being cognition, medication management, falls, and functional assessment. The other 19 studies covered a range of single topics, with the most common being medication management (4 studies), cognition (3 studies), and transitions of care (3 studies).
A wide variety of tools were used to measure outcomes across the Kirkpatrick hierarchy, including surveys (Likert questionnaires, free-text evaluations), assessments (multiple-choice questions), essays, objective structured clinical examinations (OSCEs), video recordings, chart audits, interviews, and focus groups. All studies reported at least one positive outcome, and 46 studies (73%) reported positive outcomes across every measure used. Of the 47 studies (75%) that analyzed for statistical significance, 44 had at least one statistically significant improvement, and 35 had statistically significant improvements across all measures (table provided as online supplementary data).
Across all studies (table provided as online supplementary data), reaction outcomes (Kirkpatrick Level 1) were the most commonly measured (52 studies, 83%), with 26% of studies exclusively measuring these. Outcomes were generally favorable: 35 of 38 studies reported positive satisfaction (Level 1a) results, and 35 of 36 reported positive utility reactions (Level 1b), such as confidence and self-assessments. Self-assessment (Level 1b) overestimated actual OSCE results (Level 2bii)47 and video-recorded behavior (Level 3).72 For attitude (Level 2a) outcomes, only 5 of 12 studies showed statistically significant improvements, with no notable differences in study approaches or duration. For knowledge (Level 2bi), 27 of 31 found statistically significant improvement. For skills (Level 2bii), 6 of 7 reported positive findings. One study found no correlation between knowledge (measured through multiple-choice questions) and skills (measured with standardized patients).36
Performance outcomes (Kirkpatrick levels 3–4) were measured by 15 studies (24%). For behavior outcomes (Level 3), 4 of 6 studies had positive findings: 3 were curriculum-guided clinic experiences complemented by lectures,31,72 role-play,72 research projects, journal clubs,31 or electronic medical record (EMR) prompts,68 while librarian-driven geriatric case-based discussions resulted in a 14-fold increase in geriatric consultations.63 For patient outcomes (Level 4b), 8 of 10 studies had positive findings.28,39,54,71,75,79,80 They were mostly curriculum-guided,28,54,71,75,79,80 investigating either lectures,39,75 cue cards,39 e-learning,79 or experiential approaches28,46,54,71,80 complemented by lectures,80 readings,46 e-learning,46,80 role-plays71 or academic detailing,28 group discussion,54,71,75 or electronic prompting and faculty development.28 Of these, Mecca et al54 found that focused interprofessional group discussions prior to clinic resulted in medication reductions in 85% of clinic patients (Level 4b), while Caton et al28 found that academic detailing by geriatricians, lectures, and cue cards, reinforced by electronic prompting and faculty development, was effective in increasing the falls risk screening completion of clinic patients to 92% (Level 4b). The positive findings resulted in electronic prompting being “permanently embedded into EMR.”28 In contrast, a multicomponent clinic experience with lectures and e-learning by Chang et al30 reported improvements in knowledge (Level 2bi) immediately post-intervention but noted a statistically significant decline in documentation audit (Level 4b) 6 months post-intervention, postulated to be due to “training, system, and culture.” Notwithstanding the small number of studies (n = 2),28,68 all studies with electronic prompting (point-of-care system approach) were associated with positive changes in performance (Levels 3 and 4b) outcomes.
There were 5 RCTs identified in this review. Three RCTs compared academic detailing (one-on-one teaching on cognition, continence, malnutrition, and capacity),82 e-learning (cognition, mental health, falls, and continence),79 and a mobile device app (geriatric tools and scales),53 respectively, with reading material as their control.53,79,82 One cluster RCT investigated a clinic performance improvement audit project in falls or goals of care (with e-learning and reading in both arms).46 One RCT compared medication e-learning with usual practice.34 All found statistically significant improvement in Level 2bi (knowledge) immediately,53,79 at 3 months,34 and at 7 months.82 Three studies performed documentation audits (Level 4b).46,53,79 The geriatric tool app did not find statistically significant improvement (the authors reported poor uptake of the app),53 while the clinic performance improvement project found a statistically significant improvement.46 The e-learning study had a statistically significant improvement compared with the printed material control in 1 of the 7 modules; however, a corresponding decline in the knowledge post-test for this same module among controls suggested possible content issues.79
This scoping review of geriatric education for residents included 63 studies with a broad range of educational interventions (typically multicomponent) and geriatric topics. Irrespective of study intervention or topic, interventions were found to improve measures of satisfaction and utility reactions (Kirkpatrick levels 1a and 1b) and knowledge (Kirkpatrick Level 2b), but there were mixed findings for attitudes (Kirkpatrick Level 2a) and performance (Kirkpatrick levels 3 and 4). Five studies were RCTs, which demonstrated that more cognitively engaging interventions15 (e-learning, mobile device apps, academic detailing, and performance improvement audit projects) were more effective than interventions that used reading materials or other more traditional methods.
These positive results were comparable to previous medical education reviews,9,10,85 and there were a number of factors that likely contributed to these findings. Multicomponent interventions were common and typically combined “knowledge transfer” (didactic teaching, self-directed pre-reading) with subsequent opportunities for “practical application” (simulation or experiential approaches) and/or “collaboration” (group-based learning). These interventions therefore were likely to synergistically leverage the inherent theoretical advantages of each approach.85,86 In particular, experiential learning allows for application of learning into real-world settings with exposure to role-modeling of good patient care,87–90 and group-based learning allows a collaborative deepening of learning through reflection, feedback, and social engagement.87,88,91 In addition, more than 50% of studies reported utilizing a curricular process, which can optimize interventions by including needs analysis, setting objectives, designing congruent educational approaches, and evaluation following implementation (eg, Kern's 6-step method).92 Finally, around 30% of studies spaced learning over an extended period, potentially aiding retention of knowledge.85,93
Our included studies predominantly measured lower Kirkpatrick levels (reactions and learning). While some may question the usefulness of these, satisfaction reactions (Level 1a) do provide an indication of engagement and enjoyment of learning and can be a useful way for educators to demonstrate value to learners. Self-assessed behavior change was classified as a utility reaction (Level 1b)17 as it may not reflect prospective behavior change (Level 3). Fewer than half of the included studies found improvements in attitudes (Level 2a), which may reflect limitations of available measurement tools94 and the possible ineffectiveness of hospital-based, short-term, empathically limited educational interventions.10,95 Improvements in knowledge (Level 2bi) should be interpreted with caution in terms of reliability given the measurement methods (eg, small numbers of multiple-choice questions or same question sets pre- and post-intervention).9,10,85 Similar to prior medical education literature,17,85 we found no evidence that lower Kirkpatrick measures predicted higher-level outcomes (behavior and patient results), which are arguably more important yet challenging to evaluate and improve. Prior medical education literature has similarly found that "reinforcing or enabling constructs"20,96 utilizing system or organizational change theory97 approaches in the workplace improve these performance outcomes.19,88 Examples from our review included: (1) developing faculty or supervisors to support new clinical practice28,52; (2) presenting relevant and timely knowledge (even passive lectures alone75 or with cue cards39), prompts, and templates (eg, via EMR) at the point-of-care28,68; or (3) new clinical protocols to embed new practice into organizational culture.28
Results should be interpreted in the context of common methodological weaknesses across the included studies despite the relatively high Mixed Methods Appraisal Tool ratings (eg, lack of controls or blinding, small sample sizes, short-term durations, non-validated outcome tools, and no effect sizes). Interpretation should also consider publication bias, the Hawthorne effect, and the relationships between learner, teacher, and investigator. Participants were predominantly internal medicine residents, which may limit transferability of findings. Heterogeneity in study design and the lack of direct comparisons meant we could not determine if one particular approach was most effective. Conversely, the largely positive results and lack of negative studies limited the identification of ineffective approaches.
Recommendations for Best Practice
Our recommendations are based on approaches that have been most used and findings from higher quality trials. Implications for educators include: (1) using curricular processes in developing educational programs, incorporating needs analysis, aligned objectives, and program evaluation (including learner reactions) in the local context; (2) multicomponent programs to harness the synergies of the different approaches for maximizing learning, especially including experiential and/or group approaches; (3) intentionally targeting change in behavior and clinical practice (eg, incorporating system or organizational change approaches), considering relevance and timeliness in the workplace context; and (4) advocacy for policy and resourcing to broaden research and implementation of postgraduate geriatric education.
This scoping review highlighted significant gaps in the current literature that can be researched in the future: (1) using robust methodology with adequately powered controlled trials, validated outcome measures, statistical analysis for significant differences, and the inclusion of effect sizes; (2) comparing different educational approaches; (3) using common frameworks to understand, define, and classify educational approaches and outcomes; (4) exploring longer-term retention of learning; (5) investigating approaches and collecting data targeting behavior and patient outcomes, considering system or organizational change theory approaches and workplace or situated learning; and (6) collecting clarification ("Why does it work?") data to test theories underpinning learning, including confirming the findings of previous studies in different contexts.
All studies in this review reported positive outcomes for their interventions, with curriculum-informed multicomponent interventions being the most common. This scoping review has described a range of studies, including those which are robust, have comparators, longer-term designs, and/or positive higher-level Kirkpatrick outcomes.
Editor's Note: The online version of this article contains a full list of search strategies used in the study, a critical appraisal of studies using the Mixed Method Appraisal Tool, and a table of study characteristics and summary of outcomes.
Funding: The authors report no external funding source for this study.
Conflict of interest: The authors declare they have no competing interests.
This work was previously presented in poster format at the Ottawa Conference, Kuala Lumpur, Malaysia, February 29–March 4, 2020, and the virtual Australian and New Zealand Association for Health Professional Educators Conference, July 12–15, 2020.