Background

Theoretical frameworks provide a lens to examine questions and interpret results; however, they are underutilized in medical education.

Objective

To systematically evaluate the use of theoretical frameworks in ophthalmic medical education and present a theory of change model to guide educational initiatives.

Methods

Six electronic databases were searched for peer-reviewed, English-language studies published between 2016 and 2021 on ophthalmic educational initiatives employing a theoretical framework. Quality of studies was assessed using the Grading of Recommendations, Assessment, Development, and Evaluations (GRADE) approach; risk of bias was evaluated using the Medical Education Research Study Quality Instrument (MERSQI) and the Accreditation Council for Graduate Medical Education (ACGME) guidelines for evaluation of assessment methods. Abstracted components of the included studies were used to develop a theory of change model.

Results

The literature search yielded 1661 studies: 666 were duplicates, 834 studies were excluded after abstract review, and 132 after full-text review; 29 studies (19.2%) employing a theoretical framework were included. The theories used most frequently were the Dreyfus model of skill acquisition and Messick's contemporary validity framework. GRADE certainty ratings were predominantly “very low” or “low,” the average MERSQI score was 10.04, and all assessment development studies received the lowest ACGME recommendation. The theory of change model outlined how educators can select, apply, and evaluate theory-based interventions.

Conclusions

Few ophthalmic medical education studies employed a theoretical framework; their overall rigor was low as assessed by GRADE, MERSQI, and ACGME guidelines. A theory of change model can guide integration of theoretical frameworks into educational initiatives.

A theory is a set of logically related propositions that describe relationships among concepts and help explain phenomena.1  In medical education, theories serve as the basis of theoretical frameworks that provide a lens to explore questions, design initiatives, evaluate outcomes, measure impact, and disseminate findings.2  Studies grounded in theory guide best practices and may serve as “clarification” studies that evoke depth of understanding and propel the field forward.2,3  For example, the Shannon and Weaver Model of Communication has been used to analyze opportunities for error in clinician handoffs,4  and Ericsson's deliberate practice theory has been used to design a simulation course to teach advanced life support skills.5 

However, theoretical frameworks are underutilized in medical education research.3,6  Many educational initiatives, especially within subspecialty medical education, continue to be developed based on the traditional teacher-apprentice model.2,7  Lack of theory-based educational initiatives can preclude meaningful interpretation of study methods and results, as theories ground new scholarly work within current literature, allow application of findings to other settings, and provide a framework for adaptation of existing theories or development of new theories.3,6  Additionally, there is a dearth of studies on the prevalence of theoretical framework usage in subspecialty medical education.8 

This article has 2 purposes: to systematically review the role of theoretical frameworks in subspecialty medical education, using ophthalmology as an example, and to use the findings to construct a theory of change model9  for guiding the development of theory-based graduate medical education curricula. Our primary questions are: What is the prevalence of theoretical framework use in ophthalmic medical education, and how can educators best integrate theory-based educational initiatives? Our findings may benefit educators by highlighting the state of theoretical framework use in subspecialty medical education and by extending these findings into a theory of change model to encourage the more widespread use of theoretical frameworks.

Search Strategy

A research librarian (L.P.) was consulted to develop a comprehensive search strategy. Following updated Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines10 on the conduct and reporting of systematic reviews, we searched 6 online databases (PubMed, Embase, Web of Science, CINAHL, PsycINFO, and ERIC) for articles published between January 1, 2016, and January 16, 2021 (Figure 1). We selected a 5-year period prior to the writing of this review to capture current practices in medical education. Our searches included database-specific thesaurus terms, such as Medical Subject Headings (MeSH) and Emtree, as well as keywords relevant to ophthalmic education and theoretical frameworks (online supplementary data).

Figure 1

PRISMA Flow Diagram


Selection Criteria

Eligibility criteria included peer-reviewed, English-language studies discussing educational initiatives in an ophthalmology setting that employed a theoretical framework at the onset of the initiative. We used the definition of theoretical framework by Varpio et al: “a logically developed and connected set of concepts and premises—developed from one or more theories—that a researcher creates to scaffold a study.”1  Educational initiatives included development of curricula, learning interventions, training strategies, and evaluation methods (eg, rubrics). We also included studies that referenced initiatives informed by a theoretical framework and studies that assessed learners with clinical evaluation methods, such as rubrics, employing a theoretical framework. We excluded reviews, studies that were not explicitly informed a priori by a theoretical framework, and studies that focused on populations other than medical students, ophthalmology trainees, or ophthalmologists. We also excluded studies that employed best practice models without describing a theoretical framework.

Eligible studies were de-duplicated in EndNote (Clarivate Analytics, Philadelphia, PA) using the method by Bramer et al11 and imported into the systematic review software Covidence (Melbourne, Victoria, Australia) for screening, full-text review, and data extraction. Two reviewers (S.L.S., Z.Z.Y.) conducted abstract screening and full-text review independently and in duplicate, with disagreements arbitrated by the senior author (P.B.G.).

Data Extraction

A data extraction template developed in Covidence was used to extract relevant information, including year of publication, location, study design, characteristics of study participants, sample size, educational initiatives, theoretical frameworks, underlying theories, outcomes, and results. Data extraction was completed independently and in duplicate by 2 reviewers (S.L.S., Z.Z.Y.), with disagreements arbitrated by the senior author (P.B.G.).

Quality Assessment

The Grading of Recommendations, Assessment, Development, and Evaluations (GRADE) guidelines12 were used to evaluate the overall quality of the studies. The GRADE approach rates quality of evidence based on risk of bias, inconsistency, indirectness, imprecision, and publication bias; studies can be upgraded for large effects, plausible confounding, and dose-response gradients. The GRADEpro Guideline Development Tool (Evidence Prime, Ontario, Canada) was used to create a GRADE evidence profile for outcomes.

Comprehensive risk of bias (methodological quality) for experimental, quasi-experimental, and observational studies was measured using the Medical Education Research Study Quality Instrument (MERSQI).13  MERSQI scores medical education studies on 10 questions across 6 domains for a maximum of 18 points.
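As a rough illustration of the scoring arithmetic, the sketch below tallies a hypothetical study's MERSQI domain scores. The 6 domain labels follow the MERSQI domains, but the example values are invented, and the uniform per-domain cap of 3 points is a simplification of the official scoring sheet rather than a reproduction of it.

```python
# Illustrative MERSQI tally: 6 domains, each capped (here) at 3 points,
# for a maximum of 18. The example values below are invented.
MAX_PER_DOMAIN = 3.0

def mersqi_total(domain_scores: dict) -> float:
    """Sum domain scores after checking each falls in the allowed range."""
    for domain, score in domain_scores.items():
        if not 0 <= score <= MAX_PER_DOMAIN:
            raise ValueError(f"{domain}: {score} outside 0-{MAX_PER_DOMAIN}")
    return sum(domain_scores.values())

example = {
    "study design": 2.0,
    "sampling": 1.5,
    "type of data": 3.0,
    "validity of evaluation instrument": 1.0,
    "data analysis": 2.0,
    "outcomes": 1.5,
}
print(mersqi_total(example))  # → 11.0
```

A hypothetical study with these values would total 11.0 of 18 points; the range check mirrors how the instrument bounds each domain's contribution.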

Comprehensive risk of bias (methodological quality) for studies that developed clinical assessment methods (eg, rubrics) was determined using guidelines developed by the Accreditation Council for Graduate Medical Education (ACGME).14 Unlike the GRADE standards, which evaluate overall study quality based on outcomes, the ACGME guidelines are the only published method to date that evaluates the quality of clinical assessment methods.15 Studies are assigned a letter grade from A to C on 6 domains (reliability, validity, ease of use, resources required, ease of interpretation, and educational impact), an overall level of evidence, and an overall recommendation for uptake into a program's evaluation system. All components of quality assessment and risk of bias analysis were completed independently and in duplicate by 2 co-authors (S.L.S., Z.Z.Y.), with disagreements arbitrated by the senior author (P.B.G.). This study adhered to the tenets of the Declaration of Helsinki.

Developing a Theory of Change Model

Theory of change models are commonly used in large-scale projects16,17 to delineate the steps and interventions needed to achieve a set of long-term outcomes by backwards mapping the preconditions, assumptions, rationale, and interventions those outcomes require. We used our findings to construct a theory of change model9 depicting the steps and resources required for an educational system to develop theory-based initiatives.
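The backwards-mapping idea can be sketched as a small graph walk: starting from a stated long-term outcome, collect every precondition it transitively depends on. The outcome and precondition labels below are purely hypothetical illustrations, not taken from any included study.

```python
# Hypothetical theory-of-change graph: each outcome maps to the
# preconditions it directly requires. Labels are illustrative only.
preconditions = {
    "improved exam performance": ["theory-based intervention delivered"],
    "theory-based intervention delivered": ["educators trained in theory",
                                            "protected curriculum time"],
    "educators trained in theory": ["administrative support"],
    "protected curriculum time": ["administrative support"],
    "administrative support": [],
}

def backwards_map(outcome, chain=None):
    """Return every precondition transitively required by the outcome."""
    chain = set() if chain is None else chain
    for pre in preconditions.get(outcome, []):
        if pre not in chain:
            chain.add(pre)
            backwards_map(pre, chain)
    return chain

print(sorted(backwards_map("improved exam performance")))
```

Walking the graph backwards from "improved exam performance" surfaces all four upstream preconditions, which is exactly the planning exercise a theory of change formalizes.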

Study Selection

A total of 1661 results were identified: 700 from PubMed and 961 from the other electronic databases (Figure 1). After excluding 666 duplicates, 995 potential studies were identified; 834 were excluded following title and abstract screening. We reviewed 161 articles in full. We excluded 10 articles that were not studies or were not ophthalmology specific. Of the remaining 151 articles discussing educational initiatives in ophthalmology, 29 (19.2%) were explicitly informed by a theoretical framework and made up the final analytic sample.
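As a quick consistency check, the selection counts reported above can be verified with simple arithmetic; the figures below are those from this review's PRISMA flow.

```python
# Sanity-check of the PRISMA selection counts reported above.
identified = 700 + 961              # PubMed + other databases
duplicates = 666
screened = identified - duplicates  # title/abstract screening pool
excluded_at_screening = 834
full_text = screened - excluded_at_screening
not_eligible = 10                   # not studies or not ophthalmology-specific
eligible = full_text - not_eligible
included = 29

assert screened == 995
assert full_text == 161
assert eligible == 151
print(f"{included / eligible:.1%} of eligible studies used a framework")  # → 19.2%
```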

Study Quality

According to the GRADE approach for rating certainty of outcomes, 10 outcomes were rated as “very low” certainty, 7 were rated as “low” certainty, 1 as “moderate” certainty, and 1 as “high” certainty; this is consistent with reported ratings for non-randomized studies.12  The online supplementary data contain a GRADE evidence table for the 7 most important outcomes, rated by 3 authors (S.L.S., Z.Z.Y., P.B.G.) using the GRADE guidelines.

The average MERSQI score for all applicable studies was 10.04 out of 18 points; by comparison, recently published mean MERSQI scores ranged from 9.05 to 12 in other surgical subspecialties.18,19  The online supplementary data list MERSQI scores for each applicable study.

For studies that developed clinical assessment methods, overall ACGME guideline scores were mixed for reliability, relatively high for validity, high for ease of use, very high for resources, relatively high for ease of interpretation, and unclear for educational impact. In the absence of large-scale studies or randomized trials, the overall recommendation for all applicable studies was judged as “Class 3” (provisional usage as a component of a program's evaluation system), the lowest rating. These scores are consistent with other reviews of clinical skill assessment methods.15,20  The online supplementary data list ACGME guideline ratings for clinical assessment development studies.

Characteristics of Included Studies and Interventions

The most common study types were prospective cohort21-29  and cross-sectional.30-36  Studies were most commonly conducted in the United States,27,37-42  Denmark,23,28,36,43  and the United Kingdom.25,26,44,45  Only 7 studies22,26,31,32,35,44,46  had sample sizes over 50 participants (range 52-311). Ten studies22,29,34,36,38,44,45,47-49  included attending ophthalmologists, 9 studies21,27,30,33,37,39-42  included residents, 6 studies24,26,31,32,35,46  included medical students, and 4 studies23,25,28,43  included a mixed selection of participants. Seven educational intervention studies were conducted in a hospital/clinic,21,22,29,32,37,39,40  6 in an in-person classroom,24,30,31,35,41,46  9 in simulation centers,23,25,28,33,36,40,42,44,45  1 in a virtual classroom,26  and 1 at an academic conference.34 Table 1 contains characteristics of the included studies.

Table 1

Characteristics of Included Studies


Theories and Theoretical Frameworks

Studies used a variety of theoretical frameworks (Table 1). The most commonly used theoretical frameworks were the Dreyfus model of skill acquisition,22,27,29,33,38,44,45,47-49  Messick's contemporary validity framework,23,28,36,43  and Bloom's taxonomy of cognitive abilities.24,31,34,46 

Outcomes and Results

Table 2 describes outcomes and study results. Most studies measured components of surgical performance and skill, such as intraoperative complications,21,22,29,37,40  surgery performance,22,27-29,36  aesthetic grade,42  surgery completion,22  and surgical efficiency.39,42  Several studies investigated learning outcomes, such as examination performance,24,31,43  learning readiness,35  learning style,30  and learning barriers.33  Nine studies examined components of validity.23,28,38,43-45,47-49  Studies also assessed subjective participant evaluation of the initiative; 8 studies24,26,27,32,34,41,42,46  surveyed participants, and 1 study surveyed surgical teams.25 

Table 2

Outcome Measures and Study Results


Theory of Change Model

Our theory of change model (Figure 2) aimed to provide a framework to guide educators in developing, implementing, and evaluating theory-based educational interventions. We abstracted key components of ophthalmic educational initiatives based on theoretical frameworks. Additionally, we analyzed studies that described how theoretical framework usage informed study design to reveal preconditions for designing theory-based initiatives. Given the relative dearth of studies that transparently reported theoretical framework usage, we also referenced literature on theory of change models and the Center for Theory of Change's guidelines50  to further inform the development of our model.

Figure 2

Theory of Change Model


Assumptions that must hold true for developing theory-based interventions successfully include flexibility of the curriculum to accommodate change, educators' willingness to learn about and employ theoretical frameworks, participants' willingness to trial curricular interventions, and administrators' willingness and ability to support educators and participants. Resources include educators, participants, administrators, material resources, educational resources, and data collection systems.

In our hypothetical example of an ophthalmology residency curriculum initiative, an area for curriculum improvement or change must first be identified by analyzing performance trends, reviewing summative or formative evaluations, or conducting a needs assessment. For example, if 40% of first-year residents in an ophthalmology program scored poorly on their national training examination, educators may be asked to develop an educational intervention to rectify the low scores.

Prior to developing the intervention, educators may undergo training in educational theory to better select and apply theoretical frameworks. Administrators may set aside protected time for learning and make funding available to provide educators with learning resources such as webinars and reading lists. Educators are then better equipped to conduct a literature review and select appropriate theories to inform their intervention. Educators may, for example, select Vygotsky's collaborative learning theory,51  which suggests that peer-to-peer learning fosters deeper thinking. With administrative support, they may review available resources and plan to dedicate 30 minutes at the end of weekly didactics for resident-led examination practice. After each session, residents may be invited to fill out evaluations on their satisfaction with the initiative within 72 hours.

Intermediate outcomes include satisfaction with the initiative, improved standardized evaluation metrics (eg, proportion of residents who score well on the national examination), and an increased number of learning or graduation competencies fulfilled. Long-term outcomes for learners include an improved knowledge base and better performance as residents and practicing physicians.52 Long-term outcomes for educators include increased use of conceptual frameworks in educational initiatives, which may translate to increased scholarly output and funding.3 Ultimately, achieving these outcomes will support the goal of increasing theory-based educational interventions throughout an educational system.

Finally, the initiative development process is iterative, and performance data may be routinely reviewed to inform future modifications. For example, residents may prefer more timed examination simulations; educators may then reexamine the initiative using another theoretical framework.53 

The primary purpose of this systematic review was to evaluate theoretical frameworks in subspecialty medical education, using ophthalmology as an example. We found that less than 20% of ophthalmic medical education studies published between 2016 and 2021 were informed by a theoretical framework. When included studies used frameworks, they often named the theory without describing how it framed the research question, informed the methods, or elucidated the results.6  Several studies incorporated previously designed theory-based courses or evaluation methods into their medical education initiative but did not further describe the theoretical framework.

Few studies have investigated the prevalence of conceptual frameworks in medicine and surgery. Schwartz et al reviewed the use of conceptual frameworks in the study of duty hours regulations for residents and found that several made contradictory predictions.54  Davis et al reviewed the conceptual underpinnings of pediatrics quality-of-life instruments and found that only 7.9% (3 of 38) were based in theory.55 

Our findings are consistent with other studies investigating use of theoretical frameworks in medical education. A review by Bajpai et al on the use of learning theories in digital health professions education reported that 33.4% (81 of 242) of studies were informed by theory.56 Similarly, a review by Hauer et al on behavior change curricula for medical trainees demonstrated that 35.7% (39 of 109) used a theoretical framework.57 In addition, a review by van Gaalen et al on gamification in health professions education found that only 15.9% (7 of 44) of studies employed a theoretical framework,58 and a review by Leslie et al on faculty development programs in medical education found that only 18.2% (4 of 22) of studies employed a theoretical framework.59 Of note, some studies employed different definitions of a theory-based approach,56,58 and others did not define theory or theoretical framework.57,59 These discrepancies obscure accurate prevalence data and highlight the need to adhere to a standardized set of definitions.1,2,53,60-62

The secondary purpose of this systematic review was to use our findings to construct a theory of change model to guide educators in creating theory-based initiatives. Given the complexity and heterogeneity of medical education systems across institutions, theory of change models are excellent tools to map large-scale initiatives, especially those with multiple outcomes.50  Our theory of change model illustrated the comprehensive process of selecting, integrating, and evaluating theory-based interventions, including the resources required and the underlying assumptions.

There are several limitations of this study. Our definition of a theoretical framework may differ from that of other studies; researchers need to adopt standardized definitions of the terms theory, theoretical framework, and conceptual framework.1 We used the definitions by Varpio et al, as they provide the most current model of these terms, informed by literature review, for health professions education research.1 We also limited our review to studies published between 2016 and 2021 to focus on current work in ophthalmic medical education6; however, this may have masked trends over time. Due to the heterogeneity in study initiatives and outcomes, we were unable to evaluate the efficacy of theoretical framework use and its effects on learner performance. Additionally, given the relatively poor overall quality of included studies and the heterogeneity in reporting use of theoretical frameworks, we were unable to assess the impact of theoretical frameworks on ophthalmic medical education. Moreover, effective theoretical frameworks employed in certain study settings (eg, a classroom) may not translate to real-world practical settings (eg, an operating room).

We were also unable to fully determine the applicability of many domains of the ACGME guidelines due to ambiguity in their wording; however, reviewers remained consistent in their application of this tool. Other studies have reported similar challenges in evaluating clinical assessment methods, such as evaluation tools for surgical skills, using the ACGME guidelines.15,20 Further investigation is needed into the efficacy of assessment and evaluation tools for surgical subspecialties. Finally, it is possible that some authors used theoretical frameworks without reporting them or without being consciously aware of using them53; for a study to be included, authors must have reported usage of a theoretical framework or employed a named intervention or methodology based in theory.

Educators interested in designing curricular interventions or longitudinal programs based on theoretical frameworks may benefit from examining questions and results through several “lenses” of theoretical frameworks and using standardized evaluation and assessment systems. In addition, medical educators may consider testing interventions in more than one study setting or institution. Future studies can be improved by transparently reporting theoretical frameworks, including the rationale for selecting a particular framework and how it informed study design and setting.

In summary, theoretical frameworks are underutilized in ophthalmic medical education research, and many studies that employ them do not do so transparently; in the few studies that integrated a theoretical framework, overall study rigor was low as assessed by GRADE, MERSQI, and ACGME guidelines. A theory of change model may guide educators in selecting, applying, and evaluating theory-based initiatives.

References

1. Varpio L, Paradis E, Uijtdehaage S, Young M. The distinctions between theory, theoretical framework, and conceptual framework. Acad Med. 2020;95(7):989-994.
2. Zackoff MW, Real FJ, Abramson EL, Li ST, Klein MD, Gusic ME. Enhancing educational scholarship through conceptual frameworks: a challenge and roadmap for medical educators. Acad Pediatr. 2019;19(2):135-141.
3. Bordage G. Reasons reviewers reject and accept manuscripts: the strengths and weaknesses in medical education reports. Acad Med. 2001;76(9):889-896.
4. Mohorek M, Webb TP. Establishing a conceptual framework for handoffs using communication theory. J Surg Educ. 2015;72(3):402-409.
5. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21(3):251-256.
6. Cook DA, Beckman TJ, Bordage G. Quality of reporting of experimental studies in medical education: a systematic review. Med Educ. 2007;41(8):737-745.
7. Hodges BD, Kuper A. Theory and practice in the design and conduct of graduate medical education. Acad Med. 2012;87(1):25-33.
8. Ho C-M, Wang J-Y, Yeh C-C, et al. Efficient undergraduate learning of liver transplant: building a framework for teaching subspecialties to medical students. BMC Med Educ. 2018;18(1):161.
9. Weiss CH. Nothing as practical as good theory: exploring theory-based evaluation for comprehensive community initiatives for children and families. New App Eval Comm Initiativ. 1995;1:65-92.
10. Page MJ, McKenzie JE, Bossuyt PM, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71.
11. Bramer WM, Giustini D, de Jonge GB, Holland L, Bekhuis T. De-duplication of database search results for systematic reviews in EndNote. J Med Libr Assoc. 2016;104(3):240-243.
12. Guyatt G, Oxman AD, Akl EA, et al. GRADE guidelines: 1. Introduction: GRADE evidence profiles and summary of findings tables. J Clin Epidemiol. 2011;64(4):383-394.
13. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and quality of published medical education research. JAMA. 2007;298(9):1002-1009.
14. Swing SR, Clyman SG, Holmboe ES, Williams RG. Advancing resident assessment in graduate medical education. J Grad Med Educ. 2009;1(2):278-286.
15. O'Connor A, McGarr O, Cantillon P, McCurtin A, Clifford A. Clinical performance assessment tools in physiotherapy practice education: a systematic review. Physiotherapy. 2018;104(1):46-53.
16. Maini R, Mounier-Jack S, Borghi J. How to and how not to develop a theory of change to evaluate a complex intervention: reflections on an experience in the Democratic Republic of Congo. BMJ Glob Health. 2018;3(1):e000617.
17. Paina L, Wilkinson A, Tetui M, et al. Using Theories of Change to inform implementation of health systems research and innovation: experiences of Future Health Systems consortium partners in Bangladesh, India and Uganda. Health Res Policy Syst. 2017;15(suppl 2):109.
18. Anderson TN, Kearse LE, Shi R, et al. Surgical endoscopy education research: how are we doing? [published online ahead of print February 22, 2022]. Surg Endosc.
19. Smith RP, Learman LA. A plea for MERSQI: the Medical Education Research Study Quality Instrument. Obstet Gynecol. 2017;130(4):686-690.
20. Jelovsek JE, Kow N, Diwadkar GB. Tools for the direct observation and assessment of psychomotor skills in medical trainees: a systematic review. Med Educ. 2013;47(7):650-673.
21. Aslan F, Yuce B, Oztas Z, Ates H. Evaluation of the learning curve of non-penetrating glaucoma surgery. Int Ophthalmol. 2018;38(5):2005-2012.
22. Bharucha KM, Adwe VG, Hegade AM, Deshpande RD, Deshpande MD, Kalyani VKS. Evaluation of skills transfer in short-term phacoemulsification surgery training program by International Council of Ophthalmology Ophthalmology Surgical Competency Assessment Rubrics (ICO-OSCAR) and assessment of efficacy of ICO-OSCAR for objective evaluation of skills transfer. Indian J Ophthalmol. 2020;68(8):1573-1577.
23. Forslund Jacobsen M, Konge L, la Cour M, et al. Simulation of advanced cataract surgery—validation of a newly developed test. Acta Ophthalmol. 2020;98(7):687-692.
24. Lin Y, Zhu Y, Chen C, et al. Facing the challenges in ophthalmology clerkship teaching: is flipped classroom the answer? PLoS One. 2017;12(4):e0174829.
25. Saleh GM, Wawrzynski JR, Saha K, et al. Feasibility of human factors immersive simulation training in ophthalmology: the London pilot. JAMA Ophthalmol. 2016;134(8):905-911.
26. Tzoumas N, Boote T, Higgs J, Ellis H, Dhillon B, Cackett P. Comment on: transforming ophthalmic education into virtual learning during COVID-19 pandemic: a global perspective. Eye (Lond). 2021;35(9):2648-2650.
27. Vagge A, Gunton K, Schnall B. Impact of a strabismus surgery suture course for first- and second-year ophthalmology residents. J Pediatr Ophthalmol Strabismus. 2017;54(6):339-345.
28. Vergmann AS, Vestergaard AH, Grauslund J. Virtual vitreoretinal surgery: validation of a training programme. Acta Ophthalmol. 2017;95(1):60-65.
29. Yu AY, Wang QM, Li J, Huang F, Golnik K. A cataract surgery training program: 2-year outcome after launching. J Surg Educ. 2016;73(5):761-767.
30. Hassanzadeh S, Karimi Moonaghi H, Derakhshan A, Masoud Hosseini S, Taghipour A. Preferred learning styles among ophthalmology residents: an Iranian sample. J Ophthalmic Vis Res. 2019;14(4):483-490.
31. Atta IS, Alghamdi AH. The efficacy of self-directed learning versus problem-based learning for teaching and learning ophthalmology: a comparative study. Adv Med Educ Pract. 2018;9:623-630.
32. Nathoo NA, Nazarali S, Gardiner J, Maberley D. Evaluation of ophthalmology clerkships across teaching sites at the University of British Columbia. Can J Ophthalmol. 2019;54(2):150-154.
33. Ng DS-C, Sun Z, Young AL, et al. Impact of virtual reality simulation on learning barriers of phacoemulsification perceived by residents. Clin Ophthalmol. 2018;12:885-893.
34. Prior Filipe H, Paton M, Tipping J, Schneeweiss S, Mack HG. Microlearning to improve CPD learning objectives. Clin Teach. 2020;17(6):695-699.
35. Sahoo S. Finding self-directed learning readiness and fostering self-directed learning through weekly assessment of self-directed learning topics during undergraduate clinical training in ophthalmology. Int J Appl Basic Med Res. 2016;6(3):166-169.
36. Thomsen AS, Smith P, Subhi Y, et al. High correlation between performance on a virtual-reality simulator and real-life cataract surgery. Acta Ophthalmol. 2017;95(3):307-311.
37. Borboli-Gerogiannis S, Jeng-Miller KW, Koulisis N, et al. A comprehensive surgical curriculum reduced intra-operative complication rates of resident-performed cataract surgeries. J Surg Educ. 2019;76(1):150-157.
38. Golnik KC, Law JC, Ramasamy K, et al. The ophthalmology surgical competency assessment rubric for vitrectomy. Retina. 2017;37(9):1797-1804.
39. Kang JM, Padmanabhan SP, Schallhorn J, Parikh N, Ramanathan S. Improved utilization of operating room time for trainee cataract surgery in a public hospital setting. J Cataract Refract Surg. 2018;44(2):186-189.
40. McCannel CA. Continuous curvilinear capsulorhexis training and non-rhexis related vitreous loss: the specificity of virtual reality simulator surgical training (an American Ophthalmological Society thesis). Trans Am Ophthalmol Soc. 2017;115:T2.
41. Mishra A, Browning D, Haviland MJ, et al. Communication skills training in ophthalmology: results of a needs assessment and pilot training program. J Surg Educ. 2018;75(2):417-426.
42. Mishra K, Mathai M, Della Rocca RC, Reddy HS. Improving resident performance in oculoplastic surgery: a new curriculum using surgical wet laboratory videos. J Surg Educ. 2017;74(5):837-842.
43. Jørgensen M, Savran MM, Christakopoulos C, et al. Development and validation of a multiple-choice questionnaire-based theoretical test in direct ophthalmoscopy. Acta Ophthalmol. 2019;97(7):700-706.
44. Dean WH, Buchan J, Admassu F, et al. Ophthalmic simulated surgical competency assessment rubric (Sim-OSSCAR) for trabeculectomy. BMJ Open Ophthalmol. 2019;4(1):e000313.
45. Dean WH, Murray NL, Buchan JC, Golnik K, Kim MJ, Burton MJ. Ophthalmic Simulated Surgical Competency Assessment Rubric for manual small-incision cataract surgery. J Cataract Refract Surg. 2019;45(9):1252-1257.
46. Sahoo S, Mohammed CA. Fostering critical thinking and collaborative learning skills among medical students through a research protocol writing activity in the curriculum. Korean J Med Educ. 2018;30(2):109-118.
47. Palis AG, Barrio-Barrio J, Mayorga EP, et al. The International Council of Ophthalmology Ophthalmic clinical evaluation exercise. Indian J Ophthalmol. 2021;69(1):43-47.
48. Juniat V, Golnik KC, Bernardini FP, et al. The Ophthalmology Surgical Competency Assessment Rubric (OSCAR) for anterior approach ptosis surgery. Orbit. 2018;37(6):401-404.
49. Swaminathan M, Ramasubramanian S, Pilling R, Li J, Golnik K. ICO-OSCAR for pediatric cataract surgical skill assessment. J AAPOS. 2016;20(4):364-365.
50. Center for Theory of Change. ActKnowledge. Published 2021. https://www.theoryofchange.org/. Accessed August 15, 2022.
51. Vygotsky LS. Mind in Society: The Development of Higher Psychological Processes. Cambridge, MA: Harvard University Press; 1978.
52. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29(7):648-654.
53. Bordage G. Conceptual frameworks to illuminate and magnify. Med Educ. 2009;43(4):312-319.
54. Schwartz A, Pappas C, Bashook PG, et al. Conceptual frameworks in the study of duty hours changes in graduate medical education: a review. Acad Med. 2011;86(1):18-29.
55. Davis E, Waters E, Mackinnon A, et al. Paediatric quality of life instruments: a review of the impact of the conceptual framework on outcomes. Dev Med Child Neurol. 2006;48(4):311-318.
56. Bajpai S, Semwal M, Bajpai R, Car J, Ho AHY. Health professions' digital education: review of learning theories in randomized controlled trials by the digital health education collaboration. J Med Internet Res. 2019;21(3):e12912.
57. Hauer KE, Carney PA, Chang A, Satterfield J. Behavior change counseling curricula for medical trainees: a systematic review. Acad Med. 2012;87(7):956-968.
58. van Gaalen AEJ, Brouwer J, Schönrock-Adema J, Bouwkamp-Timmer T, Jaarsma ADC, Georgiadis JR. Gamification of health professions education: a systematic review. Adv Health Sci Educ Theory Pract. 2021;26(2):683-711.
59. Leslie K, Baker L, Egan-Lee E, Esdaile M, Reeves S. Advancing faculty development in medical education: a systematic review. Acad Med. 2013;88(7):1038-1045.
60. Bordage G. Conceptual frameworks... What lenses can they provide to medical education? Investigación en Educación Médica. 2012;1(4):167-169.
61. Alonso F, López G, Manrique D, Viñes JM. An instructional model for web-based e-learning education with a blended learning process approach. Br J Educ Technol. 2005;36(2):217-235.
62. Grant C, Osanloo A. Understanding, selecting, and integrating a theoretical framework in dissertation research: creating the blueprint for your "house." Admin Iss J. 2014;4(2):12-26.

Author notes

Editor's Note: The online version of this article contains further data from the study.

Funding: The authors report no external funding source for this study.

Competing Interests

Conflict of interest: The authors declare they have no competing interests.

Disclaimer: The views expressed here are those of the authors and do not necessarily reflect the positions or policies of the US Department of Veterans Affairs or the US Government.

Supplementary data