Objective

First, to develop an online, interactive educational tool that delivers an evidence-based clinical practice guideline to faculty members at a Canadian chiropractic college; second, to evaluate the learning, design, and engagement constructs of the tool in a sample of chiropractic faculty members.

Methods

Using an integrated knowledge translation methodology and the Knowledge to Action Framework, we developed an evidence-based online learning tool. The content of the tool focused on a clinical practice guideline for the management of neck pain. We evaluated the learning, design, and engagement constructs in a sample of faculty members and residents using the Learning Object Evaluation Scale for Students. Participants were also asked to provide suggestions for improving the tool.

Results

Sixteen participants completed the evaluation. Most (68.8%) participants were chiropractors, 75% were male, and 56% were between the ages of 25 and 44 years. At least 75% of participants agreed that the learning, design, and engagement constructs of the learning tool were adequate. The open-ended suggestions revealed 3 pedagogical themes (multimedia, thinking skills, and learner control) within the tool that could benefit from further development. These themes informed recommendations to improve the tool.

Conclusion

Our online, interactive, module-based learning tool has sound pedagogical properties. Further research is needed to determine if its use is associated with a change in knowledge.

INTRODUCTION

Musculoskeletal conditions, such as back and neck pain, are primary reasons for accessing chiropractic care.1  Back and neck pain result in millions of days of sick leave and contribute to significant direct and indirect health care expenditures.2,3  One approach used to improve care delivery and reduce expenditures is the implementation of clinical practice guidelines. Evidence-based practice can improve quality of care, reduce health care expenditure, and increase patient safety.4  However, the use of clinical practice guidelines is suboptimal in many health care professions, including chiropractic.2,3,5–9 

One method to improve the uptake of research into practice by the next generation of chiropractors is to develop high-quality education programs that integrate new evidence into teaching curricula. Similarly, evidence-based clinical training is a known facilitator of postgraduate acceptance of new evidence.5,10,11  Therefore, ensuring that chiropractic faculty members are informed of new research and clinical practice guidelines helps ensure that students receive an up-to-date, evidence-based education.

Educating faculty can be challenging because of both intrinsic and extrinsic factors that can affect teaching and learning. Intrinsic factors include elements that are “internal” to the learner, such as motivation.12–15  On the other hand, extrinsic factors involve elements that are “external” to the learner, such as time and cost.12–15  Therefore, using optimal methods to educate faculty members is essential to facilitate the understanding and application of new knowledge. Traditional methods of learning are time consuming, can be expensive, and require learners to attend sessions in person. Comparatively, online learning allows learners to access information at any time, from any location, and often at a lower cost.16–18  Online learning can be presented through various tools and digital platforms, including computers, smartphones, and tablets, and can be used synchronously and asynchronously.19,20  A well-developed online learning tool provides several advantages to learners, including (1) improved accessibility (anywhere, anytime), (2) instructions tailored to meet the needs of specific audiences, and (3) customizable delivery methods that include text, animated graphics, audio, and games.19–22  Finally, online learning tools allow for more complex features, such as immersive simulated environments.20,22  Nevertheless, online learning tools can be challenging to develop and implement. They require significant development resources, technical expertise, and pedagogical knowledge.

We previously conducted a systematic review of the literature to determine the effectiveness of online learning tools designed to improve health care providers' knowledge of clinical practice guidelines.23  We found preliminary evidence suggesting that spaced education combined with a game or case studies, as well as module-based education, is associated with improved knowledge of clinical practice guidelines. Additionally, previous systematic reviews identified that online learning tools are at least as effective as traditional, in-person modes of learning for improving knowledge.4,24–26  Furthermore, studies of various online learning methods (such as websites, Web-based workshops, and electronic educational games) that aim to increase knowledge, change clinical behaviors, and improve patient care have been conducted.4,24–26  Overall, these studies suggest that learning tools incorporating an evidence-based design may be a promising method to train clinicians about clinical practice guidelines.4,21,22  However, these methods deserve further investigation.

We aimed to create an online learning environment that facilitated the acquisition of new evidence-based knowledge by educators at the Canadian Memorial Chiropractic College (CMCC). Our study had 2 specific objectives: first, to develop an online, interactive module-based learning tool to educate faculty members at the CMCC about the evidence-based management of neck pain using a clinical practice guideline and, second, to evaluate the pedagogical constructs (learning, design, and engagement) of the learning tool and develop recommendations for its further development.

METHODS

Tool Development

We formed a Knowledge User Advisory Committee27  to advise on the development and structure of the learning tool, to identify barriers and enablers to implementing a new educational intervention (eg, perceptions of new learning tools within this population and suggestions on how to overcome these barriers), and to provide poststudy advice on the dissemination of the study results. The committee included 9 members: 6 chiropractors (1 clinician, 1 teaching faculty member, 2 department administrators, 1 PhD student, and 1 CMCC chiropractic resident), 1 medical doctor, 1 department administrator (dean of undergraduate education), and 1 information technology specialist.

We developed the tool using an integrated knowledge translation (KT) approach. KT refers to the exchange, synthesis, and ethically sound application of knowledge to improve health and provide effective services in health care.28  Additionally, we adopted the Knowledge to Action (KTA) Framework29  to guide the phases of the project. The KTA Framework is a conceptual framework, developed by Graham et al,29  intended to guide the process of KT through a cycle of steps. For the purposes of this study, we focused on the first 4 steps of the KTA cycle: (1) identify the problem and select knowledge, (2) adapt knowledge to local context, (3) assess barriers to knowledge use, and (4) select, tailor, and implement interventions (Fig. 1).30  The framework includes 3 additional steps: monitor knowledge use, evaluate outcomes, and sustain knowledge use. These steps were beyond the scope of the present study; however, we recommend that they be addressed in future research.

Figure 1-

Steps used to develop and evaluate the online educational tool

The first step of the framework includes identifying the problem and selecting knowledge. CMCC expressed interest in implementing clinical practice guidelines into their curriculum but first wanted to learn which methods might be most appropriate to disseminate a guideline to their faculty members. Next, we selected a guideline on the management of neck pain, as it is a common complaint that chiropractors treat.31  We selected the neck pain guideline by the OPTIMa Collaboration32  as the topic for our learning tool because of its high methodological quality.33,34  Above all, our objective was to create a generic learning tool whereby the guideline topic could be substituted while the foundational pedagogical properties of the intervention remained.

The second step includes adapting knowledge to the local context. As part of this larger study, we conducted a systematic review of the literature to determine the effectiveness of technology-based learning tools to improve knowledge of clinical practice guidelines for health care professionals.23  Our review included studies investigating the effectiveness of any technology-based learning tools developed to improve knowledge of health care providers (eg, physicians, residents, and nurses) about clinical practice guidelines. The systematic review provided us with evidence for the best format to use for our learning tool. We synthesized the evidence from methodologically valid studies using best-evidence methodology.35,36  The evidence suggests that spaced education may be associated with improvement in knowledge; however, its effectiveness relative to other interventions has not been established.37,38  Similarly, module-based online educational interventions may be associated with improvement in knowledge of clinical practice guidelines; however, these interventions may not be more effective than paper-based self-learning or in-person workshops.39–42 

We then applied pedagogical theories and principles to inform the design of the learning tool. We focused on theories and principles relevant to educational motivation (intrinsic and extrinsic factors influencing participation in learning).43,44  We focused on this concept because it was a recurring issue discussed by the Advisory Committee and an important issue identified from the literature for a variety of health care professions.19,22,45,46 

In the third step, we consulted the Advisory Committee (in-person meeting) to explore potential barriers to successful implementation of continuing education programs and learn about their views with online learning. For example, “What is important to you when accessing online material/media? Are there certain visual characteristics you tend to be drawn to?” The comments from this meeting were audio recorded. Following the meeting, we categorized the comments by pedagogical themes and concepts. These categories informed the development phase of the learning tool.

Finally, the fourth step of the framework includes selecting, tailoring, and implementing the intervention. Using the information collected in the first 3 steps, we developed an online, interactive learning tool that translates information from the clinical practice guideline32  into a series of interactive, asynchronous learning modules. The tool included 6 short modules (3 Web pages/module), each focusing on a component of the clinical practice guideline (Fig. 2). The first page of each module outlined the learning outcomes. The second page provided the learners with instructions for completing the module, the learning content, links to external resources, and the main “take-home” messages. The last page provided a short self-evaluation of knowledge (multiple-choice quizzes and feedback). The last section of the tool included 3 clinical case studies of patients with neck pain. A member of the Advisory Committee developed the case studies from their own clinical experiences. The purpose of the case studies was to reinforce the use of the guideline through realistic patient examples.

Figure 2-

The learning tool. Top left are the learning outcomes. Bottom left is the learning content. Top right is the self-evaluation. Bottom right is the clinical case study evaluation.

To tailor the learning tool to our end users (faculty), the Advisory Committee provided feedback on the design of the learning tool through a series of 5 online surveys between June and November 2017. Using an online survey software (SurveyMonkey, San Mateo, CA), the research team sent the Advisory Committee links to portions of the learning tool as well as a link to a survey for guidance and feedback. Survey questions were developed in line with the pedagogical theories and principles that informed the development of the learning tool. For example, “Did you find this section of the learning tool user-friendly? If yes, what made you feel this way? If no, please explain why not.” This question can be linked to the pedagogical concept of usability (usability → navigation and scrolling → learner control).

Evaluation of the Tool

We delivered our learning tool to faculty members at CMCC and conducted an evaluation to gain feedback on its pedagogical quality. The evaluations collected in this phase provided guidance for amending the tool where necessary.

Study Design and Sample

Eligible participants included all faculty members employed at CMCC between February 1 and May 31, 2018 (n = 127) regardless of teaching focus (eg, anatomy, clinical education, graduate studies). Members of the Knowledge User Advisory Committee were not eligible to participate. All participants completed an online informed consent form prior to participation. This study was approved by the University of Ontario Institute of Technology and the Canadian Memorial Chiropractic College research ethics boards.

We recruited faculty members using 3 strategies: (1) e-mails from department administrators, (2) an online posting on CMCC's learning management system (LMS), and (3) face-to-face interactions. Recruitment e-mails were sent from the department administrators to faculty members and teaching assistants/residents. The e-mails included an information letter describing the study and information regarding participation. An invitation to participate was posted on CMCC's LMS. The platform includes a page from the Office of Research Administration to track, report, and deliver information about upcoming research and resources to faculty members and staff. The invitation to participate included an information letter as well as a direct link to the study, contact information, and ethics details. Finally, we actively recruited participants through face-to-face interactions during formal and informal meetings.

During active recruitment (face-to-face interaction), those who had begun participating informally reported that completing the 6 modules was taking too much of their time. Therefore, we reduced the length of the tool from 6 modules to 1 on April 27, 2018. This modification aimed to minimize the burden on participants and reduced completion time from approximately 30 minutes to approximately 10 minutes. We also reduced the number of case studies from 3 to 1.

Data Collection

Participant Characteristics

We collected demographic and professional characteristics using an online baseline questionnaire that included: (1) age and gender, (2) years of experience in the chiropractic profession, (3) years of experience in a chiropractic teaching role, (4) number of hours worked per week at CMCC, (5) department or division, (6) familiarity with the neck pain guideline, (7) experience with technology-based learning tools, and (8) self-rated proficiency with computers.

Outcome Measure

Following completion of the tool, participants were asked to evaluate the tool by completing the Learning Object Evaluation Scale for Students (LOES-S). The LOES-S is a 13-item self-report questionnaire that evaluates the impact a learning tool has on the participant's learning experience. This impact is categorized into 3 constructs: learning, design, and engagement.47  The learning construct consists of 5 items (eg, working with the tool helped me learn, the feedback from the tool helped me learn). The design construct consists of 4 items (eg, the help features in the tool were useful, the instructions in the tool were easy to follow). Finally, the engagement construct consists of 4 items (eg, I like the overall theme of the tool, I found the tool engaging).

The psychometric properties of the LOES-S were evaluated in 2 separate studies:47,48  the first in middle and secondary school students (10–22 years old) for any subject appropriate for their respective curriculums48  and the second in middle and secondary school students (11–17 years old) for math and science.47  Both studies evaluated a variety of online learning tools involving experimentation, virtual manipulatives, task-based applications, and formal representation of concepts.47,48  Adequate internal reliability was demonstrated based on Cronbach's α: .93 (learning), .87 (design), and .92 (engagement). Each construct measured was distinct (correlation between learning and design: r = .71, p < .001; correlation between learning and engagement: r = .76, p < .001; correlation between engagement and design: r = .65, p < .001). Convergent validity correlations ranged from 0.36 to 0.65, demonstrating a moderate degree of consistency between student and teacher evaluations. The correlation between the evaluation scale and learning performance was assessed in 4 categories—learning: remembering r = .01, understanding r = .11, application r = .16 (p < .005), analysis r = .37 (p < .001); design: remembering r = −.08, understanding r = .04, application r = .12 (p < .05), analysis r = .30 (p < .005); engagement: remembering r = −.04, understanding r = .01, application r = .016 (p < .005), analysis r = .31 (p < .005).
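The internal-reliability figures above are Cronbach's α values. As a minimal sketch, the statistic can be computed from a participants-by-items score matrix as follows (the data here are hypothetical and for illustration only):

```python
# Illustrative sketch (hypothetical data): Cronbach's alpha, the
# internal-consistency statistic reported for each LOES-S construct.
from statistics import pvariance

def cronbach_alpha(responses: list[list[int]]) -> float:
    """responses: one inner list per participant, one column per item."""
    k = len(responses[0])                 # number of items in the construct
    items = list(zip(*responses))         # column-wise item scores
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in responses])
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return (k / (k - 1)) * (1 - item_var / total_var)

# Five hypothetical participants rating a 4-item construct (1-5 scale)
data = [
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 4, 5],
]
print(round(cronbach_alpha(data), 2))  # 0.93
```

Values of α above roughly .80, as reported for all 3 LOES-S constructs, are conventionally taken to indicate adequate internal consistency.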

We selected this outcome measure because of its focus on the student-centered constructs of learning with the tool rather than an evaluation of knowledge. This focus coincided with the primary stages of development of the KTA cycle.30  Alternate evaluation tools focus on the development and design of the learning tool and miss the impact that the learning tool has on the learner.47  This perspective is particularly important, as it has a direct relationship to knowledge gained through the use of the tool.47 

We modified the language of the LOES-S, changing the original term “learning object” to “learning tool.” This change was made to limit confusion among users, because the term “learning object” is not widely used in this population. Additionally, we included 1 open-ended statement asking participants to provide suggestions to improve the tool: “Please provide any suggestions you may have to improve this technology-based learning tool.”

Analysis

LOES-S

We report the distribution of responses to the 13 items of the LOES-S to describe participants' evaluation of the learning, design, and engagement constructs of the tool. Participants who reported a score of 4/5 (agree) or 5/5 (strongly agree) were classified as agreeing with the item. Those who reported a score of 2/5 (disagree) or 1/5 (strongly disagree) were classified as disagreeing. A score of 3/5 was classified as a neutral evaluation. We calculated the median evaluation scores and interquartile ranges for each item. Median scores less than 4 indicated a need for improvement. The analysis was conducted using SPSS Statistics Version 24 (IBM Corp, Armonk, NY).
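The classification and summary procedure above can be sketched as follows; the analysis was performed in SPSS, so this is only an illustrative re-expression using hypothetical responses:

```python
# Illustrative sketch (hypothetical data): classifying LOES-S Likert
# responses and computing median/IQR per item, as described above.
from statistics import median, quantiles

def classify(score: int) -> str:
    """Map a 1-5 Likert score to the categories used in the analysis."""
    if score >= 4:
        return "agree"      # 4 = agree, 5 = strongly agree
    if score <= 2:
        return "disagree"   # 1 = strongly disagree, 2 = disagree
    return "neutral"        # 3 = neutral

def summarize(item_scores: list[int]) -> dict:
    """Median, IQR, and a flag for items needing improvement (median < 4)."""
    q1, _, q3 = quantiles(item_scores, n=4, method="inclusive")
    med = median(item_scores)
    return {
        "median": med,
        "iqr": q3 - q1,
        "needs_improvement": med < 4,
        "agree": sum(classify(s) == "agree" for s in item_scores),
    }

# Hypothetical responses for one item from 16 participants
scores = [4, 5, 4, 4, 3, 4, 5, 4, 4, 3, 4, 5, 4, 4, 3, 2]
print(summarize(scores))
```

With these hypothetical scores, the item's median is 4.0 and 12 of 16 participants agree, so the item would not be flagged for improvement.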

Open-Ended Suggestions

We performed a content analysis of the suggestions provided by participants. Comments were separated into individual statements if more than 1 distinct suggestion was provided. For example, “There could have been more graphics, and the quiz questions were a bit easy” was separated into 2 distinct comments: (1) “There could have been more graphics” and (2) “The quiz questions were a bit easy.” Three investigators independently completed a content analysis worksheet in which they matched each individual comment to the most relevant pedagogical theme. Investigators were provided with a list of pedagogical themes, including their definitions, to select from: coherence, contiguity, learner control, multimedia, personalization, practice, segmenting, and thinking skills. These themes were selected because they were used as a guide during the development of the learning tool. Investigators then reached consensus through discussion. Results were stratified by the recurring pedagogical themes used as references throughout the development phase of the learning tool.

RESULTS

Twenty-eight faculty members participated (28/127, 22%). Twelve participants who started the study completed the baseline questionnaire only; therefore, our final sample included 16 participants who completed both the baseline questionnaire and the LOES-S evaluation.

Sample Characteristics

Most participants were male (12/16, 75%) and between the ages of 25 and 44 years (9/16, 56.3%) (Table 1). Most (11/16, 68.8%) reported that their highest level of education was a doctor of chiropractic (DC) degree, and 62.5% (10/16) reported no experience with the development of a clinical practice guideline.

Table 1-

Demographic Characteristics

Our sample included approximately 13% of the eligible population. A census of CMCC faculty during the 2017–2018 academic year indicated that most employees were male (57%), the average age was 46 years, and a majority (63%) reported their highest level of education to be a DC degree (faculty demographics provided by CMCC, August 2018). Although these census data were limited, the reported characteristics appear similar to those of our study participants.

Twelve participants completed the baseline questionnaire only (Table 1). The differences between participants who completed the study (baseline questionnaire and LOES-S evaluation) and those who completed the baseline questionnaire only may indicate attrition bias. For example, participants who differed in years of experience in a chiropractic teaching role (Table 1) might have evaluated the learning tool differently had they not been lost to follow-up.

Learning Object Evaluation Scale Constructs

Learning

All items had a median score of 4/5 (Table 2). One participant disagreed with item #4—“The tool helped teach me a new concept”—and 3 participants scored this item as neutral. Three participants scored item #3—“The graphics and animations from the tool helped me learn”—as neutral.

Table 2-

Learning Object Evaluation Scale for Students (LOES-S) Results

Design

All items within this construct had a median score of 4/5 (Table 2). One participant disagreed with item #7—“The instructions in the tool were easy to follow”—item #8—“The tool was easy to use”—and item #9—“The tool was well organized.” Four participants scored item #6—“The help features in the tool were useful”—as neutral, and 3 participants scored item #7 as neutral.

Engagement

All items within this construct had a median score of 4/5 (Table 2). One participant disagreed with item #10—“I like the overall theme of the tool”—and item #13—“I would like to use the tool again.” Three participants scored item #12—“The tool made learning fun”—as neutral, and 2 participants scored item #13 as neutral.

Feedback from Participants

We received 23 comments. Ten comments were removed from the content analysis for the following reasons: (1) 6 entries included no suggestions (eg, no comment or n/a), (2) 1 comment related to enjoying the tool but provided no suggestions for improvement, (3) 1 comment pertained to difficulty clicking a button within the tool, (4) 1 comment pertained to the baseline questionnaire, and (5) 1 comment was from a participant who participated more than once (only their first comment was included in the analysis to limit information bias). The remaining 13 comments were included in the content analysis.

Three pedagogical themes were identified during this analysis: learner control, multimedia, and thinking skills (Table 3). The most frequently occurring comments (n = 6) related to the pedagogical theme of learner control; participants wanted more control over the pace of the learning tool. Comments regarding multimedia (n = 4) concerned graphics, colors, and the inclusion of more videos. Finally, the thinking skills comments (n = 3) concerned the evaluation components of the learning tool (quizzes and case studies); participants suggested these components should be more exciting and challenging.

Table 3-

Suggestions From Participants (Open-Ended Question Following Learning Object Evaluation Scale for Students)

Suggested Revisions to Learning Tool

Based on the participants' open-ended suggestions and LOES-S quantitative evaluations, the following revisions are recommended for the further development of the learning tool for this population. To our knowledge, pedagogically focused recommendations for the further development of technology-based learning tools to improve health care professionals' knowledge have not been provided elsewhere.

The following recommendations were developed to directly reflect the pedagogical themes derived through the content analysis as previously described. Recommendations are supported by the literature describing pedagogical theories and principles common to online or Web-based learning.

Recommendation #1: Include a wider variety of media (eg, graphics and videos) to break up the text and keep learners engaged.

“There could have been more graphics and a more exciting case.”

Multimedia is a pedagogical design principle that emphasizes using a combination of text and graphics to provide the learner with a richer learning experience.49  All media should, however, be relevant to the learning material and serve a specific purpose; graphics or animations for the sole purpose of aesthetics are not recommended.49  This recommendation also refers to the modality principle, wherein presenting information in a narrated format rather than as on-screen text may benefit the learner. Narration, however, should not be used to present long and complex information.50 

Recommendation #2: Consult content experts to ensure that review material (eg, quizzes and case studies) is appropriate for learners and the learning environment.

“The quiz questions were a bit easy; the cases were good and helpful to apply the knowledge.”

The thinking skills pedagogical principle outlines the cognitive processes learners use to accomplish tasks in a learning environment: (1) generating new ideas and perspectives; (2) applying, analyzing, synthesizing, and evaluating information; and (3) awareness and analysis of one's thoughts.51  Learning tool components, such as review quizzes and case studies, allow learners to follow these cognitive processes and apply new knowledge more effectively. Consulting content experts ensures that the review components are appropriate for the learners and that the difficulty of the content is suitable, allowing for a more effective stepwise cognitive process when learning a new concept or skill.

Recommendation #3: Include a main menu to allow learners to control the sections of the learning tool they wish to review or skip to.

“Would have liked to see a home page or table of contents to go back to review specific sections.”

Learner control is a pedagogical principle that describes the degree of control a learner has over the learning experience.52  Types of learner control include (1) content sequencing: having control over the order of the course material; (2) pacing: having control of the time spent on each section of the lesson; and (3) access to learning support: having the ability to access additional resources to add to the learning experience.52  It is recommended that asynchronous online learning incorporate some degree of learner control. However, the degree of learner control is directly related to the degree of difficulty and complexity of the learning content, learners' previous knowledge of the subject matter, and learner metacognition.52  Therefore, if the degree of difficulty and complexity is high, learners' previous knowledge of the subject matter is low, and learner metacognition is low, there should be a lesser degree of learner control.

DISCUSSION

Our evaluation suggests that participants agreed that our tool had adequate learning, design, and engagement constructs. The open-ended feedback identified 3 components of the tool that could benefit from further development, corresponding to 3 pedagogical themes: learner control, multimedia, and thinking skills.

To our knowledge, we developed and evaluated the first online learning tool for chiropractic faculty. However, learning tool evaluations have been conducted in other health care professions. We identified 3 studies that evaluated online educational interventions in health care providers.53–55  All 3 measured user satisfaction using a Likert scale and open-ended questions.53–55  Two studies provided recommendations for subsequent use of the learning tools based on the satisfaction results.53,54  However, none of these studies referred to pedagogical theories or principles when interpreting their users' satisfaction. Nevertheless, the technology-based learning tools were evaluated favorably in the physician54,55  and nurse practitioner53  populations. Our study is novel because we evaluated user-centered constructs of the learning tool rather than satisfaction with the tool. This evaluation provided us with a descriptive evaluation of intrinsic and extrinsic factors associated with learning. Pedagogical theories and principles were incorporated into the design of the tool as well as the development of recommendations for its improvement. Finally, we did not incorporate the final steps of the framework to assess the impact of the tool on the end user, the student. Future research should continue through the final steps of the framework to determine the tool's effectiveness following implementation.

Strengths and Limitations

A strength of our study is the integrated KT approach used to develop the tool.27  The design of the learning tool was informed by 3 sources: (1) a Knowledge User Advisory Committee, (2) a systematic review of the literature, and (3) pedagogical theories and learning principles. This methodology ensured that the intervention was designed based on informed sources and thorough evaluation. This methodology is also designed to increase the uptake and impact of research findings by knowledge users beyond the scientific scope of the study.27  The evaluation informed the creation of recommendations for the further development of the learning tool for this population. Additionally, the inclusion of the open-ended question following the LOES-S complemented the quantitative evaluation and provided a more complete understanding of the evaluation. Finally, we implemented 3 recruitment strategies to maximize participation: multiple direct e-mails to eligible participants from department leads, an internal LMS advertisement, and face-to-face active recruitment. We compared participation rates weekly to determine which recruitment strategy was most effective. Our data suggest that in-person active recruitment was most effective for this population. This should be considered for future research of this nature.

This study had limitations. The LOES-S is a valid and reliable tool for use in middle and high school environments; however, its psychometric properties in this health care provider/educator population are unknown.47,48  Although this may be a limitation, few evaluation instruments designed for technology-based learning tools focus on the impact on learners rather than solely on an outcome of the educational intervention.47  Another limitation is the potential for selection bias. It is unclear whether the individuals who participated in the study were representative of the entire eligible CMCC faculty population, although basic demographic characteristics of the eligible population suggest that they may be similar. It is possible that those who participated were more willing and/or interested in adopting online learning. Barriers to participation in educational interventions by health care professionals have been reviewed in the literature.8,56,57  The most frequently reported barrier is a lack of time. Other common barriers include a lack of incentives to participate, financial constraints, personal constraints such as health status and motivation, lack of awareness of educational activities, and job status (part-time versus full-time).8,56,57

CONCLUSIONS

We developed an online, interactive learning tool designed to enhance the uptake of a clinical practice guideline on the management of neck pain among chiropractic faculty at CMCC. Our evaluation suggests that the learning, design, and engagement constructs of the tool are adequate. However, the tool requires refinement, and further development is recommended to increase educational engagement for this population. Future research should investigate chiropractic faculty's barriers to educational participation and the efficacy of the learning tool for increasing knowledge of clinical practice guidelines within this population.

FUNDING AND CONFLICTS OF INTEREST

This work was funded internally. The authors have no conflicts of interest to declare relevant to this work.

REFERENCES

1. Beliveau PJH, Wong JJ, Sutton DA, et al. The chiropractic profession: a scoping review of utilization rates, reasons for seeking care, patient profiles, and care provided. Chiropr Man Therap. 2017;25:35.
2. Amorin-Woods LG, Beck RW, Parkin-Smith GF, Lougheed J, Bremner AP. Adherence to clinical practice guidelines among three primary contact professions: a best evidence synthesis of the literature for the management of acute and subacute low back pain. J Can Chiropr Assoc. 2014;58(3):220–237.
3. Brockhusen SS, Bussieres A, French SD, Christensen HW, Jensen TS. Managing patients with acute and chronic non-specific neck pain: are Danish chiropractors compliant with guidelines? Chiropr Man Therap. 2017;25:17.
4. De Angelis G, Davies B, King J, et al. Information and communication technologies for the dissemination of clinical practice guidelines to health professionals: a systematic review. JMIR Med Educ. 2016;2(2):e16.
5. Innes SI, Leboeuf-Yde C, Walker BF. How comprehensively is evidence-based practice represented in councils on chiropractic education (CCE) educational standards: a systematic audit. Chiropr Man Therap. 2016;24(1):30.
6. Adams J, Lauche R, Peng W, et al. A workforce survey of Australian chiropractic: the profile and practice features of a nationally representative sample of 2,005 chiropractors. BMC Complement Altern Med. 2017;17(1):14.
7. Bussieres AE, Al Zoubi F, Stuber K, et al. Evidence-based practice, research utilization, and knowledge translation in chiropractic: a scoping review. BMC Complement Altern Med. 2016;16:216.
8. Schneider MJ, Evans R, Haas M, et al. US chiropractors' attitudes, skills and use of evidence-based practice: a cross-sectional national survey. Chiropr Man Therap. 2015;23:16.
9. Walker BF, French SD, Page MJ, et al. Management of people with acute low-back pain: a survey of Australian chiropractors. Chiropr Man Therap. 2011;19(1):29.
10. Ilic D, Diug B. The impact of clinical maturity on competency in evidence-based medicine: a mixed-methods study. Postgrad Med J. 2016;92(1091):506–509.
11. Vidyarthi AR, Kamei R, Chan K, Goh SH, Lek N. Factors associated with medical student clinical reasoning and evidence based medicine practice. Int J Med Educ. 2015;6:142–148.
12. Delaney ML, Royal MA. Breaking engagement apart: the role of intrinsic and extrinsic motivation in engagement strategies. Ind Organ Psychol. 2017;10(1):127–140.
13. Wenke R, O'Shea K, Hilder J, Thomas R, Mickan S. Factors that influence the sustainability of structured allied health journal clubs: a qualitative study. BMC Med Educ. 2019;19(1):6.
14. Alhassan RK, Nketiah-Amponsah E, Spieker N, Arhinful DK, Rinke de Wit TF. Assessing the impact of community engagement interventions on health worker motivation and experiences with clients in primary health facilities in Ghana: a randomized cluster trial. PLoS One. 2016;11(7):e0158541.
15. Okello DRO, Gilson L. Exploring the influence of trust relationships on motivation in the health sector: a systematic review. Hum Resour Health. 2015;13:16.
16. Lam-Antoniades M, Ratnapalan S, Tait G. Electronic continuing education in the health professions: an update on evidence from RCTs. J Contin Educ Health Prof. 2009;29(1):44–51.
17. Nelson EA. E-learning. A practical solution for training and tracking in patient-care settings. Nurs Adm Q. 2003;27(1):29–32.
18. Rohwer A, Young T, van Schalkwyk S. Effective or just practical? An evaluation of an online postgraduate module on evidence-based medicine (EBM). BMC Med Educ. 2013;13:77.
19. Juanes JA, Ruisoto P. Computer applications in health science education. J Med Syst. 2015;39(9):97.
20. Ruggeri K, Farrington C, Brayne C. A global model for effective use and evaluation of e-learning in health. Telemed J E Health. 2013;19(4):312–321.
21. Neuhaus SJ, Thomas D, Desai J, Vuletich C, von Dincklage J, Olver I. Wiki-based clinical practice guidelines for the management of adult onset sarcoma: a new paradigm in sarcoma evidence. Sarcoma. 2015;2015:614179.
22. Yavner SD, Pusic MV, Kalet AL, et al. Twelve tips for improving the effectiveness of web-based multimedia instruction for clinical learners. Med Teach. 2015;37(3):239–244.
23. Verville L. Using technology-based educational interventions to improve knowledge about clinical practice guidelines: a systematic review of the literature. J Chiropr Educ. 2020.
24. Sinclair PM, Kable A, Levett-Jones T, Booth D. The effectiveness of Internet-based e-learning on clinician behaviour and patient outcomes: a systematic review. Int J Nurs Stud. 2016;57:70–81.
25. Al Zoubi FM, Menon A, Mayo NE, Bussieres AE. The effectiveness of interventions designed to increase the uptake of clinical practice guidelines and best practices among musculoskeletal professionals: a systematic review. BMC Health Serv Res. 2018;18(1):435.
26. Tudor Car L, Kyaw BM, Dunleavy G, et al. Digital problem-based learning in health professions: systematic review and meta-analysis by the Digital Health Education Collaboration. J Med Internet Res. 2019;21(2):e12945.
27. Canadian Institutes of Health Research. Guide to Knowledge Translation Planning at CIHR: Integrated and End-of-Grant Approaches. Ottawa, ON: Canadian Institutes of Health Research; 2012.
28. Canadian Institutes of Health Research. Knowledge translation. 2013.
29. Graham ID, Logan J, Harrison MB, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24.
30. Canadian Institutes of Health Research. Knowledge Translation in Health Care: Moving from Evidence to Practice. Ottawa, ON: Canadian Institutes of Health Research; 2015.
31. Mior S, Wong J, Sutton D, et al. Understanding patient profiles and characteristics of current chiropractic practice: a cross-sectional Ontario Chiropractic Observation and Analysis Study (O-COAST). BMJ Open. 2019;9(8):e029851.
32. Côté P, Wong JJ, Sutton D, et al. Management of neck pain and associated disorders: a clinical practice guideline from the Ontario Protocol for Traffic Injury Management (OPTIMa) Collaboration. Eur Spine J. 2016;25(7):2000–2022.
33. Lin I, Wiles L, Waller R, et al. What does best practice care for musculoskeletal pain look like? Eleven consistent recommendations from high-quality clinical practice guidelines: systematic review. Br J Sports Med. 2020;54:79–86.
34. Parikh P, Santaguida P, Macdermid J, Gross A, Eshtiaghi A. Comparison of CPGs for the diagnosis, prognosis and management of non-specific neck pain: a systematic review. BMC Musculoskelet Disord. 2019;20(1):81.
35. Carroll LJ, Cassidy JD, Peloso PM, et al. Methods for the best evidence synthesis on neck pain and its associated disorders: the Bone and Joint Decade 2000–2010 Task Force on Neck Pain and Its Associated Disorders. J Manipulative Physiol Ther. 2009;32(suppl 2):S39–S45.
36. Côté P, Cassidy JD, Carroll L, Frank JW, Bombardier C. A systematic review of the prognosis of acute whiplash and a new conceptual framework to synthesize the literature. Spine. 2001;26(19):E445–E458.
37. Kerfoot BP, Kearney MC, Connelly D, Ritchey ML. Interactive spaced education to assess and improve knowledge of clinical practice guidelines: a randomized controlled trial. Ann Surg. 2009;249(5):744–749.
38. Kerfoot BP, Baker H. An online spaced-education game for global continuing medical education: a randomized trial. Ann Surg. 2012;256(1):33–38.
39. Bell DS, Fonarow GC, Hays RD, Mangione CM. Self-study from web-based and printed guideline materials. A randomized, controlled trial among resident physicians. Ann Intern Med. 2000;132(12):938–946.
40. Fordis M, King JE, Ballantyne CM, et al. Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA. 2005;294(9):1043–1051.
41. Labeau SO, Rello J, Dimopoulos G, et al. The value of E-learning for the prevention of healthcare-associated infections. Infect Control Hosp Epidemiol. 2016;37(9):1052–1059.
42. Rebbeck T, Macedo L, Paul P, Trevena L, Cameron ID. General practitioners' knowledge of whiplash guidelines improved with online education. Aust Health Rev. 2013;37(5):688–694.
43. Ryan RM, Deci EL. Intrinsic and extrinsic motivations: classic definitions and new directions. Contemp Educ Psychol. 2000;25(1):54–67.
44. Keller J, Suzuki K. Learner motivation and E-learning design: a multinationally validated process. Educ Media. 2004;29(3):229–239.
45. Rodriguez C, Victor C, Leonardi N, Sulo S, Littlejohn G. Barriers to participation in an online nursing journal club at a community teaching hospital. J Contin Educ Nurs. 2016;47(12):536–542.
46. Walker BF, Stomski NJ, Hebert JJ, French SD. Evidence-based practice in chiropractic practice: a survey of chiropractors' knowledge, skills, use of research literature and barriers to the use of research evidence. Complement Ther Med. 2014;22(2):286–295.
47. Kay R. Evaluating learning, design, and engagement in web-based learning tools (WBLTs): the WBLT Evaluation Scale. Comput Human Behav. 2011;27:1849–1856.
48. Kay R, Knaack L. Assessing learning, quality and engagement in learning objects: the Learning Object Evaluation Scale for Students (LOES-S). Educ Technol Res Dev. 2009;57:147–168.
49. Clark RC, Mayer RE. Applying the multimedia principle: use words and graphics rather than words alone. In: E-Learning and the Science of Instruction. 3rd ed. San Francisco, CA: Pfeiffer; 2011:67–90.
50. Clark RC, Mayer RE. Applying the modality principle: present words as audio narration rather than on-screen text. In: E-Learning and the Science of Instruction. 3rd ed. San Francisco, CA: Pfeiffer; 2011:113–130.
51. Clark RC, Mayer RE. E-learning to build thinking skills. In: E-Learning and the Science of Instruction. 3rd ed. San Francisco, CA: Pfeiffer; 2011:339–368.
52. Clark RC, Mayer RE. Who's in control? Guidelines for e-learning navigation. In: E-Learning and the Science of Instruction. 3rd ed. San Francisco, CA: Pfeiffer; 2011:309–338.
53. Eardley SA. Evaluation of a technology-enabled tool to improve colorectal cancer screening. Online J Nurs Inform. 2016;20(1).
54. Kang SY, Kim SH, Kwon YE, et al. The virtual asthma guideline e-learning program: learning effectiveness and user satisfaction. Korean J Intern Med. 2018;33(3):604–611.
55. Schroter S, Jenkins RD, Playle RA, et al. Evaluation of an online interactive Diabetes Needs Assessment Tool (DNAT) versus online self-directed learning: a randomised controlled trial. BMC Med Educ. 2011;11:35.
56. Bussieres AE, Patey AM, Francis JJ, et al. Identifying factors likely to influence compliance with diagnostic imaging guideline recommendations for spine disorders among chiropractors in North America: a focus group study using the Theoretical Domains Framework. Implement Sci. 2012;7:82.
57. Bussieres AE, Terhorst L, Leach M, Stuber K, Evans R, Schneider MJ. Self-reported attitudes, skills and use of evidence-based practice among Canadian doctors of chiropractic: a national survey. J Can Chiropr Assoc. 2015;59(4):332–348.

Author notes

Leslie Verville is research manager, Knowledge Translation at the Centre for Disability Prevention and Rehabilitation at Ontario Tech University (2000 Simcoe Street North, Oshawa, ON, L1H 7K4, Canada; leslie.verville@uoit.ca). Pierre Côté is a professor in the Faculty of Health Sciences at Ontario Tech University, director of the Centre for Disability Prevention and Rehabilitation at Ontario Tech University, and Canada Research Chair in Disability Prevention and Rehabilitation at Ontario Tech University [ORCiD https://orcid.org/0000-0002-6986-6676] (2000 Simcoe Street North, Oshawa, ON, L1H 7K4, Canada; pierre.cote@uoit.ca). Diane Grondin is an associate professor of research and innovation at Canadian Memorial Chiropractic College (6100 Leslie Street North, Toronto, ON, M2H 3J1, Canada; dgrondin@cmcc.ca). Silvano Mior is the director of research partnerships and health policy at the Canadian Memorial Chiropractic College and scientist in the Centre for Disability Prevention and Rehabilitation at Ontario Tech University (6100 Leslie Street North, Toronto, ON, M2H 3J1, Canada; smior@cmcc.ca). Robin Kay is a professor in the Faculty of Education at Ontario Tech University (2000 Simcoe Street North, Oshawa, ON, L1H 7K4, Canada; robin.kay@uoit.ca).

Concept development: LV, PC. Design: LV, PC, DG, SM, RK. Supervision: LV, PC, RK. Data collection/processing: LV. Analysis/interpretation: LV, PC, DG, SM. Literature search: LV, PC, DG, SM, RK. Writing: LV. Critical review: LV, PC, DG, SM, RK.