To develop an online, interactive educational tool to deliver an evidence-based clinical practice guideline to faculty members at a Canadian chiropractic college, and to evaluate the learning, design, and engagement constructs of the tool in a sample of chiropractic faculty members.
Using an integrated knowledge translation methodology and the Knowledge to Action Framework, we developed an evidence-based online learning tool. The content of the tool focused on a clinical practice guideline on the management of neck pain. We evaluated the learning, design, and engagement constructs in a sample of faculty members and residents using the Learning Object Evaluation Scale for Students. Participants were also asked to provide suggestions for improving the tool.
Sixteen participants completed the evaluation. Most participants were chiropractors (68.8%), 75% were male, and 56% were between the ages of 25 and 44 years. At least 75% of participants agreed that the learning, design, and engagement constructs of the learning tool were adequate. The open-ended suggestions revealed 3 pedagogical themes (multimedia, thinking skills, and learner control) within the tool that could benefit from further development. These themes informed recommendations to improve the tool.
Our online, interactive, module-based learning tool has sound pedagogical properties. Further research is needed to determine if its use is associated with a change in knowledge.
INTRODUCTION
Musculoskeletal conditions, such as back and neck pain, are primary reasons for accessing chiropractic care.1 Back and neck pain result in millions of days of sick leave and contribute to significant direct and indirect health care expenditures.2,3 One approach used to improve care delivery and reduce expenditures is the implementation of clinical practice guidelines. Evidence-based practice can improve quality of care, reduce health care expenditures, and increase patient safety.4 However, the use of clinical practice guidelines is suboptimal in many health care professions, including chiropractic.2,3,5–9
One method to improve the uptake of research into practice by the next generation of chiropractors is to develop high-quality education programs that integrate new evidence into the teaching curriculum. Similarly, evidence-based clinical training is a known facilitator of postgraduate acceptance of new evidence.5,10,11 Therefore, keeping chiropractic faculty members informed of new research and clinical practice guidelines helps ensure that students receive an up-to-date, evidence-based education.
Educating faculty can be challenging because of both intrinsic and extrinsic factors that can affect teaching and learning. Intrinsic factors include elements that are “internal” to the learner, such as motivation.12–15 On the other hand, extrinsic factors involve elements that are “external” to the learner, such as time and cost.12–15 Therefore, using optimal methods to educate faculty members is essential to facilitate the understanding and application of new knowledge. Traditional methods of learning are time consuming, can be expensive, and require learners to attend sessions in person. Comparatively, online learning allows learners to access information at any time, from any location, and often at a lower cost.16–18 Online learning can be presented through various tools and digital platforms, including computers, smartphones, or tablets, and can be used synchronously and asynchronously.19,20 A well-developed online learning tool provides several advantages to learners, including (1) improved accessibility (anywhere, anytime), (2) tailored instruction to meet the needs of specific audiences, and (3) customizable delivery methods that include text, animated graphics, audio, and games.19–22 Finally, online learning tools allow for more complex features, such as immersive simulated environments.20,22 Nevertheless, online learning tools can be challenging to develop and implement; they require significant development resources, technical expertise, and pedagogical knowledge.
We previously conducted a systematic review of the literature to determine the effectiveness of online learning tools designed to improve knowledge of clinical practice guidelines by health care providers.23 We found preliminary evidence suggesting that spaced education combined with a game or case studies, as well as module-based education, is associated with improved knowledge of clinical practice guidelines. Additionally, previous systematic reviews identified that online learning tools are at least as effective as traditional, in-person modes of learning for improving knowledge.4,24–26 Furthermore, studies of various online learning methods (such as websites, Web-based workshops, and electronic educational games) that aim to increase knowledge, change clinical behaviors, and improve patient care have been conducted.4,24–26 Overall, these studies suggest that learning tools that incorporate an evidence-based design may be a promising method to train clinicians about clinical practice guidelines.4,21,22 However, these methods deserve further investigation.
We aimed to create an online learning environment that facilitated the acquisition of new evidence-based knowledge by educators at the Canadian Memorial Chiropractic College (CMCC). Our study had 2 specific objectives: first, to develop an online, interactive module-based learning tool to educate faculty members at the CMCC about the evidence-based management of neck pain using a clinical practice guideline and, second, to evaluate the pedagogical constructs (learning, design, and engagement) of the learning tool and develop recommendations for its further development.
METHODS
Tool Development
We formed a Knowledge User Advisory Committee27 to advise on the development and structure of the learning tool, on the barriers and enablers to implementing a new educational intervention (eg, perceptions of new learning tools within this population and suggestions on how to overcome these barriers), and on the dissemination of the study results after the study concluded. The committee included 9 members: 6 chiropractors (1 clinician, 1 teaching faculty member, 2 department administrators, 1 PhD student, and 1 CMCC chiropractic resident), 1 medical doctor, 1 department administrator (dean of undergraduate education), and 1 information technology specialist.
We developed the tool using an integrated knowledge translation (KT) approach. KT refers to the exchange, synthesis, and ethically sound application of knowledge to improve health and provide effective services in health care.28 Additionally, we adopted the Knowledge to Action (KTA) Framework29 to guide the phases of the project. The KTA Framework, developed by Graham et al,29 is a conceptual framework intended to guide the process of KT through a cycle of steps. For the purposes of this study, we focused on the first 4 steps of the KTA cycle: (1) identify the problem and select knowledge, (2) adapt knowledge to local context, (3) assess barriers to knowledge use, and (4) select, tailor, and implement interventions (Fig. 1).30 The framework includes 3 additional steps: monitor knowledge use, evaluate outcomes, and sustain knowledge use. These steps were beyond the scope of the present study; however, we recommend that they be addressed in future research.
The first step of the framework includes identifying the problem and selecting knowledge. CMCC expressed interest in implementing clinical practice guidelines into its curriculum but first wanted to learn which methods might be most appropriate for disseminating the guideline to its faculty members. Next, we selected a guideline on the management of neck pain, as it is a common complaint that chiropractors treat.31 We selected the neck pain guideline by the OPTIMa Collaboration32 as the topic for our learning tool. We chose this guideline because of its high methodological quality.33,34 Above all, our objective was to create a generic learning tool whereby the guideline topic could be substituted but the foundational pedagogical properties of the intervention would remain.
The second step includes adapting knowledge to the local context. As part of this larger study, we conducted a systematic review of the literature to determine the effectiveness of technology-based learning tools to improve knowledge of clinical practice guidelines for health care professionals.23 Our review included studies investigating the effectiveness of any technology-based learning tools developed to improve knowledge of health care providers (eg, physicians, residents, and nurses) about clinical practice guidelines. The systematic review provided us with evidence for the best format to use for our learning tool. We synthesized the evidence from methodologically valid studies using best-evidence methodology.35,36 The evidence suggests that spaced education may be associated with improvement in knowledge; however, its effectiveness relative to other interventions has not been established.37,38 Similarly, module-based online educational interventions may be associated with improvement in knowledge of clinical practice guidelines; however, these interventions may not be more effective than paper-based self-learning or in-person workshops.39–42
We then applied pedagogical theories and principles to inform the design of the learning tool. We focused on theories and principles relevant to educational motivation (intrinsic and extrinsic factors influencing participation in learning).43,44 We focused on this concept because it was a recurring issue discussed by the Advisory Committee and an important issue identified from the literature for a variety of health care professions.19,22,45,46
In the third step, we consulted the Advisory Committee (in-person meeting) to explore potential barriers to successful implementation of continuing education programs and to learn about their views on online learning. For example, “What is important to you when accessing online material/media? Are there certain visual characteristics you tend to be drawn to?” The comments from this meeting were audio recorded. Following the meeting, we categorized the comments by pedagogical themes and concepts. These categories informed the development phase of the learning tool.
Finally, the fourth step of the framework includes selecting, tailoring, and implementing the intervention. Using the information collected in the first 3 steps, we developed an online, interactive learning tool aimed at translating information from the clinical practice guideline32 into a series of interactive, asynchronous learning modules. The tool included 6 short modules (3 Web pages/module), each focusing on a component of the clinical practice guideline (Fig. 2). The first page of each module outlined the learning outcomes. The second page provided the learners with instructions for completing the module, the learning content, links to external resources, and the main “take-home” messages. The last page provided a short self-evaluation of knowledge (multiple-choice quizzes and feedback). The last section of the tool included 3 clinical case studies of patients with neck pain. A member of the advisory committee developed the case studies from their own clinical experiences. The purpose of the case studies was to reinforce the application of the guideline using realistic patient examples.
The learning tool. Top left are the learning outcomes. Bottom left is the learning content. Top right is the self evaluation. Bottom right is the clinical case study evaluation.
To tailor the learning tool to our end users (faculty), the Advisory Committee provided feedback on the design of the learning tool through a series of 5 online surveys between June and November 2017. Using an online survey software (SurveyMonkey, San Mateo, CA), the research team sent the Advisory Committee links to portions of the learning tool as well as a link to a survey to provide guidance and feedback. Survey questions were aligned with the pedagogical theories and principles that informed the development of the learning tool. For example, “Did you find this section of the learning tool user-friendly? If yes, what made you feel this way? If no, please explain why not.” This question can be linked to the pedagogical concept of usability (usability → navigation and scrolling → learner control).
Evaluation of the Tool
We implemented our learning tool with faculty members at CMCC and conducted an evaluation to gain feedback on its pedagogical quality. The evaluations collected in this phase were intended to guide amendments to the tool where necessary.
Study Design and Sample
Eligible participants included all faculty members employed at CMCC between February 1 and May 31, 2018 (n = 127) regardless of teaching focus (eg, anatomy, clinical education, graduate studies). Members of the Knowledge User Advisory Committee were not eligible to participate. All participants completed an online informed consent form prior to participation. This study was approved by the University of Ontario Institute of Technology and the Canadian Memorial Chiropractic College research ethics boards.
We recruited faculty members using 3 strategies: (1) e-mails from department administrators, (2) an online posting on CMCC's learning management system (LMS), and (3) face-to-face interactions. Recruitment e-mails were sent from the department administrators to faculty members and teaching assistants/residents. The e-mails included an information letter describing the study and information regarding participation. An invitation to participate was posted on CMCC's LMS. The platform includes a page from the Office of Research Administration to track, report, and deliver information about upcoming research and resources to faculty members and staff. The invitation to participate included an information letter as well as a direct link to the study, contact information, and ethics details. Finally, we actively recruited participants through face-to-face interactions during formal and informal meetings.
During active recruitment (face-to-face interaction), those who had begun participating informally reported that completing the 6 modules was taking too much of their time. Therefore, we reduced the length of the tool from 6 modules to 1 on April 27, 2018. This modification aimed to minimize the burden on participants and reduced completion time from 30 to approximately 10 minutes. We also reduced the number of case studies from 3 to 1.
Data Collection
Participant Characteristics
We collected demographic and professional characteristics using an online baseline questionnaire that included: (1) age and gender, (2) years of experience in the chiropractic profession, (3) years of experience in a chiropractic teaching role, (4) number of hours worked per week at CMCC, (5) department or division, (6) familiarity with the neck pain guideline, (7) experience with technology-based learning tools, and (8) self-rated proficiency with computers.
Outcome Measure
Following completion of the tool, participants were asked to evaluate the tool by completing the Learning Object Evaluation Scale for Students (LOES-S). The LOES-S is a 13-item self-reported questionnaire that evaluates the impact of a learning tool on the learner's experience. This impact is categorized into 3 constructs: learning, design, and engagement.47 The learning construct consists of 5 items (eg, working with the tool helped me learn, the feedback from the tool helped me learn). The design construct consists of 4 items (eg, the help features in the tool were useful, the instructions in the tool were easy to follow). Finally, the engagement construct consists of 4 items (eg, I like the overall theme of the tool, I found the tool engaging).
The psychometric properties of the LOES-S were measured in 2 separate studies:47,48 the first in middle and secondary school students (10–22 years old) for any subject appropriate for their respective curriculums48 and the second in middle and secondary school students (11–17 years old) for math and science.47 Both studies evaluated a variety of online learning tools involving experimentation, virtual manipulatives, task-based applications, and formal representation of concepts.47,48 Adequate internal reliability was demonstrated based on Cronbach's α: .93 (learning), .87 (design), and .92 (engagement). Each construct measured was distinct (correlation between learning and design: r = .71, p < .001; correlation between learning and engagement: r = .76, p < .001; correlation between engagement and design: r = .65, p < .001). Convergent validity correlations ranged from 0.36 to 0.65, demonstrating a moderate degree of consistency between student and teacher evaluations. The correlation between the evaluation scale and learning performance was assessed in 4 categories—learning: remembering r = .01, understanding r = .11, application r = .16 (p < .005), analysis r = .37 (p < .001); design: remembering r = −.08, understanding r = .04, application r = .12 (p < .05), analysis r = .30 (p < .005); engagement: remembering r = −.04, understanding r = .01, application r = .016 (p < .005), analysis r = .31 (p < .005).
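For reference, Cronbach's α for a construct with $k$ items is the standard measure of internal consistency (this formula is added here for context and is not reproduced from the cited validation studies):

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_{i}^{2}}{\sigma_{T}^{2}}\right)$$

where $\sigma_{i}^{2}$ is the variance of item $i$ and $\sigma_{T}^{2}$ is the variance of the total construct score; values of approximately .80 or higher are commonly interpreted as good internal consistency.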
We selected this outcome measure because of its focus on the student-centered constructs of learning with the tool rather than on an evaluation of knowledge. This focus coincided with the initial steps of the KTA cycle.30 Alternative evaluation tools focus on the development and design of the learning tool and miss the impact that the learning tool has on the learner.47 This perspective is particularly important, as it has a direct relationship to knowledge gained through the use of the tool.47
We modified the language of the LOES-S. Specifically, we changed the original term “learning object” to “learning tool.” This change was made to limit confusion among users because the term “learning object” is not widely used in this population. Additionally, we included 1 open-ended statement asking participants to provide suggestions to improve the tool. Specifically, participants were asked, “Please provide any suggestions you may have to improve this technology-based learning tool.”
Analysis
LOES-S
We report the distribution of responses to the 13 items of the LOES-S to describe participants' evaluation of the learning, design, and engagement constructs of the tool. Participants who reported a score of 4/5 (agree) or 5/5 (strongly agree) were classified as agreeing with the item. Those who reported a score of 2/5 (disagree) or 1/5 (strongly disagree) were classified as disagreeing. A score of 3/5 was classified as a neutral evaluation. We calculated the median evaluation scores and interquartile ranges for each item. Median scores less than 4 indicated a need for improvement. The analysis was conducted using SPSS Statistics Version 24 (IBM Corp, Armonk, NY).
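As a minimal illustration of this scoring and summary procedure (the actual analysis was performed in SPSS; the example data, item names, and values below are hypothetical stand-ins, not study data):

```python
import pandas as pd

# Hypothetical LOES-S responses: one row per participant, one column per item,
# each item scored on a 5-point scale (1 = strongly disagree ... 5 = strongly agree).
items = [f"item_{i}" for i in range(1, 14)]
responses = pd.DataFrame(
    [
        [4, 5, 3, 4, 4, 3, 4, 4, 5, 4, 4, 3, 4],
        [5, 4, 4, 4, 5, 4, 2, 4, 4, 4, 5, 4, 4],
    ],
    columns=items,
)

def classify(score: int) -> str:
    """Collapse a 5-point response into agree / neutral / disagree."""
    if score >= 4:
        return "agree"
    if score <= 2:
        return "disagree"
    return "neutral"

# Distribution of agree/neutral/disagree responses per item
distribution = (
    responses.apply(lambda col: col.map(classify).value_counts())
    .fillna(0)
    .astype(int)
)

# Median and interquartile range per item; a median below 4 flags a need for improvement
summary = pd.DataFrame(
    {
        "median": responses.median(),
        "iqr": responses.quantile(0.75) - responses.quantile(0.25),
    }
)
summary["needs_improvement"] = summary["median"] < 4

print(distribution)
print(summary)
```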
Open-Ended Suggestions
We performed a content analysis of the suggestions provided by participants. Comments were separated into individual statements when more than 1 distinct suggestion was provided. For example, “There could have been more graphics, and the quiz questions were a bit easy” was separated into 2 distinct comments: (1) “There could have been more graphics” and (2) “The quiz questions were a bit easy.” Three investigators independently completed a content analysis worksheet in which they matched each individual comment to the most relevant pedagogical theme. Investigators were provided with a list of pedagogical themes, including their definitions, to select from. Pedagogical themes included coherence, contiguity, learner control, multimedia, personalization, practice, segmenting, and thinking skills. These themes were selected because they were used as a guide during the development of the learning tool. Investigators then reached consensus through discussion. Results were stratified by the recurring pedagogical themes used as references throughout the development phase of the learning tool.
RESULTS
Twenty-eight faculty members participated (28/127, 22%). Twelve participants who started the study did not complete the evaluation questionnaire (completed the baseline questionnaire only); therefore, our sample includes 16 participants who completed the baseline questionnaire and LOES-S evaluation.
Sample Characteristics
Most participants were male (12/16; 75%) and between the ages of 25 and 44 years (9/16, 56.3%) (Table 1). Most (11/16, 68.8%) reported that their highest level of education was a doctor of chiropractic (DC) degree, and 62.5% (10/16) reported no experience with the development of a clinical practice guideline.
Our sample included approximately 13% of the eligible population. A census of CMCC faculty during the 2017–2018 academic year indicates that 57% of employees were male, the average age was 46 years, and a majority (63%) reported their highest level of education to be a DC degree (faculty demographics provided by CMCC, August 2018). Although these census data were limited, the available characteristics appear similar to those of the study participants.
Twelve participants completed the baseline questionnaire only (Table 1). Differences between participants who completed the study (baseline questionnaire and LOES-S evaluation) and those who completed the baseline questionnaire only may have introduced attrition bias. For example, the 2 groups differed in years of experience in a chiropractic teaching role (Table 1); had these participants not been lost to follow-up, the overall evaluation of the learning tool may have differed.
Learning Object Evaluation Scale Constructs
Learning
All items had a median score of 4/5 (Table 2). One participant disagreed with item #4—“The tool helped teach me a new concept”—and 3 participants scored this item as neutral. Three participants scored item #3—“The graphics and animations from the tool helped me learn”—as neutral.
Design
All items within this construct had a median score of 4/5 (Table 2). One participant disagreed with item #7—“The instructions in the tool were easy to follow”—item #8—“The tool was easy to use”—and item #9—“The tool was well organized.” Four participants scored item #6—“The help features in the tool were useful”—as neutral, and 3 participants scored item #7 as neutral.
Engagement
All items within this construct had a median score of 4/5 (Table 2). One participant disagreed with item #10—“I like the overall theme of the tool”—and item #13—“I would like to use the tool again.” Three participants scored item #12—“The tool made learning fun”—as neutral, and 2 participants scored item #13 as neutral.
Feedback from Participants
We received 23 comments. Ten comments were removed from the content analysis for the following reasons: (1) 6 entries included no suggestions (eg, no comment or n/a), (2) 1 comment related to enjoying the tool but provided no suggestions for improvement, (3) 1 comment pertained to difficulty clicking a button within the tool, (4) 1 comment pertained to the baseline questionnaire, and (5) 1 comment was from a participant who participated more than 1 time (only their first comment was included in the analysis to limit information bias). The remaining 13 comments were included in the content analysis.
Three pedagogical themes were identified during this analysis: learner control, multimedia, and thinking skills (Table 3). The most frequently occurring comments (n = 6) related to the pedagogical theme of learner control; participants wanted more control over the pace of the learning tool. Comments regarding multimedia (n = 4) addressed graphics, colors, and the inclusion of more videos. Finally, comments within the thinking skills theme (n = 3) concerned the evaluation components of the learning tool (quizzes and case studies); participants suggested these components should be more exciting and challenging.
Suggested Revisions to Learning Tool
Based on the participants' open-ended suggestions and the LOES-S quantitative evaluations, the following revisions are recommended for the further development of the learning tool for this population. To our knowledge, pedagogically focused recommendations for the further development of technology-based learning tools to improve the health care–focused knowledge of health care professionals have not been provided elsewhere.
The following recommendations were developed to directly reflect the pedagogical themes derived through the content analysis as previously described. Recommendations are supported by the literature describing pedagogical theories and principles common to online or Web-based learning.
Recommendation #1: Include a wider variety of media (eg, graphics and videos) to break up the text and keep learners engaged.
“There could have been more graphics and a more exciting case.”
Multimedia is a pedagogical design principle that emphasizes using a combination of text and graphics to provide the learner with a richer learning experience.49 All media should, however, be relevant to the learning material and serve a specific purpose; graphics or animations included solely for aesthetics are not recommended.49 This recommendation also draws on the modality principle, wherein presenting information as narration rather than as on-screen text may benefit the learner. Narration, however, should not be used to present long and complex information.50
Recommendation #2: Consult content experts to ensure that review material (eg, quizzes and case studies) is appropriate for learners and the learning environment.
“The quiz questions were a bit easy; the cases were good and helpful to apply the knowledge.”
The thinking skills pedagogical principle outlines the cognitive processes learners use to accomplish tasks in a learning environment: (1) generating new ideas and perspectives; (2) applying, analyzing, synthesizing, and evaluating information; and (3) awareness and analysis of one's own thoughts.51 Learning tool components, such as review quizzes and case studies, allow learners to follow these cognitive processes in order to apply new knowledge more effectively. Consulting content experts ensures that the review components are appropriate for the learners and that the difficulty of the content is suitable, allowing a more effective stepwise cognitive process for learning the new concept or skill.
Recommendation #3: Include a main menu to allow learners to control the sections of the learning tool they wish to review or skip to.
“Would have liked to see a home page or table of contents to go back to review specific sections.”
Learner control is a pedagogical principle that describes the degree of control a learner has over the learning experience.52 Types of learner control include (1) content sequencing: having control over the order of the course material; (2) pacing: having control of the time spent on each section of the lesson; and (3) access to learning support: having the ability to access additional resources to add to the learning experience.52 It is recommended that asynchronous online learning incorporate some degree of learner control. However, the degree of learner control is directly related to the degree of difficulty and complexity of the learning content, learners' previous knowledge of the subject matter, and learner metacognition.52 Therefore, if the degree of difficulty and complexity is high, learners' previous knowledge of the subject matter is low, and learner metacognition is low, there should be a lesser degree of learner control.
DISCUSSION
Our evaluation suggests that participants agreed that our tool had adequate learning, design, and engagement constructs. The open-ended feedback identified 3 components of the tool that could benefit from further development. These open-ended evaluations are supported by 3 pedagogical themes: learner control, multimedia, and thinking skills.
To our knowledge, we developed and evaluated the first online learning tool for chiropractic faculty. However, learning tool evaluations have been conducted in other health care professions. We identified 3 studies that evaluated online educational interventions in health care providers.53–55 All 3 measured user satisfaction using a Likert scale and open-ended questions.53–55 Two studies provided recommendations for subsequent use of the learning tools based on the satisfaction results.53,54 However, none of these studies referred to pedagogical theories or principles regarding their users' satisfaction. Nevertheless, the technology-based learning tools were evaluated favorably in the physician54,55 and nurse practitioner53 populations. Our study is novel because we evaluated user-centered constructs of the learning tool rather than satisfaction with the tool. This evaluation provided us with a descriptive evaluation of intrinsic and extrinsic factors associated with learning. Pedagogical theory and principles were incorporated into the design of the tool as well as the development of recommendations for its improvement. Finally, we did not incorporate the final steps of the framework to assess the impact of the tool on the end user, the student. Future research should continue through the final steps of the framework to determine its effectiveness following its implementation.
Strengths and Limitations
A strength of our study is the integrated KT approach used to develop the tool.27 The design of the learning tool was informed by 3 sources: (1) a Knowledge User Advisory Committee, (2) a systematic review of the literature, and (3) pedagogical theories and learning principles. This methodology ensured that the intervention was designed based on informed sources and thorough evaluation. This methodology is also designed to increase the uptake and impact of research findings by knowledge users beyond the scientific scope of the study.27 The evaluation informed the creation of recommendations for the further development of the learning tool for this population. Additionally, the inclusion of the open-ended question following the LOES-S complemented the quantitative evaluation and provided a more complete understanding of the evaluation. Finally, we implemented 3 recruitment strategies to maximize participation: multiple direct e-mails to eligible participants from department leads, an internal LMS advertisement, and face-to-face active recruitment. We compared participation rates on a weekly basis to determine which recruitment strategy was most effective. Our data suggest that in-person active recruitment was most effective for this population. This should be considered for future research of this nature.
This study had limitations. The LOES-S is a valid and reliable tool for use in middle and high school environments; however, we do not know its psychometric properties for use in this health care provider/educator population.47,48 Although this may be a limitation, few evaluation instruments focus on the impact of technology-based learning tools on learners rather than solely on an outcome of the educational intervention.47 Another limitation is the potential for selection bias. It is unclear whether those who participated in the study were representative of the entire eligible CMCC faculty population, but basic demographic characteristics of the eligible population suggest that they may be similar. It is possible that those who participated were more willing and/or interested in adopting online learning. Barriers to participation in educational interventions by health care professionals have been reviewed in the literature.8,56,57 The most frequently occurring barrier to participation is a lack of time. Other common barriers include a lack of incentives to participate, financial constraints, personal constraints such as health status and motivation, lack of awareness of educational activities, and job status (part-time versus full-time).8,56,57
CONCLUSIONS
We developed an interactive online learning tool aimed at enhancing the uptake of a clinical practice guideline on the management of neck pain for chiropractic faculty at CMCC. Our evaluation suggests that the learning, design, and engagement constructs of the learning tool are adequate; however, the tool requires refinement. Further development of the learning tool is recommended to increase educational engagement for this population. Future research is recommended to investigate chiropractic faculty's barriers to educational participation as well as to investigate the efficacy of the learning tool for increasing knowledge of clinical practice guidelines within this population.
FUNDING AND CONFLICTS OF INTEREST
This work was funded internally. The authors have no conflicts of interest to declare relevant to this work.
REFERENCES
Author notes
Leslie Verville is research manager, Knowledge Translation at the Centre for Disability Prevention and Rehabilitation at Ontario Tech University (2000 Simcoe Street North, Oshawa, ON, L1H 7K4, Canada; leslie.verville@uoit.ca). Pierre Côté is a professor in the Faculty of Health Sciences at Ontario Tech University, director of the Centre for Disability Prevention and Rehabilitation at Ontario Tech University, and Canada Research Chair in Disability Prevention and Rehabilitation at Ontario Tech University [ORCiD https://orcid.org/0000-0002-6986-6676] (2000 Simcoe Street North, Oshawa, ON, L1H 7K4, Canada; pierre.cote@uoit.ca). Diane Grondin is an associate professor of research and innovation at Canadian Memorial Chiropractic College (6100 Leslie Street North, Toronto, ON, M2H 3J1, Canada; dgrondin@cmcc.ca). Silvano Mior is the director of research partnerships and health policy at the Canadian Memorial Chiropractic College and scientist in the Centre for Disability Prevention and Rehabilitation at Ontario Tech University (6100 Leslie Street North, Toronto, ON, M2H 3J1, Canada; smior@cmcc.ca). Robin Kay is a professor in the Faculty of Education at Ontario Tech University (2000 Simcoe Street North, Oshawa, ON, L1H 7K4, Canada; robin.kay@uoit.ca).
Concept development: LV, PC. Design: LV, PC, DG, SM, RK. Supervision: LV, PC, RK. Data collection/processing: LV. Analysis/interpretation: LV, PC, DG, SM. Literature search: LV, PC, DG, SM, RK. Writing: LV. Critical review: LV, PC, DG, SM, RK.