Objective

The doctor of chiropractic program (DCP) graduate must demonstrate competency in clinical research literacy (CRL), per accreditation standards. This study aimed to compare student CRL knowledge, confidence, and attitudes between the beginning and end of their DCP.

Methods

We collected data on 245 matriculating students’ CRL knowledge, confidence, and attitudes between 2017 and 2018. In 2021 and 2022, 78 of these students enrolled in a course with an extra credit assignment that was used to re-collect CRL data as they approached graduation. We assessed changes between entry and exit using statistical analyses in Stata 17.

Results

Paired data were collected for 56 students. The mean CRL scores on a 10-point scale at the beginning and end of the DCP were 5.25 (SD 2.06) and 6.54 (SD 1.89), respectively (p < .001). We observed statistically significant (p ≤ .05) positive changes in students’ abilities to answer questions about Medical Subject Headings, the hierarchy of evidence, systematic reviews, meta-analyses, and the limitations of abstracts. There was also a statistically significant increase in confidence, with over 80% of students nearing graduation reporting good or excellent abilities to find and judge health information for their patients. The proportion of students who envisioned searching a database to help manage a challenging clinical case decreased from 96% to 89% (p > .05). The proportion seeing themselves submitting a case report for publication declined from 16% to 4% (p ≤ .05).

Conclusion

Students’ self-perceived CRL abilities and knowledge improved between the beginning and end of their DCP; however, their attitudes toward applying these in practice declined.

According to the 2018 Council on Chiropractic Education (CCE) Accreditation Standards, doctor of chiropractic program (DCP) graduates should be able to “Locate, critically appraise and use relevant scientific literature and other evidence.” This CCE standard aligns with the information literacy (IL) standards developed by the Association of College and Research Libraries (ACRL) for higher education.1 According to ACRL, “An information literate person is able to 1) determine the nature and extent of information needed; 2) access the needed information effectively and efficiently; 3) evaluate information and its sources critically; 4) use information effectively to accomplish a specific purpose; and 5) use information ethically and legally.”

In 2016, our institution developed and began implementing a series of successive courses in clinical research literacy (CRL) to help students meet the above CCE and IL standards. The course series content was drawn from previous course instruction,2 from information presented at the Process of Integrating Evidence (PIE) conferences in 2015 and 2017,3 and from various online resources. Our DCP is on the quarter system (11 weeks per quarter), and students typically took these weekly 1-hour face-to-face courses in the first, fourth, and eighth academic quarters. Instructors used small break-out groups for students to discuss and apply the material. Active learning techniques, in addition to the spiral design, allowed for the scaffolding of skills, which may lead to greater student learning and retention.4,5 The course series was taught by trained research scientists and clinicians, with guest lectures by research librarians. Class sizes varied between 13 and 80 students enrolled per quarterly course.

In CRL1 (introductory course), a research librarian taught students how to identify a peer-reviewed study, use Medical Subject Headings (MeSH) to locate relevant studies in PubMed, and employ proper citation formatting. The course instructors introduced IL Standard 3—the ability to evaluate information and its sources critically. Instructors described the various study designs and hierarchies of evidence and taught students how to assess whether a study’s description and design accomplished its stated aim. Standard 2 skills were further developed in CRL2 (intermediate) by teaching students how to formulate a patient/population, intervention, comparison, and outcomes (PICO) question based on example clinical cases, a common component of evidence-based practice (EBP) curricula.6 In CRL3 (proficient), we continued scaffolding students’ skills for IL Standards 1–5, focusing on strengthening skills in critical appraisal of randomized controlled clinical trials and secondary sources of information (IL Standard 3). As students progressed and eventually completed the CRL series, IL Standards were reinforced in class assignments throughout the curriculum, such as in public health, spinal disorders, and geriatrics courses. Students demonstrated their knowledge and skills in 1 of their final courses—a clinical case scholarship (CCS) course. In this course, their capstone project was to write a case study on 1 of their health center patients.

The college’s institutional review board determined this study was not human subjects research (NHSR#2021-B.irb) and, therefore, not subject to institutional review board oversight. The 2016 Guideline for Reporting Evidence-based practice Educational interventions and Teaching (GREET) checklist informed the writing of this manuscript.7 

Survey at Entry (RRS1)

In 2017–2018, we collected data about matriculating students’ information literacy knowledge, attitudes, and confidence as an initial assignment in our first-quarter CRL course using our Research Readiness Survey (RRS). The RRS was developed by our DCP librarians and modeled after the Central Michigan University (CMU) Research Readiness Self-Assessment (RRSA).8

Survey at Exit (RRS2)

In 2021, 2 CRL instructors (KW, DJ) reviewed the RRS and selected the 16 of its 50 questions most closely aligned with the CRL courses for use as an exit survey with students nearing graduation. Between 2021 and 2022, this exit survey (henceforth RRS2) was given as an extra credit assignment in the CCS course. Occasionally a student had to repeat the CCS course; therefore, only the data from each student’s first completed RRS2 were analyzed for this study. In addition to the 14 questions in Table 1, RRS2 included 1 question for student ID, to allow matching with RRS1, and 1 question for the student’s highest degree completed before entering chiropractic college.
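As an illustration only, the pairing logic described above (keep each student’s first completed RRS2, then match entry and exit records on student ID) can be sketched in Python with made-up survey exports; the field names and values here are hypothetical, not the study data:

```python
import pandas as pd

# Hypothetical survey exports (the real data came from Qualtrics).
rrs1 = pd.DataFrame({"student_id": [101, 102, 103], "entry_score": [4, 6, 5]})
rrs2 = pd.DataFrame({
    "student_id": [102, 103, 103],  # student 103 repeated the CCS course
    "exit_score": [7, 5, 8],
    "attempt": [1, 1, 2],
})

# Keep only each student's first completed RRS2 ...
rrs2_first = (rrs2.sort_values("attempt")
                  .drop_duplicates("student_id", keep="first"))

# ... then pair entry and exit records by student ID; students without
# both surveys (here, 101) drop out of the paired analysis.
paired = rrs1.merge(rrs2_first, on="student_id", how="inner")
print(paired)
```

Students lacking either survey are excluded by the inner merge, mirroring how only 56 of the 78 eligible students contributed paired data.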

Table 1

Performance on Research Readiness Survey (RRS) Questions From Students As They Matriculated (Entry) and Neared Graduation (Exit) in a Doctor of Chiropractic Program


Both RRS1 and RRS2 were distributed as an online Qualtrics XM survey (Qualtrics, Provo, UT, USA) via Canvas (Canvas LMS, Instructure, Salt Lake City, UT), the college’s online learning management software.

Statistical Analyses

We created a composite score of the total number of correct responses to the knowledge questions included in both RRS1 and RRS2, with 10 as the highest possible total score. A paired t test was used to compare the RRS1 and RRS2 total scores on these 10 questions. McNemar’s χ2 test was used to compare the proportions of correct/desirable answers for individual questions. All statistical analyses were conducted using Stata 17 SE (StataCorp, College Station, TX).
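Our analyses were run in Stata 17, but the same logic can be sketched in Python as an illustration; the response matrices below are randomly generated stand-ins, not the study data:

```python
import numpy as np
from scipy import stats

# Hypothetical paired responses: 1 = correct, 0 = incorrect;
# rows = 56 students, columns = the 10 shared knowledge questions.
rng = np.random.default_rng(0)
entry = rng.integers(0, 2, size=(56, 10))
exit_ = rng.integers(0, 2, size=(56, 10))

# Composite score: number of correct answers (0-10) per student.
entry_total = entry.sum(axis=1)
exit_total = exit_.sum(axis=1)

# Paired t test comparing entry and exit composite scores.
t, p_t = stats.ttest_rel(exit_total, entry_total)

# McNemar's chi-square for a single question: based only on the
# discordant pairs (students who changed answers between surveys).
q = 0
b = np.sum((entry[:, q] == 1) & (exit_[:, q] == 0))  # correct -> incorrect
c = np.sum((entry[:, q] == 0) & (exit_[:, q] == 1))  # incorrect -> correct
chi2 = (b - c) ** 2 / (b + c)
p_mcnemar = stats.chi2.sf(chi2, df=1)

print(p_t, p_mcnemar)
```

The paired t test addresses the continuous composite score, while McNemar’s test is the appropriate paired analogue of a χ2 test for the binary correct/incorrect outcome on each question.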

Among the 245 students who took the entry survey RRS1 in 2017–2018,8  many had graduated or left the college (temporarily or permanently) before data collection began for the exit survey RRS2 in 2021–2022. Of the 78 students who took RRS1 and who were offered the RRS2 extra credit assignment in their CCS course during 2021–2022, 56 completed the RRS2 survey (71.79%). Four (7.14%) of these students reported an associate degree as their highest level of education before chiropractic school, 48 (85.71%) reported a bachelor’s degree, and 4 (7.14%) did not state their prior degree.

The mean summary score for the 10 knowledge questions among the 56 students with paired data increased between RRS1 data collection in 2017–2018 and RRS2 data collection in 2021–2022. The mean summary score was 5.25 (SD, 2.06; 95% CI, 4.70–5.80) at matriculation and 6.54 (SD, 1.89; 95% CI, 6.03–7.04) toward graduation, p < .001 for the 1-sided paired t test. The proportion of students selecting the correct answers for the knowledge questions increased for 9 of the 10 questions and was statistically significant (p ≤ .05) for 5 of them: students’ knowledge about the strength of evidence from peer-reviewed journals compared with other sources, systematic reviews, meta-analyses, abstract limitations, and MeSH (Table 1). For student confidence, we observed a statistically significant increase in students’ self-perceived abilities to find information on patient care and to judge the quality of this health information. Regarding IL attitudes, there was no statistically significant change in the proportion of students foreseeing using a database to help manage complex cases in practice. The proportion of students envisioning submitting a case report to a journal declined significantly (Table 1).

Students at our DCP demonstrated improved overall information literacy knowledge between the beginning and end of our curriculum. However, our study highlights that many students still struggle with information literacy concepts as they near graduation. The students’ average percent score on the RRS2 knowledge questions was less than 70%. While we cannot directly compare results from our nonvalidated 10-point RRS2 with the 100-point evidence-based practice (EBP) knowledge test used by the University of Western States College of Chiropractic,9 it is interesting to note that Western States also reported students scoring less than 70% 1 year after completing their core EBP courses.10 The students in the current study may have struggled with IL as they approached graduation due to several factors, including the 1-year gap between the last CRL course and the RRS2 survey, CRL staff and curricular changes, shifting student priorities, and sampling bias.

Like other DCPs, we developed an EBP curriculum with multiple courses throughout the DCP.11,12 Our 3-part CRL series was developed in 2016/2017 by research staff and librarians. Since then, several instructors have taught CRL1 and CRL2, each taking a different approach to these classes; not all followed the original instructional design, and in some quarters the librarian was not included as a guest lecturer. As noted by Whillier et al, “If staff members are left out, or do not participate, or choose not to follow the design, the implementation of the design will fall short of intended outcomes, and at worst it can fall apart—to the detriment of student learning.”12 The Office of Academic Affairs also implemented substantial curricular changes during the timeframe of our study, partly driven by the COVID-19 pandemic shifting the CRL courses online.13 These changes disrupted the sustained delivery of the planned CRL course sequence and content and possibly impacted student learning outcomes and retention over the longer term, from entry to exit. An additional explanation for the observations in this study is that students nearing graduation may place less priority on research literacy as the transition from student to intern involves multiple additional responsibilities and competing priorities. Other investigators have documented a decline in chiropractic student attitudes toward research literacy as students progress toward graduation.10

We cannot determine the extent to which our results represent students at other DCPs. Our study has low generalizability, as it used a convenience sample of students at a single DCP. Furthermore, we could only pair entry and exit data for 56 students. While 72% of students with baseline data who were offered the extra credit assignment completed RRS2, we do not know whether these respondents differed in their literacy knowledge from nonresponding peers who chose not to complete the RRS2 extra credit. It is also unknown whether the respondents differed from students who were not recruited for RRS2 because they had already graduated or had withdrawn from the DCP before RRS2 was offered as an extra credit in the CCS course.

Throughout our DCP curriculum, which included 3 courses in clinical research literacy, students improved their perceived abilities to acquire and appraise information on patient care. While their average knowledge improved in some areas of research literacy, many students still struggled with research literacy concepts near graduation. There was also a decline in the students’ attitudes about applying these skills in their future clinical practice. Continued efforts are needed to educate DC students on research literacy and improve their motivation for applying these skills in practice.

References

1. American Library Association. ACRL Standards: Information Literacy Competency Standards for Higher Education. 2000;61(3):9.
2. Smith M, Henderson C, Marchiori D, Hawk C, Meeker W, Killinger L. Report on the development, implementation, and evaluation of an evidence-based skills course. J Chiropr Educ. 2004;18(2):116–126.
3. World Federation of Chiropractic Colleges; Association of Chiropractic Colleges. 11th World Federation of Chiropractic Colleges/Association of Chiropractic Colleges Global Education Conference: Leveling Up: Creating Consistency in Chiropractic Education, November 2–5, 2022. J Chiropr Educ. 2022;36(2):179–193.
4. Maggio LA, Tannery NH, Chen HC, ten Cate O, O’Brien B. Evidence-based medicine training in undergraduate medical education: a review and critique of the literature published 2006–2011. Acad Med. 2013;88(7):1022–1028.
5. Martin BA, Kraus CK, Kim SY. Longitudinal teaching of evidence-based decision making. Am J Pharm Educ. 2012;76(10):197.
6. Maggio LA, Kung JY. How are medical students trained to locate biomedical information to practice evidence-based medicine? A review of the 2007–2012 literature. J Med Libr Assoc. 2014;102(3):184–191.
7. Phillips AC, Lewis LK, McEvoy MP, et al. Development and validation of the guideline for reporting evidence-based practice educational interventions and teaching (GREET). BMC Med Educ. 2016;16(1):237.
8. Ward KL, Delli Gatti BL, Osenga A, Odierna DH, Smith M. Information literacy of matriculating chiropractic students assessed via research readiness survey. J Chiropr Educ. 2023;37(1):20–25.
9. Leo MC, Peterson D, Haas M, LeFebvre R, Bhalerao S. Development and psychometric evaluation of an evidence-based practice questionnaire for a chiropractic curriculum. J Manipulative Physiol Ther. 2012;35(9):692–700.
10. Haas M, Leo M, Peterson D, Lefebvre R, Vavrek D. Evaluation of the effects of an evidence-based practice curriculum on knowledge, attitudes, and self-assessed skills and behaviors in chiropractic students. J Manipulative Physiol Ther. 2012;35(9):701–709.
11. Lefebvre RP, Peterson DH, Haas M, et al. Training the evidence-based practitioner: University of Western States document on standards and competencies. J Chiropr Educ. 2011;25(1):30–37.
12. Whillier S, Spence N, Giuriato R. A collaborative process for a program redesign for education in evidence-based health care. J Chiropr Educ. 2019;33(1):40–48.
13. de Luca K, McDonald M, Montgomery L, et al. COVID-19: how has a global pandemic changed manual therapy technique education in chiropractic programs around the world? Chiropr Man Therap. 2021;29(1):7.

FUNDING AND CONFLICTS OF INTEREST The authors have none to declare.

Author notes

Krista Ward (corresponding author) is a research specialist in the Research Department at Life Chiropractic College West (25001 Industrial Blvd, Hayward, CA 94545; [email protected]). Dale Johnson is a professor and the research compliance officer in the Basic Science Department and Research Department at Life Chiropractic College West (25001 Industrial Blvd, Hayward, CA 94545; [email protected]). Barbara Delli Gatti is the retired director of the Learning Commons at Life Chiropractic College West (25001 Industrial Blvd, Hayward, CA 94545; [email protected]). Monica Smith is the research director in the Research Department at Life Chiropractic College West (25001 Industrial Blvd, Hayward, CA 94545; [email protected]).

Concept development: KW, BDG, MS. Design: KW, BDG, MS. Supervision: KW, DJ, BDG, MS. Data collection/processing: BDG, DJ, MS. Analysis/interpretation: KW, MS. Literature search: KW, BDG, MS. Writing: KW, BDG, MS. Critical review: KW, BDG, DJ, MS. Other: DJ.