ABSTRACT
We developed a Research Readiness Survey (RRS) to identify students' information literacy needs before they received instruction from a team of faculty members and librarians in the clinical research literacy courses of our doctor of chiropractic program. In addition to describing students' responses to our RRS, we explored associations between (1) students' overall performance on the RRS and their prior earned degrees and (2) their self-reported ability and their performance on questions pertaining to evaluating information quality (standard 3 of the Association of College and Research Libraries [ACRL] Information Literacy Competency Standards for Higher Education).
The RRS comprises 50 questions, 22 of which assess information literacy knowledge according to the ACRL standards. We calculated means and standard deviations for summary scores on 4 ACRL standards and for a total RRS score. We used analysis of variance to assess whether standard 3 scores differed by students' self-reported ability to judge health information quality, and the Kruskal–Wallis test to assess whether total RRS scores were associated with students' previously earned degrees.
In 2017–2018, 245 students (70% of matriculates) completed the RRS. Students performed best on standard 3, evaluating information quality (average score = 67%), and worst on standard 2, accessing information (average score = 59%). Students who reported an average ability to judge information quality had higher standard 3 scores than students who reported poor ability (p = .003). Students with bachelor's degrees had higher total RRS scores than students with associate's degrees (p = .004).
Matriculating students had the most difficulty with accessing information, supporting the need to include librarians on the teaching team.
INTRODUCTION
The 2018 Council on Chiropractic Education (CCE) accreditation standards include an information and technology meta-competency that requires doctor of chiropractic programs (DCPs) to prepare students to locate, critically appraise, and use relevant scientific literature.1 These CCE standards align with the 5 information literacy (IL) competency standards for higher education developed by the Association of College and Research Libraries (ACRL)2 (Table 1) and are important for preparing students for evidence-based practice.
Table 1. Standards for Competency in Information Literacy Assessed With the Research Readiness Survey (RRS)

We designed a sequence of 3 clinical research literacy (CRL) courses that were required for all students matriculating into our chiropractic college from 2016 to 2021. The CRL course series was designed to enable future chiropractic clinicians to critically read and evaluate the existing scientific evidence and to strengthen their capacity to “use relevant scientific literature and other evidence to inform patient care.”1 The CRL courses focused on the chiropractor as a research consumer and were designed for chiropractic students with little or no experience in clinical or epidemiologic research. The series taught the fundamental research concepts and techniques necessary for critical reading of the professional and research literature, empowering chiropractic students to take a scientific, evidence-informed approach to chiropractic practice.
The 1st course in our CRL series (CRL-I) laid the foundation for critical reading of the literature on clinical intervention research and was required for all incoming students at our institution beginning in 2016. Instructors for the team-taught CRL series included our research staff (trained research scientists), faculty, and librarians. As educators, the librarians developed and delivered classroom lectures and created graded assignments with specific learning objectives that counted toward students' final grades in these research literacy courses.
In concert with our development and implementation of the CRL course series, our librarians identified a prevalidated IL survey from Central Michigan University (CMU), developed by Ivanitskaya and Casey to “measure basic research skills based on the Information Literacy Competency Standards for Higher Education.”3 In 2016, we pilot tested the health information version of this CMU Research Readiness Self-Assessment (RRSA) with matriculating students in the CRL-I course to assess their IL. Students gave us feedback that the CMU-RRSA was too long (56 items), too focused on allopathic medicine, and too personal in its demographic questions. In addition, some students experienced technical difficulties when connecting to the external CMU server that housed the RRSA.
Using information from our pilot test, we created our own Research Readiness Survey (RRS) covering 4 of the 5 IL standards (Table 1).2 The objective of this article is to report our incoming students' self-reported abilities, attitudes, and knowledge regarding IL, as assessed with our RRS. In addition, we explore 2 hypotheses: (1) whether students' self-reported ability to judge the quality of health information is associated with their performance on standard 3 and (2) whether students' overall performance on the RRS is associated with their prior earned degrees.
METHODS
CRL-I was a required 1st-quarter course in our DCP curriculum. We administered the RRS to 8 successive 1st-quarter cohorts during 2 calendar years, 2017 (4 cohorts) and 2018 (4 cohorts). On the 1st day of CRL-I, students were invited to complete the online RRS; upon completion, their names were entered into a raffle for an Amazon gift card. The assignment was optional and ungraded. Students accessed the RRS via a link in the course schedule maintained in our college's learning management system (Canvas LMS, Instructure, Salt Lake City, UT, USA). The Canvas link redirected students to the RRS itself, which we administered on the Qualtrics XM survey platform (Qualtrics, Provo, UT, USA). All survey data were collected and managed in-house using our college-licensed Qualtrics and Canvas software. We used Stata 15.1 IC (StataCorp, College Station, TX, USA) for all statistical analyses in this report. The Life Chiropractic College West institutional review board determined that this project did not meet the definition of human subjects research as set out in 45 CFR 46.102.
According to our college's registrar, 353 students matriculated during the 2017 and 2018 calendar years. After we cleaned the Qualtrics survey data by reconciling apparent duplicates, the remaining sample of n = 283 students represented 80% of the 353 students entering during these 2 years. We then deleted 38 cases with missing data on any analysis variable, leaving an analytic sample of n = 245.
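We performed these cleaning steps on our Qualtrics export and analyzed the data in Stata. Purely for illustration, a minimal pandas sketch of the same 2 steps (reconciling duplicates, then listwise deletion) might look like the following; all column names and records here are hypothetical.

```python
# Illustrative sketch only; the study used Qualtrics exports and Stata.
# All column names and records below are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "student_id":   [101, 101, 102, 103, 104],
    "submitted_at": ["2017-01-09", "2017-01-10", "2017-01-09",
                     "2017-01-09", "2017-01-09"],
    "prior_degree": ["bachelor", "bachelor", "associate", None, "bachelor"],
    "total_score":  [17, 18, 15, 16, None],
})

# Step 1: reconcile apparent duplicates by keeping each student's
# most recent submission.
deduped = (raw.sort_values("submitted_at")
              .drop_duplicates(subset="student_id", keep="last"))

# Step 2: listwise deletion, dropping cases missing any analysis variable.
analytic = deduped.dropna(subset=["prior_degree", "total_score"])

print(f"{len(deduped)} after deduplication; {len(analytic)} in analytic sample")
```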
Survey Instrument Details
Over the data collection time frame for this study, we administered 2 versions of our RRS. The 1st version was administered in 2017 to 145 students in our sample. It included a total of 50 questions in the following categories: past education (2 questions), self-assessment (6 questions), attitudes toward incorporating IL into future practice (16 questions), and survey feedback (4 questions). The remaining 22 questions assessed students' IL skills across 4 of the 5 IL standards (Table 1).
The 2nd version of our RRS was administered in 2018 to 100 students. This version was inadvertently shortened to 49 questions when we omitted 1 of the standard 1 items; all other items were retained. Students received 1 point for each correct answer to the questions explicitly tied to standards 1, 2, 3, and 5. Some questions permitted multiple answers, allowing students to earn up to 6 extra points: 2 questions had 2 correct answers, and 2 questions had 3 correct answers. The highest possible score was therefore 28 points on the original (2017) version and 27 points on the 2018 version, which lacked 1 standard 1 question. We created summary scores for standards 1, 2, 3, and 5 as well as a total score variable (Table 1). We did not assess IL standard 4 with the RRS because this standard was addressed in multiple later courses within our curriculum, in which students applied their IL skills to subject-specific assignments.
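To make these scoring rules concrete, the sketch below tallies per-standard and total scores for 1 student. The item identifiers, answer keys, and item-to-standard mapping are invented for illustration; only the rules themselves (1 point per correct selection, multi-answer items worth 2 or 3 points, summaries by standard) come from the description above.

```python
# Hypothetical answer key illustrating the scoring rules described above;
# real item IDs and the item-to-standard mapping are not shown in this article.
ANSWER_KEY = {
    "q1": {"b"},            # single-answer item: 1 possible point
    "q2": {"a", "c"},       # multi-answer item: 2 possible points
    "q3": {"d"},            # single-answer item: 1 possible point
    "q4": {"a", "b", "e"},  # multi-answer item: 3 possible points
}
ITEM_TO_STANDARD = {"q1": 1, "q2": 2, "q3": 3, "q4": 5}

def score_student(responses):
    """Return per-standard summary scores and the total RRS score."""
    by_standard = {1: 0, 2: 0, 3: 0, 5: 0}
    for item, key in ANSWER_KEY.items():
        # 1 point per correct selection, so multi-answer items can earn extra points
        by_standard[ITEM_TO_STANDARD[item]] += len(key & responses.get(item, set()))
    return by_standard, sum(by_standard.values())

print(score_student({"q1": {"b"}, "q2": {"a"}, "q3": {"c"}, "q4": {"a", "b", "e"}}))
# -> ({1: 1, 2: 1, 3: 0, 5: 3}, 5)
```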
Statistical Analysis
We calculated means and standard deviations for the standard 1, 2, 3, and 5 summary scores and for the total RRS score. To explore our 2 hypotheses, we (1) used 1-way analysis of variance (ANOVA) to assess whether standard 3 summary scores differed by students' self-reported ability to judge the quality of health information, after confirming that all ANOVA assumptions were met,4 and (2) used the Kruskal–Wallis test to assess the association between total RRS scores and students' previously earned degrees, because Levene's test showed that the variance in total scores by prior degree violated the homogeneity-of-variances assumption of 1-way ANOVA.4 In addition to descriptive and inferential statistics on our continuous standard and total score variables, we analyzed 18 individual multiple-choice and true-false questions (Table 2) by calculating the frequency and percentage of students with correct/desirable answers. We chose these questions because they addressed topics that we prioritized in the CRL courses.
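We ran all analyses in Stata 15.1. For readers who prefer Python, a rough SciPy equivalent of the same analysis plan might look like the sketch below; the data frame is a small hypothetical example, not our study data.

```python
# Rough sketch of the analysis plan in Python/SciPy; the study used Stata 15.1.
# The data frame below holds hypothetical example data, not study data.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "ability":      ["poor", "average", "good/excellent"] * 10,
    "prior_degree": ["associate", "bachelor", "master"] * 10,
    "std3_score":   [4, 6, 7, 5, 7, 6, 3, 6, 7, 5] * 3,
    "total_score":  [14, 18, 19, 15, 17, 20, 13, 18, 21, 16] * 3,
})

# Hypothesis 1: 1-way ANOVA of standard 3 scores across the 3 self-rated
# ability groups (ANOVA assumptions should be checked first, as in the study).
groups = [g["std3_score"].to_numpy() for _, g in df.groupby("ability")]
f_stat, p_anova = stats.f_oneway(*groups)

# Hypothesis 2: check homogeneity of variances with Levene's test; if it
# fails, fall back to the Kruskal-Wallis rank test, as the study did.
by_degree = [g["total_score"].to_numpy() for _, g in df.groupby("prior_degree")]
levene_stat, p_levene = stats.levene(*by_degree)
h_stat, p_kw = stats.kruskal(*by_degree)

print(f"ANOVA p = {p_anova:.3f}; Levene p = {p_levene:.3f}; K-W p = {p_kw:.3f}")
```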
RESULTS
We had 245 students in our final sample for analysis. Before enrolling in our DCP, students reported having an associate's degree (11%, n = 26), a bachelor's degree (80%, n = 195), or a master's degree (2%, n = 6); 7% (n = 18) did not specify. Summing correct answers to the IL standard knowledge questions to create a total RRS score, the average score was 16.91 (SD 3.88) out of a maximum possible 28 (16.91/28 ≈ 60%) for 2017 respondents (n = 145) and 17.03 (SD 3.70) out of a maximum possible 27 (17.03/27 ≈ 63%) for 2018 respondents (n = 100; Table 1). Table 1 also reports the subcomponent scores for each IL standard that we assessed. The IL standard 1 subcomponent score is reported separately for each year (our 2017 RRS had 3 standard 1 items, but the 2018 version had only 2); all other subcomponent scores are reported for both years combined. Dividing the average scores by the total possible points for each IL standard, students performed best on standard 3 (67%) and worst on standard 2 (59%).
For the knowledge questions that we assessed individually because of their high relevance to the CRL courses, the best performance was on the standard 2 question, “You need an article that is not on the Internet or in the campus journal collection. Your only option is to find another article.” Ninety-five percent of students correctly answered “false.” The worst performance was on the standard 3 question, “You interviewed 50 members regarding their opinions of chiropractic care. You will be submitting the interview findings to a peer-reviewed journal. This type of research is referred to as . . .”, for which only 27% of students selected “qualitative research” (Table 2).
We compared standard 3 scores among students whose self-reported ability to judge health information quality was “good/excellent” (n = 145), “average” (n = 85), or “poor” (n = 15). ANOVA showed a statistically significant difference in standard 3 scores by self-reported ability (p = .003; Fig. 1). Students reporting an average ability to judge health information quality had significantly higher standard 3 scores than students reporting poor ability (p = .004).
Figure 1. Standard 3 scores by students' self-reported ability to judge health information quality.
We also found a statistically significant difference in students' total RRS scores by the highest degree they reported before entering our DCP (Kruskal–Wallis equality-of-populations rank test: χ2 = 10.07, p = .018). The largest difference observed in Figure 2 is between students with an associate's degree and students with a bachelor's degree.
Figure 2. Total Research Readiness Survey (RRS) score by prior degree. DNS = did not specify.
DISCUSSION
On average, matriculating students in our DCP in 2017 and 2018 correctly answered approximately 60% of the knowledge questions for IL standards 1–3 and 5. Students who entered our DCP with a bachelor's degree performed better than those with an associate's degree, an observation consistent with previous research finding greater research literacy skills with higher educational attainment.5 Although we did not see a statistically significant difference between students entering with a master's degree and those with a bachelor's or associate's degree, only 6 students held a master's degree, and their score ranges overlapped with those of the other groups. Our analysis was also limited by the 7% of students who did not report their prior degrees.
Previous research has found a disconnect between students' self-perceived research literacy abilities and their actual knowledge.3 We found that students who rated their ability as average performed better than those who rated it as poor; however, students reporting good/excellent ability did not outperform their peers.
Despite our study's limitations of using a nonvalidated survey and assessing students at only 1 DCP, our RRS helped us identify and address the fundamental IL needs of our students as early as possible in our chiropractic curriculum. Our librarians tailored their guest lectures for each 1st-term class based on the RRS results, specifically addressing the knowledge gaps for standard 2 (the ability to access needed information effectively and efficiently). For example, when the RRS results indicated that students did not know how to access peer-reviewed journals from off campus, the librarian addressed this gap in the lecture by defining the term peer-reviewed, describing the databases' indexing practices (eg, the Index to Chiropractic Literature indexes both peer-reviewed and select trade publications), and demonstrating full-text access with remote authentication. The demonstration was recorded and posted on Canvas, our learning management system, as a resource for future reference.
Involving librarians in evidence-based medicine training is not unique to our institution.6–13 After clinicians, librarians are the 2nd most common type of faculty teaching evidence-based medicine at US and Canadian medical schools.12 As at our college, librarians are frequently involved in curriculum planning, instruction delivery, student assessment, and scholarly activity.14 Both faculty and librarians benefit from sharing the responsibility of teaching IL throughout the curriculum. Once a faculty-librarian collaboration is established by integrating the librarian into the course work, students come to see the librarian as a valued and accessible IL expert. The librarian, as an expert educator, can save both faculty and students time and can help students overcome the initial barriers of identifying needed information and accessing the literature on their way to becoming information literate. When students direct IL questions to librarians, faculty can concentrate on their subject matter expertise. This partnership creates a seamless and supportive learning experience for students and interns by “demonstrating an interprofessional collaboration, providing a positive role model for learners.”15
CONCLUSION
Our study highlights the IL educational needs of matriculating students at 1 DCP. Incoming students had the most difficulty answering questions related to IL standard 2 (accessing the needed information effectively and efficiently), supporting the need to include librarians on the teaching team.
REFERENCES
FUNDING AND CONFLICTS OF INTEREST
The authors have no funding sources or identified conflicts of interest to declare.
Author notes
Concept development: KW, BD, AO, MS. Design: BD, AO, DO, MS. Supervision: KW, BD, MS. Data collection/processing: KW, BD. Analysis/interpretation: KW, BD, MS. Literature search: KW, BD, DO. Writing: KW, BD, MS. Critical review: BD, AO, DO, MS.