Objective

To report the development and initial testing of a questionnaire designed to assess the concept of learning alignment within chiropractic college courses.

Methods

A 36-item questionnaire, the Educator's Learning Alignment Instrument (ELAI), was created to evaluate how learning goals, course activities, and assessments align within a college course. Questionnaire development was informed by learning theories, and the instrument was tested using a 2-phase electronic survey among chiropractic college faculty. Phase 1 included completing the ELAI for a currently implemented course. Phase 2 included questions about confidential reports generated from ELAI data.

Results

Thirty-one of 46 (67%) respondents completed an ELAI. Twelve (38%) participated in phase 2. Twenty-one (68%) courses demonstrated consistent learning focus across goals, activities, and assessments. Aggregate data from early, middle, and late chiropractic program courses revealed progressive shifts toward higher-level learning. Eighty-seven percent of courses contained 1 or more individual learning areas with potentially misaligned goals, activities, or assessment. Ninety-seven percent of respondents completed ELAI questions within 20 minutes. Most (87%) phase 2 respondents noted the report accurately reflected the course. Sixty-seven percent of phase 2 respondents agreed that confidential reports provided useful information to inform course design.

Conclusion

The ELAI is a nonburdensome instrument that can facilitate faculty reflection on how learning alignment concepts are applied in a course and provide novel data for assessing the general learning focus within college courses and programs. Results indicate ELAI questions can be revised to improve clarity. Additional research comparing ELAI responses from experts, peer educators, and students is recommended.

INTRODUCTION

Instructional alignment is defined as the extent to which specific aspects of the learning environment match among learning outcomes, instructional processes, and assessment.1,2 An example of alignment can be observed in a course focused on learning a specific painting technique (eg, the learning goal or outcome). Such a course likely includes painting practice, and assessment is likely to include demonstration of painting skill. However, if assessment instead involves a written test, it cannot be known whether learners reached the learning goal because knowledge rather than performance is measured. In such a course, assessment is “misaligned,”3,4 potentially impeding learning and rendering assessment invalid, whether knowingly or unknowingly.5

The concept that learning theory should guide critical aspects of course design in health professions education can be considered a best practice recommendation.6–10 Learning consistent with behaviorism theory includes memorizing facts, developing motor skills, and learning to consistently follow protocols. Assessment focuses on testing memory and discriminatory ability and on evaluating skills that do not require higher-order thinking.9 Learning most consistent with behaviorism could be considered roughly congruent with the “knowledge/remember” and “comprehend/understand” levels of Bloom's and Anderson and Krathwohl's cognitive learning taxonomies.11,12

Learning consistent with cognitivism theory is focused on improving how learners think.6  Course activities focus on understanding learner thought processes. Assessment includes writing, analysis of problem-solving strategies, concept mapping, and hierarchical relationships.13  Learning through a cognitivism model is generally consistent with the middle levels of Bloom's and Anderson and Krathwohl's cognitive learning taxonomies.11,12 

Constructivism, the 3rd major learning theory, focuses on how learners create meaning from experience. Learning occurs within real-world or experiential environments and is generally consistent with middle and higher levels identified in Bloom's and Anderson and Krathwohl's learning taxonomies.8,11,12  Learning activities include internships, discussion, writing, project completion, and other measures of performance in work settings.6,9,14 

Within chiropractic training programs, learning consistent with all 3 major learning theories (behaviorism, cognitivism, and constructivism) occurs. Memory-based learning and motor skill development (behaviorism) comprise a substantial part of curricula.15–18 Learning how to think through problems and organize information (cognitivism) is an important part of developing diagnostic and clinical decision-making skills.19,20 Experiential learning occurs through several mechanisms, including internships.18,21 These disparate types of learning, which may occur simultaneously, require different learning activities and assessment methods.

Research suggests improved student engagement and deeper learning occur within courses purposely designed with alignment principles.22 Student satisfaction, grades, and participation in learning can also improve in learning-aligned courses.23,24 Cook et al,25 in a meta-analysis of studies conducted in health professions education settings, reported that course design plays a significant and measurable role in learning. This evidence suggests educators need resources to make evidence-informed course design decisions. An efficient tool assessing individual course alignment could provide a convenient method for (1) evaluating courses in light of learning theories and (2) informing research focused on better understanding how course design influences learning and related factors such as student satisfaction, anxiety, and motivation.26–31

The purpose of this article is to report the development and initial testing of an instrument designed to assess the concept of applied alignment within college courses. Specifically, this article reports (1) the steps included in instrument development, (2) the feasibility of instrument completion among college faculty unfamiliar with assessing alignment, (3) descriptive data analysis methods describing individual and aggregate courses, (4) respondent perceptions of course profile reports and their capacity to facilitate reflection on alignment, and (5) a summary feasibility assessment with recommendations for further development.

METHODS

This study was conducted on the Davenport campus of Palmer College of Chiropractic, a regionally and programmatically accredited, private, postgraduate educational institution in the United States with a curriculum primarily dedicated to granting a chiropractic degree. A small undergraduate program provides select courses leading to a bachelor of science degree, and a small associate degree program in chiropractic technology is also offered. The main curriculum includes 10 sequential trimesters and over 70 courses. Trimesters 1–4 were identified as “early,” trimesters 5–7 as “middle,” and trimesters 8–10 as “late” program courses. The bachelor of science program includes 8 general science courses, and the chiropractic technologist program includes 19 courses. This study was reviewed for both ethical and legal considerations and determined to be exempt by the institutional review boards of Palmer College of Chiropractic and Jefferson College of Health Sciences. Exempt status was granted consistent with exemption 1 as outlined by Department of Health and Human Services guidelines, 45 CFR Part 46.101(b).

Instrument Development

The Educator's Learning Alignment Instrument (ELAI) was developed by the authors as a 36-item questionnaire to be completed by an educator who answers questions about a current course. The ELAI is available online as Appendix A, as are all of the appendices for this paper, at www.journalchiroed.com. ELAI questions were divided into 3 sets of 12: the first asked about learning goals, the second about course-related activities, and the third about assessment. Each set of 12 questions was equally subdivided into 3 groups of 4 questions designed to encompass most types of learning generally described by the 3 main learning theories (behaviorism, cognitivism, and constructivism) and to describe learning at all levels of Bloom's cognitive and psychomotor taxonomies.11

The first 4 questions of each set asked about the amount of focus placed on learning factual discrimination, motor skill development, protocol use, and information recall. The next 4 questions asked about the types of learning generally described through a cognitivism model. Respondents reported the amount of focus on learning to compare course-related information, developing problem-solving strategies, mentally processing course information, and organizing course information.

The final 4 questions asked about learning that can be generally described through a constructivism model by assessing the amount of focus within a course on creating new knowledge through scientific research, experiential learning in real-world environments, learning through mentorship, and learning to communicate complex information in new and unique ways. Though the types of learning included in the questionnaire were generally consistent with the 3 major learning theories and taxonomy levels, the ELAI did not identify or measure alignment within these specific classifications because types of learning can potentially relate to more than 1 theory or level. Instead, the ELAI was designed to measure the relative consistency of focus (alignment) for the included levels of learning across learning goals/outcomes, course activities, and assessments within a course.
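For readers who prefer a concrete representation, the question layout described above can be sketched as a simple data structure: 3 course components, each probed by the same 12 types of learning grouped into 3 theory-oriented sets of 4. The Python sketch below is a hypothetical illustration using paraphrased labels, not the exact ELAI item wording.

```python
# Hypothetical sketch of the ELAI layout described above (paraphrased labels,
# not the exact item wording): 3 components x 12 learning types = 36 questions.
COMPONENTS = ["learning goals", "course activities", "assessment"]

LEARNING_TYPES = {
    "behaviorism-oriented": ["factual discrimination", "motor skill development",
                             "protocol use", "information recall"],
    "cognitivism-oriented": ["comparing information", "problem-solving strategies",
                             "mental processing", "organizing information"],
    "constructivism-oriented": ["knowledge creation through research", "experiential learning",
                                "learning through mentorship", "communicating complex information"],
}

# One question exists for every (component, learning type) pair.
questions = [(component, learning_type)
             for component in COMPONENTS
             for group in LEARNING_TYPES.values()
             for learning_type in group]
assert len(questions) == 36  # 3 sets of 12
```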

Development was guided by best practice principles of survey questionnaire design.32,33  ELAI questions were developed through an iterative process with feedback from faculty at the primary author's institution. Informal feedback regarding perceptions of question clarity, intent, and the meaning of response scales was offered by faculty who completed draft questions in different stages of development. Psychometric evaluation by an expert in survey research was also obtained. ELAI questions, data analysis, and feasibility testing methods were further refined, and question content validity was confirmed through feedback from a doctoral program academic oversight committee. Each stage resulted in progressive instrument refinement.

ELAI questions were answered by checking an answer box on a 5-point ordinal scale indicating the amount of focus for each type of learning within a course (All, Most, About ½, Marginally, No/None). An “All” response indicated 90%–100% of focus within the course; a “Most” response indicated 60%–89%; an “About ½” response, 40%–59%; a “Marginally” response, 11%–39%; and a “No/None” response, 0%–10%. ELAI answers to each set of 4 questions were averaged, plotted using the middle percentage for a chosen response (eg, No/None = 5%, Marginally = 25%, About ½ = 50%, Most = 75%, and All = 95%), and graphically profiled in what was called a global profile. Each learning goal question “aligned” with a matching course activity and assessment question, as demonstrated in the following linked questions pertaining to memory-based learning.

  • Do learning outcomes focus on recall of course-related information? (Learning goal question)

  • Do learners engage in information recall exercises? (Course activity question)

  • Does assessment measure course-related information recall? (Assessment question)

Phase 1 of this study began with an email invitation to all faculty serving as lead instructor for an undergraduate or graduate course representing a total potential population of 46. The invitation included information about the project and an electronic link to an online survey. Respondents were directed to a Web page further explaining the study and ending with a request for consent. Those who continued beyond the initial Web page were directed to ELAI questions.

Though many potential respondents taught more than 1 course, each was asked to answer ELAI questions about a single current course for which they served as lead instructor. Courses were designated by course name and identification code in the invitation message and on the initial ELAI question page. Courses were chosen so that all 46 faculty teaching as lead instructor in a course were included, maximizing the number of possible respondents. Thus, this project used an a priori determined, nonrandom sample intended to maximize the number of individual faculty respondents and to stratify responses so that undergraduate and all levels of graduate courses were represented.

Two additional questions asked respondents how well they understood the ELAI questions and how much time was required to complete them. One open-text question asked respondents to describe challenges encountered in answering questions (Appendix B). Individual ELAI data were exported and used to generate a confidential course profile report, which respondents received individually via email. The course profile report provided results with explanatory text, graphs with interpretations, general strategies for improving potential misalignment, and tables and references supporting course alignment concepts (Appendix C).

Phase 2 included a review of the course profile report by the respondent and an invitation to complete an anonymous electronic 6-question survey about the perceived usefulness, accuracy, and understandability of the report (Appendix D). Questions were answered using a 5-point scale ranging from strongly agree to strongly disagree. Two free-text response questions asked: “What information was most useful to you?” and “What suggestions do you have for improvement?”

Data Analysis and Management

We used a descriptive analysis for ELAI feasibility testing because (1) descriptive analysis most closely matches response choices, (2) advanced statistical analysis could be inappropriate without known face, content, and discriminative validity, and (3) the purpose of the analysis was to convert responses for descriptive display in course profile reports. ELAI question responses were converted to a single mid-response range percentage. The “All” response, representing 90%–100% of course goals, activities, and assessment, was scored as 95%. The “Most (60%–89%)” response was scored as 75%; the “About ½ (40%–59%)” response as 50%; the “Marginally (11%–39%)” response as 25%; and the “No/None (0%–10%)” response as 5%. Global profiles, generated as column charts using graphic functions within Excel, were created to display the general focus for different types of learning indicated by learning goals, course activities, and assessment (Figs. 1 and 2). Aggregate data were similarly used to generate summary global profile graphs for early (trimesters 1–4), middle (trimesters 5–7), and late (trimesters 8–10) program courses (Fig. 3).
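To make the conversion concrete, the following Python sketch maps each ordinal response to its mid-range percentage and averages each 4-question set, mirroring the Excel-based workflow described above. It is a minimal illustrative reconstruction with hypothetical response data, not the authors' actual analysis code.

```python
# Minimal sketch of the described scoring: map ordinal responses to mid-range
# percentages, then average each 4-question set for the global profile.
MID_RANGE = {"All": 95, "Most": 75, "About 1/2": 50, "Marginally": 25, "No/None": 5}

def global_profile(responses):
    """responses: 12 ordinal answers for one component (goals, activities, or assessment),
    ordered as 4 behaviorism-, 4 cognitivism-, then 4 constructivism-oriented questions.
    Returns the mean mid-range percentage for each 4-question set."""
    values = [MID_RANGE[r] for r in responses]
    return [sum(values[i:i + 4]) / 4 for i in (0, 4, 8)]

# Hypothetical example: a course whose goals emphasize foundational (memory-based) learning.
goal_answers = ["Most", "Most", "About 1/2", "All",                     # behaviorism-oriented set
                "Marginally", "About 1/2", "Marginally", "Marginally",  # cognitivism-oriented set
                "No/None", "No/None", "Marginally", "No/None"]          # constructivism-oriented set
print(global_profile(goal_answers))  # -> [73.75, 31.25, 10.0], one column per 4-question set
```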

Figure 1-

Exemplar global profile of an early chiropractic program (trimester 1–4) lecture-based course suggesting general alignment of focus among course goals, activities, and assessments. General alignment is inferred when columns showing the relative focus of learning over “Goals” are proportionally similar to those displayed over “Activity” and “Assessment.”

Figure 2-

Exemplar global profile of a mid-program (trimester 5–7) lecture-based course demonstrating general alignment of goals with course activities. Columns representing relative focus of learning types over “Goals” and “Activities” are proportionally similar. Potential misalignment of “Assessment” is noted in columns that are not proportionally similar. Instead, the greatest assessment focus is on discrimination, motor skills, protocol(s), and/or recall rather than on comparison, problem solving, mental processing, and/or knowledge organization, the predominant focus for “Goals” and “Activities.”

Figure 3-

Aggregate global profiles displaying learning focus for early, middle, and late chiropractic program courses within a 10-trimester program. (Left) Aggregate profile of early program courses (trimesters 1–4), which primarily include basic science topics such as gross anatomy, biochemistry, and cell physiology, showing a stepped pattern with the greatest focus on foundational types of learning (eg, protocol[s], motor skill, recall). (Center) Aggregate global profile of mid-program courses (trimesters 5–7) showing greater learning focus on comparison, problem solving, mental processing, and experiential learning compared with early program courses. (Right) Aggregate global profile of late program courses (trimesters 8–10) demonstrating a higher focus on experiential learning and a more even focus on all learning types compared with early and mid-program courses. Though late program courses include greater emphasis on types of learning commonly considered higher level, foundational types of learning (discrimination, motor skill, protocol[s], and recall) are consistently present in all curricular phases. Note: early, middle, and late program profiles were not designed to be compared quantitatively with each other. Total percentages should be interpreted relative to the columns within each 3-column group and individual profile rather than between profiles.

Linked-question profiles were graphically analyzed by displaying a column chart for each ELAI question response expressed as a single, mid-response range percentage. Linked-question profiles provided more specific data regarding the 12 types of learning included in the instrument. Feasibility and faculty perception survey questions requiring free-text responses were grouped by question and analyzed qualitatively by categorizing into thematic groups (content analysis). Ordinal scale items were analyzed using frequencies (Fig. 4).
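The linked-question check can be illustrated with a short Python sketch that flags a 3-question set as potentially misaligned when any pair of responses differs by more than 25 percentage points (the threshold applied in the Results below). This is an illustrative reconstruction with hypothetical data, not the authors' analysis code.

```python
# Illustrative reconstruction of the linked-question check: for each of the 12 learning
# types, compare the goal, activity, and assessment responses (as mid-range percentages)
# and flag sets whose spread exceeds 25 percentage points.
def flag_misalignment(goals, activities, assessments, threshold=25):
    """Each argument is a list of 12 mid-range percentages (one per learning type).
    Returns the indices of learning types with potential misalignment."""
    flagged = []
    for i, linked_set in enumerate(zip(goals, activities, assessments)):
        if max(linked_set) - min(linked_set) > threshold:
            flagged.append(i)
    return flagged

# Hypothetical course: learning type 3 is prominent in goals and activities but not assessed.
goals       = [75, 75, 50, 75, 25, 25, 25, 25, 5, 5, 5, 5]
activities  = [75, 75, 50, 75, 25, 25, 25, 25, 5, 5, 5, 5]
assessments = [75, 75, 50,  5, 25, 25, 25, 25, 5, 5, 5, 5]
print(flag_misalignment(goals, activities, assessments))  # -> [3]
```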

Figure 4-

Linked-question profile of an early program (trimester 1–4) course. Each column displays the response to an individual question (total = 36). Columns are displayed in sets of 3, indicating the amount of focus placed on the type of learning within goals, activities, and assessment. Responses differing by more than 25% within a 3-column group suggest possible misalignment. Data suggest learning to organize course-related information is prominent in learning goals and course activities. However, assessment of this ability does not occur.

Data were collected in REDCap (Research Electronic Data Capture, Vanderbilt University, Nashville, TN). REDCap is a secure Web-based application designed for research data capture, providing (1) validated data entry; (2) audit trails for tracking data manipulation and export procedures; (3) automated export to common statistical packages; and (4) procedures for importing data from external sources. Phase 1 respondents were identified in REDCap data. When ELAI profiles were sent back to respondents, identifying information was cross-referenced with a faculty email and course identification list provided by the college registrar. Faculty perception survey data (phase 2) were reported anonymously in REDCap.

RESULTS

Thirty-one of the 46 invited faculty (67%) completed ELAI questions. All respondents received a course profile report with an invitation to complete the faculty perception survey. Twelve (38%) completed the faculty perception survey questions. Table 1 displays sample characteristics.

Table 1-

Sample Characteristics (n = 31)

Global Profiles

Figure 1 displays an exemplar global profile of a generally aligned course. Responses for each 4-question set generally representing major learning theories were averaged and graphically displayed. General alignment was suggested when columns showing the focus for learning included in course “Goals” were proportionally similar to columns signifying the learning foci of course “Activities” and “Assessments.” Because this was the first test of the ELAI and because adjacent question responses were potentially similar, data were displayed descriptively. Cutoff values defining potentially good or poor global alignment were intentionally avoided due to the exploratory nature of this study. Instead, this determination was made subjectively by 1 author. Figure 2 displays a global profile indicating potential misalignment. In this example, data signify that the greatest proportional focus (tallest column) for “Assessment” is on recall, discrimination, protocols, and/or motor skills. However, the greatest proportional focus among “Goals” and “Activities” is on comparison, problem solving, mental processing, and/or knowledge organization.

Aggregate data from the 12 early chiropractic program courses included in this study, focused mainly on basic sciences, showed a stepped pattern with the highest focus on learning generally consistent with behaviorism and a progressively lower focus on other types of learning. Trends in aggregated global profiles of later trimester courses show a shift toward higher levels of learning, generally consistent with cognitivism and experiential learning (Fig. 3). Twenty-one (68%) global profiles suggested generally good alignment.

Linked-Question Profiles

Figure 4 displays a linked-question profile. Each ELAI response is graphically displayed in 3-column linked-question sets. Because adjacent responses for ELAI questions may differ minimally, lack of alignment within any linked-question set (3 questions) was suspected only when 1 response differed from another by more than 25%. Table 2 displays frequencies of potential misalignment within linked-question profiles.

Table 2-

Linked-Question Profile Characteristics (n = 31), n (%)

Feasibility Questions

Table 3 displays results of scaled feasibility questions. Free-text responses to the question “Please describe any challenge you encountered while answering questions” most commonly focused on difficulty understanding some ELAI questions.

Table 3-

Feasibility and Faculty Perception Survey Responses

“Some wording seemed unrelated to my class. However, I made the best choice.”

“I wasn't quite sure how to put a percentage on the assessment questions.”

Some respondents had no difficulty understanding questions.

“None [challenges], the questions were straightforward.”

Twelve (38%) respondents answered questions about the course profile report (Table 3).

Virtually all free-text responses to the question “What information was most useful?” centered on the theme of how the report facilitated reflection or informed course changes, such as:

“Brought attention to reviewing my assessments and making them more in line with the learning outcomes.”

“The report highlighted areas where there may be a lack of alignment.”

“I will use the two pages following the graphs to better develop my course. . . . I feel that this is perfect timing for seeing this type of report–to assure that I'm developing assessments that properly measure the outcomes.”

“I liked reading the information . . . that showed me ways I can incorporate different learning and assessment strategies into the course to strengthen it”

Suggestions for improvement included the need to clarify instructions regarding how to answer questions, define terms, and/or provide examples within ELAI questions.

“The questions were really hard to understand exactly what was wanted.”

“Please define protocol.”

Other responses reiterated the practical usefulness of the report for informing course design.

“I find this report to be useful and simple to read.”

“I would be interested in looking at having a better balance between my course activities, assessment, and outcomes.”

“I thought this was an excellent activity to review all of our outcomes, activities, and assessments. I would like to see this report for all of my courses.”

DISCUSSION

This study reports early-stage testing and validation of a new method for assessing the concept of learning alignment. Few tools are currently available to assist in designing aligned courses. Davis and Arend34 developed a simple 3-column tool to document and match learning goals with supporting theory and teaching methods. Fink35 proposed a similar tool to document and align learning goals, course activities, and assessments. Blumberg36 proposed a table in which educators plot key information to compare taxonomy levels and critically evaluate a course for alignment. Ramesh, Sasikumar, and Iyer37 developed a software tool that uses data abstracted from a syllabus to display alignment between learning objectives and assessments. The Quality Matters process uses trained peer reviewers to assess whether online and blended courses are constructed according to alignment principles.38 To the authors' knowledge, the ELAI is the only tool that assesses alignment of 3 major course components using an efficient educator-completed questionnaire, provides semiquantitative analysis and summary reports with example correction strategies, and offers alignment-oriented graphical data.

Although faculty are considered subject matter experts, constructing courses to facilitate effective learning does not necessarily come automatically.39,40 The ELAI was initially tested with faculty teaching within a chiropractic program, which contains characteristics consistent with learning through the 3 major learning theory models (behaviorism, cognitivism, and constructivism). Though tested within a chiropractic educational setting, the ELAI focuses on learning alignment rather than topical content, suggesting potential broader applicability within higher education.

Initially, we used an iterative and informal process to develop an instrument that was informed by learning theories and then tested among naive educators to assess both face and content validity (ie, the instrument appears to measure what it is intended to measure, and the instrument contains appropriate content domains).41,42 Though some respondents had difficulty answering some questions, the majority found the ELAI and the subsequent report to be informative and a good approximation of the course profiled. Preliminary analysis methods produced aggregate course profiles that generally matched what would be expected in a professional chiropractic program, where the learning focus progressively moves from memory-dominated coursework toward problem solving, complex decision making, and experiential learning.

Early (trimesters 1–4) and mid-program (trimesters 5–7) data showed relatively equal focus on foundational and mid-level learning, with little emphasis on experiential learning. Late program course components showed greater emphasis on experiential learning (Fig. 3). These findings suggest early discriminant validity (ie, evidence that 1 concept is distinct from a closely related one).41 Findings also suggest instrument clarity can be improved without adding significant time burden by revising instructions and adding brief definitions and/or examples to questions. Given the limitations of the data analysis methods, results indicate the current value of the ELAI lies in its ability to facilitate faculty reflection on a course. Further instrument revision, refinement of data analysis methods, and reliability testing should occur before additional uses can be recommended.

Limitations

Because this study focused primarily on instrument development, quantitative results should be viewed with caution, bearing in mind the relatively small sample and that several participants noted difficulty understanding some questions. As with any survey-based research, results are limited by the response rate. However, there was no readily apparent reason to suspect responder bias. The initial response rate of 67% and the follow-up rate of 38% were higher than the 36% mean rate reported in a meta-analysis of organizational surveys, and response rate alone is not sufficient to judge the quality or validity of study results.43,44

Although informed by learning theory and taxonomies, the ELAI does not measure alignment strictly defined within these domains. The types of learning included were intended to encapsulate most potential types occurring within college courses without overlap. Some types of learning may not be captured with the current instrument. Because the types of learning included in ELAI questions can be variously applied and match more than 1 learning taxonomy level or theory, a study focused on establishing the validity of questions with respect to learning theory or taxonomy level was neither consistent with the purpose of the instrument nor methodologically appropriate.

Results are also limited by faculty perceptions of the courses they designed and/or led. Some respondents may have perceived or defined aspects of a course differently from how the course was actually implemented. In a study reported by Black and Wingfield,39 88% of marketing and management faculty respondents described their courses as active learning platforms, whereas the authors identified only 28% as having active learning characteristics.

To mitigate the potential for response bias arising from a perception that 1 or more types of learning are “better” than others, ELAI questions avoid mentioning learning taxonomy levels or learning theories. Linked questions were purposely spaced apart on separate pages of the electronic survey, which asks all 12 learning goal questions before proceeding to separate pages containing the 12 course activity and, finally, the 12 assessment questions. Viewing linked-question responses on a prior page required logging out, generating a password, and logging in again. Because 87% of linked-question profiles contained at least 1 learning type with a 25% or greater difference in focus among linked questions, it is unlikely respondents used this method to equalize responses.

Because this study sought to understand where such limitations exist, these findings have been used to inform a revised instrument and can inform future studies assessing reliability and further validation. Additional testing should answer questions such as: How similar are responses for the same course over time? How generalizable is the ELAI when used in different college settings? Do students rate courses similarly to instructors? Does the ELAI similarly assess courses in graduate and undergraduate programs? Do ELAI results inform course design changes that result in improved learning?

CONCLUSION

This study demonstrated the feasibility of using the ELAI among chiropractic educators and revealed areas for instrument improvement. Interpreted data from ELAI questionnaire responses facilitated useful reflection and raised awareness of learning alignment concepts among respondents. Subsequent research should consider reliability testing by comparing ELAI responses from experts, peer educators, and students.

ACKNOWLEDGMENTS

The authors acknowledge Francis Farrell, PhD, for his scientific advice and critical review of this project and Min Wang, PhD, for her construction and management of the electronic data collection process.

FUNDING AND CONFLICTS OF INTEREST

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors. The authors declare no conflicts of interest.

REFERENCES

1. Cohen SA. Instructional alignment: searching for a magic bullet. Educ Res. 1987;16(8):16–20.
2. Biggs J. Enhancing teaching through constructive alignment. High Educ. 1996;32(3):347–364.
3. Gibbs G, Simpson C. Conditions under which assessment supports students' learning. Learn Teach High Educ. 2004;1(1):3–31.
4. Rust C. The impact of assessment on student learning: how can the research literature practically help to inform the development of departmental assessment strategies and learner-centred assessment practices? Act Learn High Educ. 2002;3(2):145–158.
5. Nilson L. Teaching at Its Best: A Research-Based Resource for College Instructors. 4th ed. San Francisco, CA: Jossey-Bass; 2016.
6. Ertmer P, Newby T. Behaviorism, cognitivism, constructivism: comparing critical features from an instructional design perspective. Perform Improv Q. 1993;6(4):50–72.
7. Taylor DCM, Hamdy H. Adult learning theories: implications for learning and teaching in medical education: AMEE Guide No. 83. Med Teach. 2013;35(11):e1561–e1572.
8. Kay D, Kibble J. Learning theories 101: application to everyday teaching and scholarship. Adv Physiol Educ. 2016;40(1):17–25.
9. Khalil MK, Elkhider IA. Applying learning theories and instructional design models for effective instruction. Adv Physiol Educ. 2016;40(2):147–156.
10. Yardley S, Teunissen PW, Dornan T. Experiential learning: AMEE Guide No. 63. Med Teach. 2012;34(2):e102–e115. doi:10.3109/0142159X.2012.650741.
11. Bloom B. Taxonomy of Educational Objectives: The Classification of Educational Goals. Vol 1. New York, NY: McKay; 1956.
12. Anderson L, Krathwohl D. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York, NY: Longman; 2001.
13. Yilmaz K. The cognitive perspective on learning: its theoretical underpinnings and implications for classroom practices. Clearing House. 2011;84(5):204–212.
14. Karpouza E, Emvalotis A. Exploring the teacher-student relationship in graduate education: a constructivist grounded theory. Teach High Educ. 2019;24(2):121–140.
15. Triano JJ, Descarreaux M, Dugas C. Biomechanics–review of approaches for performance training in spinal manipulation. J Electromyogr Kinesiol. 2012;22(5):732–739.
16. Harvey M-P, Wynd S, Richardson L, Dugas C, Descarreaux M. Learning spinal manipulation: a comparison of two teaching models. J Chiropr Educ. 2011;25(2):125–131.
17. Ward KP. Horizontal integration of the basic sciences in the chiropractic curriculum. J Chiropr Educ. 2010;24(2):194–197.
18. Coulter I, Adams A, Coggan P, Wilkes M, Gonyea M. A comparative study of chiropractic and medical education. Altern Ther Health Med. 1998;4(5):64–75.
19. Boysen JC, Shannon ZK, Khan YA, Wells BM, Vining RD. A graphical clinical decision aid for managing imaging report information. J Chiropr Educ. 2017;32(1):43–49.
20. Rose KA, Babajanian J. The interrater reliability of an objective structured practical examination in measuring the clinical reasoning ability of chiropractic students. J Chiropr Educ. 2016;30(2):99–103.
21. Council on Chiropractic Education. CCE Accreditation Standards: Principles, Processes & Requirements for Accreditation. Scottsdale, AZ: The Council; 2018.
22. McCann M. Constructive alignment in economics teaching: a reflection on effective implementation. Teach High Educ. 2017;22(3):336–348.
23. Larkin H, Richardson B. Creating high challenge/high support academic environments through constructive alignment: student outcomes. Teach High Educ. 2013;18(2):192–204.
24. Szili G, Sobels J. Reflections on the efficacy of a constructivist approach to teaching and learning in a first-year bachelor of environmental management topic. J Geogr High Educ. 2011;35(4):499–512.
25. Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: systematic review and meta-analysis. Med Teach. 2013;35:e867–e898.
26. Hsieh TL. Motivation matters? The relationship among different types of learning motivation, engagement behaviors and learning outcomes of undergraduate students in Taiwan. High Educ. 2014;68(3):417–433.
27. O'Callaghan A. Emotional congruence in learning and health encounters in medicine: addressing an aspect of the hidden curriculum. Adv Health Sci Educ. 2013;18(2):305–317.
28. Sockalingam N. The relation between student satisfaction and student performance in blended learning curricula. Int J Learn. 2013;18(12):121–134.
29. Liu OL, Bridgeman B, Adler RM. Measuring learning outcomes in higher education: motivation matters. Educ Res. 2012;41(9):352–362.
30. Rodger S, Murray H, Cummings A. Effects of teacher clarity and student anxiety on student outcomes. Teach High Educ. 2007;12(1):91–104.
31. Legon R. Measuring the impact of the Quality Matters Rubric™: a discussion of possibilities. Am J Distance Educ. 2015;29(3):166–173.
32. Krosnick J, Presser S. Questionnaire design. In: The Palgrave Handbook of Survey Research. 2nd ed. London: Palgrave Macmillan; 2010.
33. Leitz P. Research into questionnaire design, a summary of the literature. Int J Mark Res. 2010;52(2):249–272.
34. Davis JR, Arend BD. Facilitating Seven Ways of Learning: A Resource for More Purposeful, Effective, and Enjoyable College Teaching. Sterling, VA: Stylus; 2013.
35. Fink LD. The power of course design to increase student engagement and learning. Assoc Am Coll Univ. 2007;(Winter):13–17.
36. Blumberg P. Maximizing learning through course alignment and experience with different types of knowledge. Innov High Educ. 2009;34(2):93–103.
37. Ramesh R, Sasikumar M, Iyer S. A software tool to measure the alignment of assessment instrument with a set of learning objectives of a course. In: 2016 IEEE 16th International Conference on Advanced Learning Technologies (ICALT). Piscataway, NJ: IEEE; 2016:194–198.
38. Shattuck K, Zimmerman WA, Adair D. Continuous improvement of the QM rubric and review processes: scholarship of integration and application. Internet Learn. 2014;3(1):25–34.
39. Black GS, Wingfield SS. Using the most effective teaching methods: a comparison of marketing and management classrooms. J Adv Mark Educ. 2008;12:1–9.
40. Black GS, Daughtrey CL, Lewis JS. The importance of course design on classroom performance of marketing students. Mark Educ Rev. 2014;24(3):213–226.
41. Bolarinwa O. Principles and methods of validity and reliability testing of questionnaires used in social and health science research. Niger Postgrad Med J. 2015;22(4):195–201.
42. Moores KL, Jones GL, Radley SC. Development of an instrument to measure face validity, feasibility and utility of patient questionnaire use during health care: the QQ-10. Int J Qual Health Care. 2012;24(5):517–524.
43. Anseel F, Schollaert E, Choragwicka B. Response rates in organizational science, 1995–2008: a meta-analytic review and guidelines for survey researchers. J Bus Psychol. 2010;25(3):335–349.
44. Morton SMB, Robinson EM, Carr PEA. In the 21st century, what is an acceptable response rate? Aust N Z J Public Health. 2012;36(2):106–108.

Author notes

Robert Vining is the associate dean for clinical research at the Palmer Center for Chiropractic Research (741 Brady Street, Davenport, IA 52803; robert.vining@palmer.edu). Timothy Millard is the Assistant Director of Institutional Research at Radford University Carilion (101 Elm Avenue SE, Roanoke, VA 24013; tmillard@radford.edu).

Concept development: RV. Design: RV. Supervision: RV, TM. Data collection/processing: RV. Analysis/interpretation: RV, TM. Literature search: RV. Writing: RV, TM. Critical review: RV, TM.
