Creating advising curricula through backward design ensures that learning objectives remain central to the process and enables those in advising units to design comprehensive assessment plans for continued curricular improvement. By incorporating measures to observe student learning directly, advisors can evaluate their curriculum objectively to ensure students achieve desired learning outcomes. An advising unit created a proactive advising curriculum for academically at-risk students through backward design that included multiple assessment measures. Students in four categories of academic risk were targeted for intervention. Through the evaluation of direct-learning evidence gathered through assessment, the advising unit improved the advising curriculum, illustrating a process of intentional curriculum design and assessment to improve student learning.

Advisors are expected to educate within an advising-as-teaching model, aiding and supporting students in creating context for their chosen curriculum. In this role, they must initially provide foundational, albeit sometimes prescriptive, information to scaffold advisees' learning experiences and facilitate future growth and development (Lowenstein, 2005). Only with the appropriate foundational knowledge can advisors support student development and exploration within an advising setting. Aiken-Wisniewski, Smith, and Troxel (2010) suggested that the literature lacks evidence of practitioner-level success in areas such as program assessment and evaluation, citing a need for advising units to design curricula with intentionality; in particular, stakeholders should focus on defining and assessing student learning outcomes (SLOs). In this article, we share our successful approach to proactive advising improvements through an intentional curriculum design that scaffolds and augments foundational knowledge with application to aid student development and growth. The approach is based on backward design.

Advisors are increasingly challenged to provide more robust advising and show evidence of student learning, often without increased resources to meet this objective. Therefore, they must intentionally address student learning and development through the delivery of academic advising (Kimball & Campbell, 2013), particularly when taking into account student attrition prior to graduation. Students who fail to persist to graduation represent a significant loss of institutional investment in time, resources, and class seats that would otherwise be available to students likely to graduate (Renzulli, 2015; Weissmann, 2012). Consequently, institutional leaders must identify ways to structure advising that provide the highest level of student support and impact (Schuh, 2008).

Advising Curriculum and Backward Design

Although a seemingly daunting task, the impact of developing an intentional advising curriculum should not be underestimated, particularly with regard to student persistence. The NACADA Concept of Academic Advising clearly articulates the need for advising programs to include a curriculum that incorporates the mission and values of the institution, pedagogy appropriate to the curriculum, and defined SLOs (NACADA: The Global Community for Academic Advising [NACADA], 2006). An effective advising curriculum is deliberately designed to keep the central mission of the institution and the advising program at the core while content is scaffolded at multiple levels to facilitate student learning (Hemwall & Trachte, 2005). Consistent with a student's degree and course curriculum, an intentional advising curriculum empowers students to develop advising experiences that lead to clear learning outcomes and complement academic content. The use of backward design supports this intentionality.

Backward design describes a learning-centered approach to curriculum design. When used effectively, it helps to ensure that students experience well-rounded and comprehensive learning opportunities while achieving the intended SLOs (Wiggins & McTighe, 2001). According to the central tenet of backward design, the curriculum is created around predefined SLOs, in the order opposite to the way students experience it. Hence, the first step involves identifying the desired outcomes, followed by defining the mode of feedback and assessment; only then are the content and delivery mechanisms defined (Fink, 2003). Only after SLOs are defined can a curriculum be mapped to ensure that appropriate learning opportunities lead to achievement of each outcome. Careful curriculum mapping aids instructors or, in this case, advisors and advising programs in identifying appropriate learning opportunities and the corresponding evidence to assess student learning (Fink, 2003). Through backward design, curriculum creators remain continuously focused on SLOs such that learning opportunities are structured to support outcome achievement. In this regard, assessment proves essential to backward design.

Assessing Student Learning in Advising

Regular and systematic review of learning evidence—assessment—to ensure that advising curricula address the stated SLOs constitutes an integral part of a healthy learning-centered advising curriculum (Council for the Advancement of Standards in Higher Education [CAS], 2015). However, Powers, Carlstrom, and Hughey (2014), in a national survey of advisors that indicated variable execution of regular assessment cycles in advising units, cited cases of failure to close the assessment loop. He and Hutson (2017) suggested that advisors need to incorporate assessment into their advising practice as a mechanism to demonstrate value to the institution and to contribute to the scholarship of advising. CAS has set similar standards for self-evaluation, which encourage advisors and administrators to hold themselves accountable for the use of assessment in their programs (White, 2006). Without a systematic assessment plan to ensure students are achieving the desired learning outcomes, advising does not reflect teaching (Banta, Hansen, Black, & Jackson, 2002). For optimal results, all individuals involved in content delivery, which often takes the form of student appointments for advising, should be involved in the curriculum design and assessment processes (Astin et al., 1992). Practitioners must understand the intentional nature of each curricular component to execute it successfully. In addition, through involvement in the review of learning evidence, stakeholders acquire insights into student performance, including information about characteristics of students who fail to meet important learning benchmarks (Aiken-Wisniewski, Campbell, et al., 2010; Astin et al., 1992).

To make informed changes in curricula, stakeholders undertake an assessment cycle that consists of defining SLOs, employing a mode of learning delivery, providing students opportunities to demonstrate their learning, and evaluating the learning evidence on which to base changes (Hurt, 2007; Robbins, 2016). They evaluate many types of student performance evidence, including direct and indirect measures (Aiken-Wisniewski, Campbell, et al., 2010; Campbell, Nutt, Robbins, Kirk-Kuwaye, & Higa, 2005; Robbins & Zarges, 2011). Indirect evidence is collected via student self-evaluations (e.g., from surveys or focus groups) in which participants share their perceived learning gains or experiences (Banta et al., 2002; Powers et al., 2014). Distinctly different from indirect measures, direct evidence requires students to demonstrate their knowledge or skills and does not focus on their perceptions of content mastery (Accrediting Commission for Senior Colleges & Universities Western Association of Schools & Colleges, 2002; Robbins & Zarges, 2011). Examples of direct evidence in a classroom setting include student demonstrations of learning through oral presentations, essays, and exam questions. In an advising setting, advisors can directly observe learning evidence in one-on-one advising appointments or through information collected from evaluations or assignments.

In one-on-one advising appointments, practitioners directly witness student behaviors or demonstrations of knowledge for evaluation, similar to the way an instructor observes an oral presentation or examination for scoring according to carefully defined rubrics. For objective and consistent evaluations across all reviewers rendering judgment, each reviewer must follow rubrics to determine the level of learning demonstrated through completion of a test or activity for which no simple “correct” answer or performance applies (Hurt, 2007; Mertler, 2001; Suskie, 2018). Rubric content ranges from minimal to full depending on the level of detail provided to the reviewer. To assist the evaluator in assigning a score, full rubrics contain not just scales for rating student performance but also specific descriptors associated with each scale value. Full rubrics increase interrater reliability by decreasing ambiguity (Walvoord, 2010). NACADA's Guide to Assessment of Academic Advising (Aiken-Wisniewski, Campbell, et al., 2010) provides information for creating simple rubrics to assess advising curricula.

Each type of evidence is associated with a unique contribution to the assessment process. Therefore, combining numerous and diverse measures in an assessment plan ensures a comprehensive review of any advising curriculum (Campbell & Nutt, 2008).

Proactive Advising by Backward Design

Glennen (1976) originally coined the term intrusive counseling when reporting on a successful institutional advising intervention. In intrusive advising, a student is required to meet with an academic advisor, often before the student reaches out for support, and the advisor employs counseling techniques to demonstrate care about and establish connection with the student. Earl (1988) interpreted intrusive advising to mean “deliberate structured student intervention at the first indication of academic difficulty in order to motivate a student to seek help” (p. 28), and referencing his 1987 thesis, he elaborated that “intrusive advising utilizes the systematic skills of prescriptive advising while helping to solve the major problems of developmental advising which is a student's reluctance to self-refer” (p. 28). Both Glennen and Earl referred to the positive effect of intrusive advising when used as a mechanism to demonstrate care for the student while providing the support he or she needs to succeed (Varney, 2007).

More recently, the negative connotations associated with the term intrusive have inspired advisors to relabel this intentionally holistic approach to working with students as proactive advising, which appropriately emphasizes the assertive nature of the intervention and focuses on the variety of methods that advisors integrate to aid the student in finding success (Drake, Jordan, & Miller, 2013; Varney, 2012). Practitioners leverage proactive advising in a variety of settings according to institutional information that indicates students may be at risk of departure before graduation. In general, simple mandates for students to seek advising do not equate to immediate increases in student persistence or performance. Therefore, proactive advising needs to fit within the broader context of a student's comprehensive curriculum such that student needs are met without duplication of efforts; that is, advisors use a proactive approach to meet students appropriately at their current place in the educational journey (McFarlane, 2017). The many proactive advising applications and variations in strategies are well reviewed by the contributors to Academic Advising Approaches: Strategies That Teach Students to Make the Most of College (Drake et al., 2013).

In their research, Abelman and Molina found that proactive advising success depends on the level of intrusiveness, with greater intrusiveness correlating with better grade point average (GPA) outcomes (Abelman & Molina, 2001, 2002; Molina & Abelman, 2000). Their longitudinal study largely focused on GPA and retention as success outcomes, two constructs important to institutional reputation and advising-related research. Although they maintained that content, regardless of approach, remained consistent, they found that, in general, greater intrusiveness equated with success, which raised the question: Which aspects of the advising intervention result in improved outcomes? Because the numerous examples in the literature focus on quantitative measures to determine proactive advising success, such as GPA and retention (e.g., Vander Schee, 2007), for the study presented herein, we chose to focus specifically on successful learning, independent of these other, more commonly reported metrics. With a continued focus on advising as teaching, we sought to demonstrate SLO achievement in an intentionally designed, proactive advising curriculum created through backward design. The findings point to the need for further research, beyond the scope of this study, to ascertain whether this backward-designed curriculum translates to increased performance on the more commonly reported metrics of GPA and persistence.

With backward design at the center of the approach used to design advising curricula, we demonstrate the value of intentional curriculum design for proactive advising. Assessment plays a central role in the evidence we present. By assessing academically at-risk students' knowledge of pertinent academic policy before appointments, advisors can tailor the in-appointment prescriptive information students need while leading them into exploration and development. In addition to supporting student mastery of prescriptive information, the advising curriculum incorporates and supports the NACADA Core Values of Academic Advising (NACADA, 2017b). Specifically, advisors empower students to take charge of their academic journeys while approaching each student's situation with support and care through a teaching-focused lens.

Method

Participants

The site for this preliminary exploration of intentional curriculum design was a public, land-grant institution with high research activity enrolling more than 18,000 students. The exploration focused on undergraduates in the life sciences during the Spring 2016 and Fall 2016 terms. Students with semester or cumulative GPAs less than 2.0 were invited to participate in the Academic Action Intervention (AcAc) program, a structured, mandatory advising-appointment-focused intervention. For life sciences students, AcAc was administered by an advising unit composed of three full-time academic advisors, referred to as the advising team or advising unit.

The AcAc program was designed to support students struggling academically because such students are typically considered at risk of leaving the institution prior to degree completion (Tinto, 1993). For this intervention, an academically at-risk student was defined as any degree-seeking undergraduate pursuing a life sciences major who earned a cumulative or semester GPA less than 2.0. At the conclusion of the Spring and Fall semesters of 2016, these students received registration holds, which required them to complete the AcAc program curriculum or change (or undeclare) their majors before resuming registration activities (enrolling in courses for the following semester or dropping courses in which they were currently enrolled). At the end of the Spring term, 187 of the 1,139 declared majors were invited to participate in the AcAc program; 140 accepted the invitation. After the following Fall, 213 of the 1,267 declared majors were invited to participate, and 154 accepted the invitation (see Table 1).

Table 1

Spring 2016 and Fall 2016 students invited to participate in the Academic Action Intervention (AcAc) program


Data from academically at-risk students identified for the AcAc program were categorized as follows: warned early (cumulative GPA less than 2.0 with fewer than 24 completed credits), warned low semester (semester GPA less than 2.0, cumulative GPA greater than 2.0), probation (cumulative GPA less than 2.0 with 24 or more credits completed), and suspension (cumulative GPA less than 2.0 following probation or cumulative GPA less than 1.7) (see Appendix). Students subject to suspension were included in the AcAc program because they were required to take only a one-semester pause from the institution and were automatically admitted to the same major upon reapplication; therefore, to achieve future success, a student returning after a suspension likely needs continued support. Students subject to dismissal were encouraged to seek advising but were not required to participate in the AcAc program because they could not enroll as classified degree-seeking students without reapplication to the institution; therefore, they were subject solely to admissions policies (see Appendix).
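Because the four categories reduce to simple GPA and credit thresholds, the assignment logic can be expressed compactly. The following is a minimal sketch, not the institution's actual system; the function name and the on_probation flag are ours, while the thresholds come directly from the category definitions above.

```python
# Minimal sketch of the academic action categories described above.
# The function and field names are hypothetical; the thresholds come
# from the category definitions in the text.

def classify_academic_risk(cum_gpa: float, sem_gpa: float,
                           completed_credits: int,
                           on_probation: bool = False) -> str:
    """Return the academic action category for one student record."""
    if cum_gpa < 2.0:
        # Suspension: cumulative GPA below 2.0 while on probation,
        # or cumulative GPA below 1.7 outright.
        if on_probation or cum_gpa < 1.7:
            return "suspension"
        # Warned early vs. probation depends on completed credits.
        return "warned early" if completed_credits < 24 else "probation"
    if sem_gpa < 2.0:
        # Semester GPA below 2.0 with cumulative GPA in good standing.
        return "warned low semester"
    return "good standing"

# Example: a first-year student with a weak term but fewer than 24 credits.
print(classify_academic_risk(cum_gpa=1.9, sem_gpa=1.5, completed_credits=18))
# -> "warned early"
```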

AcAc Learning Goals

Because they facilitated the proactive advising appointments, members of the advising unit developed the SLOs for the AcAc program. They created the following measurable final SLOs to guide the curriculum:

Students will be able to

  • define good academic standing;

  • define academic action policies, including GPA policy benchmarks and the grade replacement policy;

  • identify sources of difficulty that may have impeded learning in the previous term;

  • construct an academic plan to achieve (or maintain) good academic standing that aligns with their stated professional goals;

  • identify appropriate campus resources to achieve academic success; and

  • utilize appropriate campus resources to achieve academic success.

After establishing consensus on the SLOs, the advising team created a curriculum map (see Table 2) to determine the ways and the time line by which students would achieve each outcome (Aiken-Wisniewski, Campbell, et al., 2010; Campbell et al., 2005; Robbins, 2011). Whenever possible, students were provided multiple opportunities during the intervention time frame to demonstrate learning for each outcome. Scaffolding of information delivery to ensure progressive building of knowledge on a strong foundation characterizes effective curriculum design (Maki, 2004). The curriculum map (see Table 2) details the learning opportunities connected with each SLO within the AcAc curriculum and the tools used to assess student learning at the conclusion of the AcAc program.
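A curriculum map of the kind summarized in Table 2 is essentially a mapping from each SLO to its learning opportunities and its sources of assessment evidence. As a rough illustration only (the entries below are abridged and hypothetical, not a reproduction of Table 2), such a map can even be checked mechanically for gaps:

```python
# Hypothetical sketch of a curriculum map in the spirit of Table 2: each SLO
# is linked to the learning opportunities that teach it and the tools that
# collect evidence of learning. Entries are abridged examples only.

curriculum_map = {
    "define good academic standing": {
        "learning_opportunities": ["assigned policy readings", "advising appointment"],
        "assessment_tools": ["pre/postappointment true-or-false items"],
    },
    "construct an academic plan": {
        "learning_opportunities": ["advising appointment"],
        "assessment_tools": ["advisor rubric"],
    },
}

# Completeness check: every SLO should have at least one learning
# opportunity and at least one source of assessment evidence.
for slo, mapping in curriculum_map.items():
    assert mapping["learning_opportunities"], f"no learning opportunity for: {slo}"
    assert mapping["assessment_tools"], f"no assessment evidence for: {slo}"
```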

Table 2

Academic Action Intervention Curriculum Map


Components of the AcAc Curriculum

The components of the AcAc curriculum were created after the SLOs had been determined. Together, these components made up the four parts of the curriculum.

Assigned readings and preappointment evaluation

Before scheduling an appointment with an advisor, the student is required to read pertinent university policies online in the catalog. After reading the policies, the student completes a preappointment evaluation, designating a series of statements as true or false, which assesses the student's knowledge of the university policies that affect his or her academic standing (see Table 3). In addition, the instrument prompts the student to evaluate his or her sources of difficulty, in a manner similar to that of Earl (1988), as well as the level of effort and time the student dedicated to his or her studies during the previous semester.

Table 3

True and false questions from the pre- and postappointment evaluations for Spring 2016 and Fall 2016 terms with corresponding correct answer percentages on postappointment evaluations


Appointment and advisor rubric

Following the completion of the preappointment evaluation, the student schedules an appointment with a full-time academic advisor in the unit and participates in a 30-minute face-to-face proactive appointment. During this dedicated time, the advisor reviews the preappointment evaluation with the student, focusing on incorrect answers in the true-or-false section and referring the student to resources designated to help him or her improve the student-identified sources of difficulty. The advisor then works with the student to create an academic plan, encouraging the student to drive decision-making on the basis of the actions he or she needs to take to improve the GPA. As necessary, advisors add information to the plan to nudge the student toward a greater awareness of the actions needed for success. At the conclusion of the appointment, the advisor completes a full rubric to assess the student's ability to construct an appropriate academic plan and identify source(s) of difficulty (see Table 4).

Table 4

Advisor rubric scoring guidelines utilized to assess student performance during Academic Action Intervention (AcAc) appointments


Connect with referred resource(s) and postappointment evaluation

The student completes the postappointment evaluation after meeting with at least one advisor-referred resource identified during the advising appointment. Although the student was directed to meet with the resource, no measures were put in place to quantify resource utilization; however, the postappointment evaluation includes questions about the student's experience connecting with the resource. The postappointment evaluation requires the student to identify as true or false the same statements presented in the preappointment evaluation, to describe perceptions of the connection with the accessed resource(s) identified in the first appointment, and to provide an overall evaluation of the proactive advising appointment.

Registration holds were removed upon completion of the postappointment evaluation.

Assessment Tools

The unit advisors used three in-house assessment tools to assess the AcAc curriculum: the preappointment evaluation, the postappointment evaluation, and the advisor rubric. The pre- and postappointment evaluations combined direct measures of student learning, via students' responses to policy-based true-or-false statements, with indirect measures of self-evaluation for reflection on perceived areas of difficulty. Evaluation data were collected and appointments scheduled through commercially available software called Insight by Symplicity. The software was configured such that students had to complete the preappointment evaluation before they could access the appointment-scheduling feature within their student accounts. In addition, advisors periodically completed the advisor rubric following select student appointments to assess objectively a student's ability to navigate the degree audit system and thus build an academic plan and define sources of difficulty. For this study, we focused on the use of direct learning evidence; therefore, we present and discuss only the true-or-false statement and advisor rubric data, although these data represent only part of a more comprehensive assessment of the AcAc curriculum.

The student responses to the true-or-false statements on the pre- and postappointment evaluations were used to assess students' understanding of the prescriptive information delivered within the proactive advising appointments, which covered policies on good academic standing and grade replacement. The advising unit staff members had developed items that focused on elements of the policy that they considered most important for student academic success; that is, achievement and maintenance of good academic standing requires a mastery of both policies. Spring 2016 was the pilot semester for administering the pre- and postappointment evaluations.

Following review of the Spring 2016 results, the advising team clarified one question for Fall 2016 to ensure that the correct elements of the policy were evaluated (see Table 3). Also on the basis of the Spring 2016 data, they set a performance benchmark (Robbins & Zarges, 2011) of 75% for the postappointment evaluation, meaning that, for an SLO to be considered achieved, 75% of the students were expected to identify each statement correctly as true or false. The advisors expected to reconsider this initial benchmark after the data were reviewed; however, in the preliminary review, correct responses for the 10 items ranged from 49% to 99%, with an unweighted mean of 69% and a median of 82%. Therefore, the 75% benchmark initially chosen for this study seemed an appropriate compromise between the mean and the median and was not adjusted further.
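To make the benchmark-setting logic concrete, the sketch below walks through the same reasoning with placeholder data; the ten item rates shown are invented for illustration (the article reports only the actual range, mean, and median), so they do not reproduce the reported statistics exactly.

```python
# Sketch of the benchmark-setting logic described above. The item rates are
# arbitrary placeholders, not the study's actual item-level data.
from statistics import mean, median

item_correct_rates = [49, 55, 60, 68, 74, 80, 84, 88, 93, 99]  # placeholder data

print(f"mean = {mean(item_correct_rates):.0f}%, median = {median(item_correct_rates):.0f}%")

# Choose a benchmark between the mean and the median, as the advising team
# did when settling on 75%: an SLO counts as achieved when at least that
# share of students answers the item correctly on the postappointment evaluation.
BENCHMARK = 75
met = [r for r in item_correct_rates if r >= BENCHMARK]
print(f"{len(met)} of {len(item_correct_rates)} items meet the {BENCHMARK}% benchmark")
```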

The advisor rubric consisted of a Google form completed by each advisor immediately following the completion of any AcAc program appointment. The rubric contained specific scoring criteria to ensure a consistent and objective approach across advisors (see Table 4). One advisor designed the initial advisor rubric to aid in the objective observation of student performance and to establish a clear minimum performance threshold (as per Aiken-Wisniewski & Wozab, 2012; Aiken-Wisniewski, Campbell, et al., 2010; Campbell et al., 2005). Following the initial design, other advisors in the unit provided feedback, and the unit reached consensus on the final design. The advisor rubric, along with the student responses to the preappointment evaluation, guided the student's discussion with an advisor. In Spring 2016, advisors completed advisor rubric forms for 40 students.

Results

Of the 187 Spring 2016 students eligible for the AcAc program, 140 agreed to participate, and 105 completed the curriculum requirements. Similarly, in the following Fall, 213 students were identified as eligible for the AcAc program; 154 agreed to participate, and 119 completed the full curriculum. A total of 106 students did not start the intervention process in either semester: 47 from the Spring and 59 from the Fall (see Table 5).

Table 5

Academic Action Intervention program (AcAc) participation profile


We analyzed data only from students who completed the full AcAc program curriculum because of the necessary comparison between pre- and postappointment evaluations and the need for a comprehensive data set that included evaluation of all aspects of the curriculum. Table 6 presents class standing by gender for both those who completed the full AcAc curriculum and those who initially participated but did not complete the program.

Table 6

Class standing by gender and Academic Action Intervention program (AcAc) completion


Figure 1 depicts the frequency with which students correctly identified the following statement as false: “Students can be removed from academic probation at the end of a fall or spring semester.” In the postappointment evaluation, 83.8% answered correctly; therefore, the set benchmark of 75% was achieved, representing an increase of 11.4 percentage points in correct responses following the proactive advising appointment.

Figure 1

Percentage of students correctly identifying the pre- and postappointment evaluation statement about probation policy as true or false

Note. For Spring 2016 (n = 105), the statement “Students can be removed from Academic Probation at the end of a fall or spring semester” was correctly evaluated as false. For Fall 2016 (n = 121), the statement was revised to “Academic actions are not taken at the end of summer” and was correctly evaluated as true.


The statement about academic probation removal was modified for the Fall 2016 implementation of the AcAc program to ensure that students fully understood the policy. The modified statement read, “Academic actions are not taken at the end of summer.” The number of students who correctly identified this statement as true on the preappointment evaluation was lower than it had been for the unrevised statement the previous semester: 45.0% of students answered correctly in the Fall, whereas 72.4% had evaluated the Spring 2016 version of the item correctly. Following completion of the Fall 2016 appointments, 75.0% of students answered the question correctly on the postappointment evaluation (see Figure 1), a gain of 30 percentage points following the appointment. The performance on the postappointment evaluation was sufficient to meet the benchmark, thereby indicating the students learned the necessary content and achieved the associated SLO.
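Because pre/post comparisons of this kind recur throughout the results, the arithmetic is worth spelling out once: gains are reported in percentage points (post minus pre), not relative percent. A short sketch recomputing the probation-item figures above:

```python
# Recomputing the probation-item results reported above. Gains are expressed
# in percentage points (post minus pre), not as relative percent changes.

BENCHMARK = 75.0  # percent answering correctly on the postappointment evaluation

results = {
    # term: (preappointment % correct, postappointment % correct)
    "Spring 2016": (72.4, 83.8),
    "Fall 2016": (45.0, 75.0),
}

for term, (pre, post) in results.items():
    gain = post - pre
    status = "met" if post >= BENCHMARK else "missed"
    print(f"{term}: {pre}% -> {post}% (+{gain:.1f} points); benchmark {status}")
# Spring 2016: 72.4% -> 83.8% (+11.4 points); benchmark met
# Fall 2016: 45.0% -> 75.0% (+30.0 points); benchmark met
```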

Knowledge of the grade replacement policy was assessed through true-or-false Items 8 through 10 of Table 3. The students' Spring 2016 performances on Item 8, “If I repeat a course, only the most recent grade will show on my transcript,” and Item 10, “If I repeat a course that I earned a C– or lower, the previous grade will not show on my transcript,” on the postappointment evaluation fell below the 75% benchmark at 51.4% and 57.1%, respectively (see Figure 2), indicating students did not understand the policy completely. Following modifications to the Fall 2016 AcAc curriculum, including the integration of active learning to teach the grade replacement policy during the appointment, student performances on the postappointment evaluation for Items 8 and 10 were 75.0% and 82.5%, respectively, thereby correcting the learning deficit identified in Spring 2016. Data on the most current AcAc curriculum show achievement of the SLOs for this portion of AcAc.

Figure 2

Percentage of students correctly identifying the pre- and postappointment evaluation statements 8–10 as true or false

Note. Spring 2016, n = 105; Fall 2016, n = 121. Statement 8 (false): “If I repeat a course, only the most recent grade will show on my transcript”; Statement 9 (true): “For repeat courses, the last grade received shall be included in the cumulative GPA and previous attempts will be excluded”; Statement 10 (false): “If I repeat a course that I earned a C− or lower, the previous grade will not show on my transcript.”


Advisors submitted rubric scores for 40 students in Spring 2016 using the advisor rubric detailed in Table 4. When demonstrating their ability to construct an academic plan to achieve (or maintain) good academic standing, 82.5% of students met or exceeded the expected performance level (see Figure 3). In addition, 100% of the evaluated students identified perceived source(s) of their academic difficulty (see Figure 3). For both SLOs, students exceeded the benchmark of 75.0% set by the advising team. The performance data indicate that students are capable of performing at or above the expected threshold, thereby indicating that the current AcAc curriculum is sufficient for students to meet the SLOs set for this portion of the program.

Figure 3

Assessment results of student progress on advisor rubric (N = 40)

Note. Benchmark of success = 75%. Full rubric statements: “Student can construct an academic plan to achieve (or maintain) good academic standing that aligns with their professional goals” and “Student can identify sources of difficulty that may have impeded learning in the previous term.”


Discussion

Backward design keeps student learning central to the curriculum design process. In designing the advising curriculum, the advising unit made sure each element addressed a specific component of the SLOs. Furthermore, they used the curriculum map (see Table 2) to ensure that no necessary component was omitted.

While designing the curriculum, the unit intentionally identified logical points for learning-evidence collection, which provided value to the advisors when conducting their in-person appointments and to the students as they reflected on their experiences. That is, the goal to capture the learning-evidence data proved a useful exercise that benefited all aspects of the curriculum, including SLO assessment for continuous curriculum improvements throughout the assessment cycle. Likewise, the advisors wanted to limit the quantity of collected information so that each component added value to the AcAc curriculum. Because the advisors consistently focused on teaching to the SLOs, taking care not to stray into content areas irrelevant to the AcAc intervention, we found that the backward design process ensured the inclusion of essential content in a realistic and attainable assessment plan.

To encourage complete participation in the AcAc program, students who failed to complete the full curriculum were subject to registration holds, which have been determined to be essential for ensuring student compliance; for example, Schwebel, Walburn, Klyce, and Jerrolds (2012) noted that intrusive (proactive) advising efforts without holds did not affect students participating in their study. In our case, registration holds expired following the start of the restricted withdrawal period for the subsequent year. For example, a student eligible for the AcAc program following the Spring 2016 semester could not register until the end of the unrestricted withdrawal period for the Spring 2017 semester. This mandate was based on the rationale that a student's AcAc status was not determined until after enrollment in the Fall 2016 term courses. Therefore, to ensure participation in the program, academic holds needed to cover a period during which a student needed to access registration portals; hence, the hold was extended through the Spring 2017 registration period or until the student had completed the AcAc curriculum, whichever came first. These holds resulted in all students who chose to continue within a life sciences major participating in the AcAc program.

Students who changed to a major other than life sciences were not required to complete the AcAc program and were instead offered appropriate interventions to help them overcome their at-risk status within the department of the new major. Students who subsequently dropped out of the institution showed variable participation. The intervention may have been available too late for these students. A proactive early alert intervention offered at the beginning of the semester may have been more helpful. We hope to explore this area of proactive advising in the future.

As an important aspect of the AcAc curriculum implementation, the advising unit required students to complete preappointment evaluations before they could schedule their AcAc advising appointments. Specifically, the software tracked students who needed to submit forms prior to accessing other components of the system, such as appointment scheduling. Although employing a system with such features does not determine the ultimate success of any pre- or postassessment of an advising curriculum, we argue, on the basis of our experience, that such a system makes the process relatively uncomplicated.
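The gating behavior can be summarized as a small piece of workflow logic. The sketch below is generic and assumes nothing about Insight by Symplicity's actual API or configuration; it only illustrates the idea that scheduling remains locked until the preappointment evaluation is submitted.

```python
# Generic sketch of the gating logic described above; not Insight by
# Symplicity's actual API or configuration, only an illustration of the idea
# that scheduling stays locked until the preappointment evaluation is submitted.

completed_preevaluations = set()  # student IDs with a submitted pre-evaluation

def submit_preevaluation(student_id: str) -> None:
    completed_preevaluations.add(student_id)

def can_schedule_appointment(student_id: str) -> bool:
    return student_id in completed_preevaluations

assert not can_schedule_appointment("S001")  # scheduling locked initially
submit_preevaluation("S001")
assert can_schedule_appointment("S001")      # unlocked after submission
```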

We did not want students to feel rushed to complete the preappointment evaluation immediately before the start of their advising appointments, as might happen with an in-office check-in, because the evaluation asked for responses to items that required the students to read the pertinent university policies. By requiring students to read the policies and respond to policy-related items as true or false, we provided students a first learning opportunity while they were engaged in the AcAc curriculum. Students and advisors entered the appointment with a better sense of the areas on which the appointment should focus. Based on our experience, we suggest that unit advisors consider all resources at their disposal to ensure that they create a curriculum delivery model that is effective and efficient for both students and advisors. An onerous procedure introduces additional opportunity for survey fatigue and may prove unsustainable in the long term without increased resources.

In executing the assessment process, the advisors involved in this study used their experience and expertise to improve the accuracy of the assessment tools for evaluating student learning. For example, because some students who correctly identified the statement “Students can be removed from academic probation at the end of a fall or spring semester” as false also asked why their probation status had not changed after the Summer 2016 term, the advisors recognized that the statement did not adequately capture the policy. The advisors modified the statement so that the assessment wording would reveal specific aspects of the policy, making a student's precise area of misunderstanding apparent. Subsequent to the change, advisors of students who misidentified as false the updated, true statement “Academic actions are not taken at the end of summer” spent more time discussing the effects of Summer term performance on academic standing, enabling those students to perform better when identifying the veracity of the same item on the postappointment evaluation. In this case, without careful rewording of the statement to highlight areas of confusion in the preappointment evaluation, students may not have received the additional information needed to understand the entire policy. The dramatic change in student performance on the preappointment evaluation across the two statements about the same academic policy demonstrates the careful design necessary to ensure an appropriate and accurate evaluation of student learning (see Figure 1).

The benefits of the assessment cycle were revealed when performance on the Spring 2016 postappointment evaluation indicated that students were not learning enough about the grade replacement policy during advising appointments to meet the benchmark. To close the knowledge gap and, as a result, the assessment loop, the advisors identified a new learning opportunity and teaching approach for the policy, which involved incorporating active learning into each advising appointment. Active learning engages students in the learning process rather than perpetuating an expectation that students will passively grasp shared information. Chickering and Gamson (1987) identified active learning as one of seven principles for good practice in higher education.

Applying active learning to teaching a policy that students were struggling to understand increased the levels of information processing and retention. For example, in Fall 2016, instead of simply explaining the policy, advisors asked students to diagram different scenarios surrounding their academic situations and the ways the grade replacement policy might affect their academic standings in the future. Advisors guided students through the examples to help them comprehend the potential impact of their academic performance on their GPAs and academic standings. The Fall 2016 assessment results indicated that this new element of the curriculum helped students achieve the benchmark; therefore, further modification of the curriculum to address this SLO was deemed unnecessary, and the advisors instead reflected upon and enjoyed this improved outcome.
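To make the active-learning exercise concrete, the sketch below works one hypothetical scenario of the kind students diagrammed, assuming a standard 4.0 scale; per the policy wording in Items 8 through 10, the most recent grade in a repeated course counts in the cumulative GPA while earlier attempts remain on the transcript but are excluded from the GPA. Course names and grades are invented.

```python
# Hypothetical grade replacement scenario of the kind students diagrammed in
# appointments. Assumes a standard 4.0 scale; per Items 8-10, the last grade
# in a repeated course replaces earlier attempts in the cumulative GPA, but
# earlier attempts still appear on the transcript.

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

# (course, credits, grade) for every attempt on the transcript.
transcript = [
    ("BIOL 101", 4, "F"),  # first attempt
    ("CHEM 101", 3, "D"),
    ("BIOL 101", 4, "B"),  # repeat: this grade supersedes the F in the GPA
]

def cumulative_gpa(attempts):
    # Keep only the most recent attempt of each course for GPA purposes.
    latest = {course: (credits, grade) for course, credits, grade in attempts}
    points = sum(credits * GRADE_POINTS[grade] for credits, grade in latest.values())
    credits = sum(credits for credits, _ in latest.values())
    return points / credits

print(f"GPA with grade replacement: {cumulative_gpa(transcript):.2f}")
# Without replacement, all attempts count: 15 points / 11 credits = 1.36.
# With replacement, the student computes (4*3.0 + 3*1.0) / 7 credits = 2.14.
```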

Opportunities to celebrate success constitute an important part of the assessment process and should not be ignored (Weiner, 2009). Too often, assessment efforts are seen as means to focus on areas requiring improvement such that accomplishments are not properly recognized. Excitement about and acknowledgment of positive outcomes encourage and reinvigorate those who have worked diligently toward the goal.

We discovered that the advisor rubric (see Table 4) was an integral component of the assessment process because it served as the only means of evaluating student ability to reach benchmark levels of SLOs 4 and 5 (see Table 2). Academically struggling students may lack the understanding needed to identify the specific areas in which their performance led to poor outcomes or the areas in which they need more information. By having students start the self-reflection process in the preappointment evaluation, then continue that contemplation and integrate the resulting considerations and aspirations into discussions about future academic plans, advisors enabled students to demonstrate, by the conclusion of their appointments, mastery of the knowledge important for creating an academic plan. By using a full rubric, advisors evaluated student performance objectively and consistently through the descriptions accompanying each scoring option. Advisors interacted with students during appointments, then used the rubric to assign scores based on the extent to which students' behaviors reflected the descriptions articulated in the rubric; the one-on-one nature of the appointments enabled advisors to make observations regarding students' ability to create academic plans, which were then compared with the rubric descriptions and accompanying scores. Because advisors used the rubric to guide scoring, the variability among advisor observations was relatively low; however, it was not eliminated because of the relatively simple nature of the rubric (Mertler, 2001).

We emphasize that the data from the rubric were not utilized as a way to assess individual advisors' efficacy. As appropriate for assessment, the rubric was intended to quantify objectively the students' abilities to navigate their academic plans and identify sources of difficulty. In fact, the successful use of the data relied on advisors' scoring students honestly without concern that those ratings might reflect on their own advising performance. Program assessment should focus on student learning, not on individual practitioners' performances (Aiken-Wisniewski, Campbell, et al., 2010; Hurt, 2007; Walvoord, 2010). By involving all advisors in the curriculum design process, including the design of assessment tools and evaluation of learning evidence, each advisor became a stakeholder with a vested interest in the success of the curriculum. Without buy-in, assessment data have little impact because practitioners without understanding of the advising theory and philosophy behind the assessment approach may not commit to incorporating the modifications indicated by the data.

Limitations and Future Modifications

We acknowledge that comparison of the wording of the SLOs and the advisor rubric shows differences such that some aspects of the SLOs were omitted from the assessment. However, to address this limitation, the unit advisors discussed refinements to the rubric wording to ensure all aspects of each SLO are measured in the future. Specifically, the latter portion of SLO 4, stating “… that aligns with their professional goals,” was not clearly included in the rubric. Such improvements to the rubric will increase the validity of the tool and therefore the reliability of the outcome data.

An additional limitation is the shortened time period in which the rubric was administered, a purposeful choice to ensure that advisors were not overburdened and that every student could meet with an advisor (Kraft-Terry, 2015). Because of the time required to record the rubric information, we asked advisors to complete the rubric within a specific window: between May 23, 2016 (the start of the AcAc program), and August 3, 2016. To ensure quality support and advisor availability for incoming student inquiries, we wanted advisors to have completed the rubric before resuming the demands of advising appointments, which necessarily increase at the start of a new semester.

In the future, we plan to administer the rubric in one-week increments throughout the semester to ensure better sampling across the entire population without adding stress for the advisors. In addition, we see value in further defining the student population to understand more fully the demographic breakdown of those participating in the AcAc program. Therefore, we plan to collect gender and ethnicity information from students. These data will aid in understanding whether the AcAc program population represents the entire college population and will thereby provide insight into whether additional outreach is needed to address any potential imbalances in opportunities to improve student academic status.

Finally, we note that these results are not causal; that is, the data do not necessarily reflect cause and effect. In fact, we contend that learning transpires in many contexts such that no single assessment can be definitive in determining outcomes.

Conclusion

In this article, we presented an approach to assessing student learning within proactive advising sessions for academically at-risk students. Specifically, we addressed the curricular benefits of using direct measures to assess the efficacy of a proactive advising intervention designed to increase student learning about the causes of and remedies for poor academic standing and to improve the proactive advising curriculum. Because advising literature on assessment is typically based on indirect measures, such as student satisfaction and self-evaluation (Aiken-Wisniewski, Smith, et al., 2010; Banta et al., 2002; Powers et al., 2014), our work provides new evidence to reinforce the value of enhancing advising assessment with direct measures of student learning as part of a comprehensive assessment plan. It also provides multiple examples of successful approaches used to establish student mastery. In addition, it reveals the way one advising unit utilized intentional design and assessment to improve student learning. This explanation of curriculum creation using backward design, which ultimately resulted in actions that improved the AcAc curriculum, can encourage other practitioners to be intentional about creating and using a proactive advising curriculum.

The NACADA Academic Advising Core Competencies Model (2017a) and the CAS Standards for Academic Advising Programs (2015) both indicate a need for advising units to integrate assessment as a regular practice. The methods described herein showcase only one example of a successful approach to curriculum design via backward design and a thoughtfully executed assessment cycle. We encourage advisors to focus less on the specific tools employed and more on the process, which can be modified to fit the needs of individual advising units.

Although student learning in an advising setting has not been directly linked to increases in institutional retention or GPA performance, the advising-as-teaching approach requires regular intentional design and assessment by advisors similar to those expected of faculty members in the classroom. In the future, we hope to demonstrate a connection between successful student learning in an advising context and institutional measures of success such as GPAs, retention levels, and graduation rates. Currently, the advising team continues regular assessment cycles to ensure all aspects of the AcAc curriculum appropriately achieve the SLOs that benefit at-risk students.

References

Abelman, R., & Molina, A. (2001). Style over substance revisited: A longitudinal analysis of intrusive intervention. NACADA Journal, 21(1–2), 32–39.
Abelman, R., & Molina, A. (2002). Style over substance reconsidered: Intrusive intervention and at-risk students with learning disabilities. NACADA Journal, 22(2), 66–77.
Accrediting Commission for Senior Colleges & Universities Western Association of Schools & Colleges. (2002). Evidence guide: A guide to using evidence in the accreditation process: A resource to support institutions and evaluation teams (Working draft). Alameda, CA: Western Association of Schools and Colleges.
Aiken-Wisniewski, S., Campbell, S., Nutt, C., Robbins, R., Kirk-Kuwaye, M., & Higa, L. (2010). Guide to assessment in academic advising (2nd ed.). Manhattan, KS: National Academic Advising Association.
Aiken-Wisniewski, S. A., Smith, J. S., & Troxel, W. G. (2010). Expanding research in academic advising: Methodological strategies to engage advisors in research. NACADA Journal, 30(1), 4–13.
Aiken-Wisniewski, S., & Wozab, J. (2012, October). Developing a rubric to assess learning in academic advising. Paper presented in NACADA 36th Annual Conference Pre-conference Workshop, Nashville, TN.
Astin, A. W., Banta, T. W., Cross, K. P., El-Khawas, E., Ewell, P. T., Hutchings, P., & Wright, B. (1992). Principles of good practice for assessing student learning. Washington, DC: American Association for Higher Education.
Banta, T. W., Hansen, M. J., Black, K. E., & Jackson, J. E. (2002). Assessing advising outcomes. NACADA Journal, 22(1), 5–14.
Campbell, S., Nutt, C., Robbins, R., Kirk-Kuwaye, M., & Higa, L. (2005). NACADA guide to assessment in academic advising (Monograph No. 23). Manhattan, KS: National Academic Advising Association.
Campbell, S. M., & Nutt, C. L. (2008). Academic advising in the new global century: Supporting student engagement and learning outcomes achievement. Peer Review, 10(1).
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. Retrieved from the ERIC database. (ED282491)
Council for the Advancement of Standards in Higher Education. (2015). CAS standards for academic advising programs (9th ed.). Washington, DC: Author.
Drake, J. K., Jordan, P., & Miller, M. A. (Eds.). (2013). Academic advising approaches: Strategies that teach students to make the most of college. San Francisco, CA: Jossey-Bass.
Earl, W. R. (1988). Intrusive advising of freshmen in academic difficulty. NACADA Journal, 8(2), 27–33.
Fink, L. D. (2003). A self-directed guide to designing courses for significant learning. Retrieved from https://www.deefinkandassociates.com/GuidetoCourseDesignAug05.pdf
Glennen, R. (1976). Intrusive college counseling. The School Counselor, 24, 48–50.
He, Y., & Hutson, B. (2017). Assessment for faculty advising: Beyond the service component. NACADA Journal, 37(2), 66–75.
Hemwall, M. K., & Trachte, K. C. (2005). Academic advising as learning: 10 organizing principles. NACADA Journal, 25(2), 74–83.
Hurt, R. L. (2007). Advising as teaching: Establishing outcomes, developing tools, and assessing student learning. NACADA Journal, 27(2), 36–40.
Kimball, E., & Campbell, S. M. (2013). Advising strategies to support student learning success: Linking theory and philosophy with intentional practice. In J. K. Drake, P. Jordan, & M. A. Miller (Eds.), Academic advising approaches: Strategies that teach students to make the most of college (pp. 3–15). San Francisco, CA: Jossey-Bass.
Kraft-Terry, S. (2015, October). Designing and implementing a proactive advising curriculum that works. Paper presented at the 39th NACADA Annual Conference, Las Vegas, NV.
Lowenstein, M. (2005). If advising is teaching, what do advisors teach? NACADA Journal, 25(2), 65–73.
Maki, P. L. (2004). Maps and inventories: Anchoring efforts to track student learning. About Campus, 9(4), 2–9.
McFarlane, B. (2017, December). Mandatory advising, yes or no? Academic Advising Today, 40(4).
Mertler, C. A. (2001). Designing scoring rubrics for your classroom. Practical Assessment, Research & Evaluation, 7(25).
Molina, A., & Abelman, R. (2000). Style over substance in interventions for at-risk students: The impact of intrusiveness. NACADA Journal, 20(2), 5–15.
NACADA: The Global Community for Academic Advising. (2006). NACADA concept of academic advising.
NACADA: The Global Community for Academic Advising. (2017a). NACADA academic advising core competencies model.
NACADA: The Global Community for Academic Advising. (2017b). NACADA core values of academic advising.
Powers, K. L., Carlstrom, A. H., & Hughey, K. F. (2014). Academic advising assessment practices: Results of a national study. NACADA Journal, 34(1), 64–77. doi: NACADA-13-003
Renzulli, S. J. (2015). Using learning strategies to improve the academic performance of university students on academic probation. NACADA Journal, 35(1), 29–41. doi: NACADA-13-043
Robbins, R. (2011). Assessment and accountability of academic advising. In J. Joslin & N. Markee (Eds.), Academic advising administration: Essential knowledge and skills for the 21st century (Monograph No. 22, pp. 53–64). Manhattan, KS: National Academic Advising Association.
Robbins, R. (2016). Assessment of academic advising: Gathering outcome evidence and making changes. In T. Grites, M. Miller, & J. Givens Voller (Eds.), Beyond foundations: Developing as a master advisor (pp. 289–303). San Francisco, CA: Jossey-Bass.
Robbins, R., & Zarges, K. M. (2011). Assessment of academic advising: A summary of the process.
Schuh, J. H. (2008). Assessing student learning. In V. N. Gordon, W. R. Habley, & T. J. Grites (Eds.), Academic advising: A comprehensive handbook (2nd ed., pp. 356–368). San Francisco, CA: Jossey-Bass.
Schwebel, D. C., Walburn, N. C., Klyce, K., & Jerrolds, K. L. (2012). Efficacy of advising outreach on student retention, academic progress and achievement, and frequency of advising contacts: A longitudinal randomized trial. NACADA Journal, 32(2), 36–43.
Suskie, L. (2018). Assessing student learning: A common sense guide (3rd ed.). San Francisco, CA: John Wiley & Sons.
Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition. Chicago, IL: University of Chicago Press.
Vander Schee, B. A. (2007). Adding insight to intrusive advising and its effectiveness with students on probation. NACADA Journal, 27(2), 50–59.
Varney, J. (2007). Intrusive advising. Academic Advising Today, 30(3).
Varney, J. (2012). Proactive (intrusive) advising! Academic Advising Today, 35(3).
Walvoord, B. E. (2010). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: John Wiley & Sons.
Weiner, W. F. (2009). Establishing a culture of assessment. Academe, 95(4), 28–32.
Weissmann, J. (2012, March 29). Why do so many Americans drop out of college? The Atlantic.
White, E. R. (2006). Using CAS standards for self-assessment and improvement.
Wiggins, G., & McTighe, J. (2001). What is backward design? In Understanding by design (pp. 7–19). Upper Saddle River, NJ: Merrill Prentice Hall.
Appendix

Academic action categories for students


Author notes

The authors acknowledge the contributions of the advisors who participated in various aspects of the assessment process, especially Christy Burt, who was integral in data collection, along with Diana Thompson and Jain Yi.

Dr. Stephanie Kraft-Terry is the interim director of advising for the College of Natural Sciences at the University of Hawai‘i at Mānoa. She holds a PhD in experimental neuroscience from the University of Nebraska. She is a tenured faculty member in the Department of Biology, where she is responsible for overseeing academic advising and program assessment for all degree programs within the department. She can be reached at kraft2@hawaii.edu.

Cheri Kau holds an MEd in educational administration (higher education) from the University of Hawai‘i at Mānoa, where she was previously employed as an academic advisor in the Department of Biology. She currently is employed at the University of Hawai‘i at Coursera.