Best practices of academic advising assessment involve identification of student learning outcomes, the development and use of multiple measures of student learning, and sound professional judgment to understand the information gathered and to improve student learning. However, assessment results often come from minimal, narrow, and inconsistent evaluation practices, frequently based on student satisfaction surveys. Therefore, to generate a picture of the current state of assessment, we surveyed those conducting or deemed responsible for academic advising assessment. Although 80% of survey participants identified academic-advising student learning outcomes, only one half assessed the achievement of those outcomes, with most using student surveys. Furthermore, only 7% reported employing three or more measures, and 60% reported improvements of practice and student learning based on the assessment.

Faculty members provide educationally purposeful activities in their classes by developing learning objectives that guide the content and methods of their teaching. Many in higher education view academic advising as a form of teaching that leads to student learning (Appleby, 2008; Creamer, 2000; Hemwall & Trachte, 2005; Lowenstein, 2005; Melander, 2005) and personal development (Crookston, 1972/1994/2009). Advisors provide educationally purposeful activities by developing procedures that guide students in looking beyond curricular requirements to discover opportunities offering breadth and depth of educational experience. By promoting opportunities that challenge and facilitate student intellectual and social development, good academic advising enables students to add value to the college experience (Campbell & Nutt, 2008). This growth in students throughout the college experience fulfills the mission of the institution and demonstrates the impact of effective advising on the teaching and learning process.

According to the National Academic Advising Association (NACADA) (2006), academic advising consists of curricula, pedagogies, and student learning outcomes (SLOs) just as classroom teaching does. The Council for the Advancement of Standards in Higher Education (CAS) (2008) recommended that advising programs identify relevant and desirable development goals and SLOs that are purposeful and holistic. In addition, CAS (2008) suggested that advising provide programs and services needed to assist with the achievement of those outcomes. The advising program staff is responsible for determining the relevant outcome domains and related dimensions for students based on institutional mission.

According to Aiken-Wisniewski et al. (2010), the SLOs of the advising experience include cognitive (what students should know), behavioral (what students should be able to do), and affective (what students should value) outcomes resulting from participation in academic advising. Furthermore, SLOs of academic advising should be tailored to the needs of students (Martin, 2007) and enable them to reach their educational and career goals.

Assessment on college campuses is primarily driven by accreditation by outside organizations, consumers, public opinion, legislative pressure (e.g., Texas Gen. Laws 61, 2011), and an internal commitment to improvement (Ewell, 2009). If advising is viewed from a learning-centered paradigm that focuses on outcomes (Campbell & Nutt, 2008), assessment must be used to understand whether or not the SLOs have been achieved. CAS (2008) standards require evaluation and assessment for academic advising programs. As noted by Aiken-Wisniewski et al. (2010), assessment of academic advising supports student persistence, success, and learning. It also serves to improve advising delivery through continuous feedback (Robbins & Zarges, 2011) as the practice is reviewed and revised.

The methods and measures used in assessment should comport with the assessment questions asked and garner feedback on student learning. Assessment may include combinations of quantitative and qualitative types of inquiry, direct and indirect methods of measurement, and formative and summative means of evaluation (Robbins, 2009, 2011, 2013). Participants in the advising program may identify a wide range of SLOs. In addition, the multidimensional characteristics of learning mean that effective assessment must include multiple measures to provide encompassing and useful information (Campbell, 2005b; Huba & Freed, 2000; Maki, 2004; Palomba, 2002a; Suskie, 2009).

Student evaluation of advising interactions is the most prevalent form of academic advising assessment (Habley, 2004; Macaruso, 2007); however, standard student evaluations can be problematic (McClellan, 2011; Robbins, 2009, 2011, 2013). Specifically, in addition to reflecting possible student biases toward advisors or advising, student evaluations often offer limited ability to measure the scope of advising processes or abstract concepts. In addition, Creamer and Scott (2000) stated, “Student satisfaction measures cannot capture long-term outcomes and may be influenced by unrealistic or uninformed expectations about the role of an advisor” (p. 344). Use of collective findings from multiple measures allows for better guidance that improves advising efforts (Creamer & Scott, 2000; Robbins, 2009, 2011, 2013).

To improve advising programs, administrators need systematically gathered and specific assessment data (Campbell, 2005a). More importantly, assessment must provide advising program personnel with an understanding of what students learn, and how, through their involvement in academic advising experiences. In addition to understanding student views on assessment, “Professionals must monitor their own behaviors and constantly examine their assumptions, practices, and outcomes” (White, 2006, ¶12).

Purpose of Study

The literature regarding assessment practices of academic advising SLOs is limited and lacks descriptive information on the methods being used to measure outcomes or the use of resulting data. The lack of research devoted entirely to assessment of academic advising SLOs inspired this study.

We investigate the extent to which academic advising SLOs are identified at colleges and universities engaged in the assessment of academic advising. We also determine the type and number of measures used to assess the achievement of the SLOs and examine the use of the information obtained through the assessment process. Finally, we examine associations between institutional characteristics (e.g., institutional type and size, existence of a formal mission statement) and the identification of SLOs, the use of formal measures of SLOs, and the use of the resulting assessment information.

Method

Participants

Participants for the study included administrators, advisors, and other personnel who practice or are responsible for the assessment of academic advising at their institutions. All participants came from institutions with members of NACADA and were recruited from those who had completed the NACADA 2011 National Survey of Academic Advising (Carlstrom & Miller, 2013) and had agreed to participate in follow-up studies. We also solicited some participants at the NACADA 2011 National Conference and by an invitation distributed via the NACADA Assessment Listserv. From these pools of potential participants, we invited 499 individuals via e-mail to complete a web-based survey. We collected data from 291 people, a 58% response rate. Of this number, 230 (46% of the invited participants) provided complete data that we used in the results.
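Both percentages are computed against the 499 invitations:

    response rate ≈ 291 / 499 = .583
    usable-response rate ≈ 230 / 499 = .461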

The greatest percentage of respondents came from NACADA Region 5 (19.1%, n = 44) and the fewest came from Region 8 (3.9%, n = 9). These trends reflect the NACADA membership: Region 5 is home to the most members and Region 8 to the fewest, and the percentages of participants from the remaining regions were likewise proportional to the composition of NACADA membership (NACADA, 2012, 2014).

The highest percentage of participants by institution type came from public and private, nonprofit, doctoral degree–granting institutions (37.8%, n = 87). Public and private, nonprofit, 2-year institutions were home to the next largest group of participants (24.3%, n = 56), which aligns with the percentage (29.0%) they make up among all types of institutions surveyed.

Institutional size, based on the Carnegie Foundation for the Advancement of Teaching (2010) classification, reflects three categories (per undergraduate enrollment): small (fewer than 6,000); medium (6,000 to 23,999); and large (24,000 or more). Most of the participants (83.4%) reported being from small and medium institutions; this group was evenly split, with 41.7% from each nonlarge category.
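As an illustration only (not part of the study's instrumentation), the three enrollment categories can be expressed as a simple classification rule; the function name and sample enrollments below are hypothetical:

    # Size categories per the Carnegie Foundation (2010) thresholds quoted above.
    def size_category(undergraduate_enrollment: int) -> str:
        if undergraduate_enrollment < 6000:
            return "small"
        if undergraduate_enrollment < 24000:
            return "medium"
        return "large"

    print(size_category(4500))    # small
    print(size_category(12000))   # medium
    print(size_category(30000))   # large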

The data show that 53.0% (n = 122) of participants reported job responsibilities associated with institution-wide undergraduate advising. Collected demographic data indicate that most hold the title of advising director/coordinator (45.7%, n = 105), and 21.7% (n = 50) said they work as an academic advisor. Assistant/associate dean described 9.6% (n = 22) of the respondents while 5.2% (n = 12) identified themselves as dean. The fewest self-reported being a faculty advisor (1.7%, n = 4).

Eighty-seven percent (n = 200) of the participants indicated having some direct advising responsibilities. With respect to staffing, 32.6% (n = 75) came from situations in which only professional advisors are employed and 20.0% (n = 46) from situations in which only faculty advisors are employed. Nearly one half (45.2%, n = 104) reported use of a split model in which both faculty and staff provide advising.

The data show that 42.2% (n = 97) of participants work in programs that mandate advising for all students, and 22.6% (n = 52) of respondents indicated that advising requirements depend on specific situations (e.g., mandatory for new freshmen, transfer students, or students on probation). Roughly one third reported no mandates for advising. The results indicate that a formal mission statement for academic advising exists in 65.7% (n = 151) of participants' advising situations.

Instrument

We developed the Survey on Assessment of Academic Advising specifically for this national study. Administered online, it comprised two sections. Items in the first section obtained demographic information about the participants and characteristics of the institutions they represented (e.g., type and size of institution, personnel who advise undergraduates, existence of a formal mission statement). The second section comprised 21 items, each addressing a specific SLO. The outcomes were gleaned from the NACADA Guide to Assessment in Academic Advising (Aiken-Wisniewski et al., 2010); the Assessment of Academic Advising Institute; and the NACADA Clearinghouse, which includes Constructing Student Learning Outcomes (Martin, 2007) and sample academic advising syllabi (NACADA, 2011). The SLOs were presented as groups of cognitive, behavioral, and affective outcomes (see Appendix).

Participants who responded affirmatively to SLO items were presented with a list of options and asked to select all measures used to assess the SLOs. The measures included those most frequently cited in the literature on assessment of academic advising and were also drawn and adapted, with permission from the National Institute for Learning Outcomes Assessment, from the national survey of provosts and chief academic officers on assessment practices (Ikenberry & Kuh, 2009). After selecting the measures used to assess the identified outcomes, participants chose all applicable options of assessment information use.

Participants received access to a write-in section where they could list any other academic advising SLOs that had been formally identified. In addition, participants could describe additional measures they used to assess SLOs and additional ways the assessment information was used.

Procedures

The Institutional Review Board at Kansas State University granted permission to conduct this study. The Survey on Assessment of Academic Advising was administered in February 2012. Potential participants received an e-mail invitation to complete the survey, which remained open for 3 weeks. A follow-up e-mail was sent after the first 2 weeks to remind them of the survey and encourage completion.

Hypotheses

We did not create hypotheses for the exploratory variables of institutional type, institutional size, and institutional level of advising. We created the following hypotheses for the other institutional variables based on the findings of Carlstrom and Miller (2013):

  • H1. More participants from situations where only professional advisors are employed than from situations where only faculty advisors are employed report formal identification and measurement of SLOs and use of the resulting assessment data.

  • H2. Fewer participants from situations where advising is mandatory than those from advising situations where it is not mandatory report formal identification and measurement of SLOs and use of the resulting assessment data.

  • H3. More participants from situations in which a formal mission statement guides academic advising report formal identification and measurement of SLOs and use of the resulting assessment data than do those from advising situations with no mission statement.

Statistical Analyses

We collected data to determine the characteristics of participants' institutions as well as the number of participants reporting identification and measurement of academic advising SLOs. In addition, we collected information to determine the number of participants who reported utilization of multiple measures and who indicated that the assessment information is used to make decisions at their institutions.

We conducted a series of Pearson's chi-square tests to examine whether (a) institution type, (b) institution size, (c) institutional level of advising, (d) advising personnel, (e) mandatory advising for all students, and (f) the existence of a formal mission statement for academic advising were associated with the following:

  • formal identification of academic advising SLOs,

  • use of formal measures to assess academic advising SLOs,

  • use of three or more formal measures to assess academic advising SLOs, and

  • use of assessment information.

We present only the chi-square analyses that meet the following requirements: (a) no more than 20% of cells had expected counts less than 5, and (b) no cells had expected counts less than 1.
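A minimal sketch of this screening rule, assuming hypothetical cell counts and the scipy library (this is not the authors' analysis code):

    import numpy as np
    from scipy.stats import chi2_contingency

    def screened_chi_square(observed):
        """Pearson's chi-square test plus the screening rule above:
        no more than 20% of cells with expected counts below 5 and
        no cell with an expected count below 1."""
        # correction=False gives Pearson's statistic without the Yates correction.
        chi2, p, df, expected = chi2_contingency(np.asarray(observed), correction=False)
        reportable = (np.mean(expected < 5) <= 0.20) and bool((expected >= 1).all())
        return chi2, p, df, reportable

    # Hypothetical 2 x 2 cross-tabulation (e.g., mission statement yes/no
    # by SLO identification yes/no); the counts are illustrative only.
    chi2, p, df, ok = screened_chi_square([[102, 14], [32, 23]])
    print(f"chi-square({df}) = {chi2:.2f}, p = {p:.3f}, reportable: {ok}")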

Results

Identification and Assessment of Student Learning Outcomes

Results indicated that 77.8% (n = 179) of the 230 participants reported formal identification of academic advising SLOs. Table 1 lists the numbers and percentages of participants who identified each of the three most frequently identified cognitive, behavioral, and affective outcomes.

Table 1.

Numbers and percentages of participants who reported identification and assessment of student learning outcomes (SLOs), N = 230


The results indicated that 57.8% (n = 133) of participants reported use of formal measures to assess academic advising SLOs. The outcome most frequently measured was “student knows the degree requirements of college/department” (see Table 1). However, only 1.9% (n = 3) of the participants who identified this outcome indicated that three or more formal measures were employed to assess it.

Student survey or questionnaire was the overwhelming choice reported for measuring achievement of each SLO (see Table 2). For cognitive and behavioral outcomes, direct observations and written exams were the next most commonly reported forms of assessment.

Table 2.

Numbers and percentages of participants who reported use of formal measures to assess student learning outcomes (SLOs): cognitive, behavioral, and affective


Use of Assessment Information

Results indicated that 60.0% (n = 138) of the 230 participants reported that information gathered from assessing their identified academic advising SLOs contributed to decision making. The numbers and percentages of participants who reported using assessment information in specific ways are listed in Table 3. The participants reported the following actions in descending order of frequency: revising process/delivery outcomes, revising the advising curriculum, and evaluating the advising unit. The fewest cited using assessment information to meet institutional or accrediting body mandates, to revise SLOs, and to lobby for additional resources.

Table 3.

Numbers and percentages of participants who reported how information was used as a result of student learning outcomes (SLOs) assessment


Association Between Characteristics of Institutions Represented and Assessment Practices

We considered responses of “Do not know” and “Choose not to reply” to the institutional variables as missing data because some participants likely chose these to avoid disclosing their lack of knowledge of assessment practices within their advising situation (McMillan, 2012). We decided little pertinent information would be gained from including the data from these respondents. Therefore, we subjected 171 cases to chi-square analyses.

Type of institution

We found no significant association between the type of institution and formal identification of SLOs, use of formal measures to assess academic advising SLOs, use of three or more assessment measures, or use of assessment information.

Size of institution

The association between the size of the institution and formal identification of SLOs was significant: χ²(2, N = 171) = 7.83, p = .02. More participants from large and medium institutions than expected indicated formal identification of SLOs. We found no significant association between size of institution and use of formal measures to assess SLOs, use of three or more formal measures, or use of assessment information.
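As a quick check (a sketch, not the authors' code), a reported p value can be recovered from the chi-square statistic and its degrees of freedom:

    from scipy.stats import chi2

    # Survival function gives P(X >= 7.83) for a chi-square variate with 2 df.
    p = chi2.sf(7.83, df=2)
    print(round(p, 3))  # ~.02, matching the reported value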

Institutional level of advising

We found no significant association between the institutional level of advising and formal identification of SLOs, use of formal measures of assessment, use of three or more formal assessment measures, or use of assessment information.

Advising personnel

According to H1, we expected that more respondents from situations that employ only professional advisors than respondents from situations that employ only faculty members would report formal identification and measurement of SLOs and use of the resulting assessment data. There was a significant association between advising personnel and formally identified academic advising SLOs: χ²(2, N = 171) = 8.12, p = .017. Based on the odds ratio, which Field (2009) described as a useful measure of effect size for categorical data, respondents from situations staffed solely by professional advisors were 2.82 times more likely to confirm use of identified outcomes than those from situations with other advising personnel. Table 4 presents the difference between expected and observed values for advising personnel.
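For reference, the odds ratio for a 2 × 2 cross-tabulation is the ratio of the two groups' odds. The counts below are hypothetical and chosen only to illustrate the calculation (the observed values appear in Table 4):

    # Rows: professional-only vs. faculty-only advising situations.
    # Columns: SLOs formally identified vs. not identified.
    a, b = 66, 9    # professional-only: identified, not identified (hypothetical)
    c, d = 33, 13   # faculty-only: identified, not identified (hypothetical)
    odds_ratio = (a / b) / (c / d)   # equivalently (a * d) / (b * c)
    print(round(odds_ratio, 2))      # 2.89 with these illustrative counts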

Table 4.

Cross-tabs analysis of assessment practices by advising personnel


We found no significant association between advising personnel and use of formal measures to assess SLOs, use of three or more formal assessment measures, or use of assessment information. However, those from advising situations with only professional advisors were 1.77 times more likely to report use of assessment data than were those from situations that employed only faculty advisors.

Mandatory advising

H2 stated our expectation that fewer respondents from advising situations characterized by mandatory advising for all students would report formal identification and measurement of SLOs as well as use of assessment data than their counterparts from situations where academic advising is not mandatory. The association between mandatory advising and formal identification of SLOs was not significant; contrary to the hypothesis, relatively equal percentages of participants from institutions with and without mandatory advising identified SLOs (see Table 5).

Table 5.

Cross-tabs analysis of assessment practices by mandatory advising


We found no significant association between mandatory advising and use of formal measures to assess outcomes, use of three or more assessment measures, or use of assessment data. However, a greater percentage of participants from places with mandatory advising used formal measures, including three or more, than did those from places without mandatory advising. As hypothesized, a smaller percentage of participants from institutions with mandatory advising (54.1%) than from institutions that did not mandate advising (62.5%) reported use of assessment data.

Formal mission statement

According to H3, we expected more participants from advising situations characterized by a formal mission statement to report formal identification and measurement of SLOs and use of assessment data than those from advising situations with no mission statement. As hypothesized, more participants in advising situations with a formal mission statement identified SLOs (87.9%) than did those from situations without such a statement (58.2%) (see Table 6). We found a significant association between a formal mission statement and formal identification of academic advising SLOs, χ²(1, N = 171) = 19.47, p < .001, as well as use of formal measures to assess them: χ²(1, N = 171) = 9.33, p = .002. We found no significant association between a formal mission statement and use of three or more measures to assess academic advising SLOs.

Table 6.

Cross-tabs analysis of assessment practices by mission statement


As hypothesized, more participants from advising situations with a formal mission statement reported use of assessment information (67.2%) than did those from situations without a mission statement (49.1%). We found a significant association between having a formal mission statement and use of assessment information: χ²(1, N = 171) = 5.19, p = .023.

Discussion

Assessment is vital to the achievement of the advising program mission for “without ongoing assessment it is not possible to determine with any certainty that the advising program is accomplishing its stated mission” (Habley, 2005, ¶6). The mission statement serves as the guide to determine advising program learning outcomes (American Association for Higher Education [AAHE], 1996; Campbell, 2008; CAS, 2008; Palomba, 2002a), and this study clearly shows that this first step in programming leads to greater assessment activity. More participants who reported a formal mission statement at their institutions identified SLOs, used formal measures to assess learning outcomes, and employed three or more such measures than did those reporting no such statement. More participants affirming mission statements also reported use of the resulting assessment information to inform decision making.

Over three fourths of those surveyed came from situations with identified SLOs. Participants of this study indicated prioritization of cognitive SLOs (e.g., degree requirements, the policies of their major department or college). Although provision of information is considered a prescriptive form of advising, students need to know the specifics for degree completion. Furthermore, they need to know the location of campus resources, a priority outcome according to some participants and one that likely affects retention (Cuseo, 2012).

Some participants cited recognition of the importance of behavioral SLOs (e.g., develop long-term goals, create and use an educational plan to manage progress toward degree completion). According to CAS (2008), helping students create an educational plan should be a primary purpose of advising programs. The planning process encourages students to engage in higher levels of thinking, such as evaluating or creating (Krathwohl, 2002), by using all of the complex information available to them and generating a plan that meets their academic, career, and personal goals (Hurt, 2007; NACADA, 2006). Such plans are also purposeful and holistic (CAS, 2008), providing individualized attention to each student in his or her development.

Appleby (2007) noted that some outcomes are abstract and difficult to measure, which may be the reason few participants reported identification of affective SLOs. Advisors may believe that students appreciate the contribution of advising, but may not view the affective outcomes as significant or have the means to assess them. Perhaps better understanding of ways to measure affective outcomes, such as described by Erlich and Russ-Eft (2011) or Robbins (2009), would lead to more frequent identification of these outcomes.

More respondents identifying SLOs came from situations where both faculty and professional personnel advise. The results indicate that environments of shared obligation to assessment promote evaluation efforts. Palomba (2002b) noted that such an environment demonstrates a commitment to student success.

Of participants who identified SLOs, fewer than 65% reported measuring those outcomes, and fewer than 15% indicated use of multiple measures. They reported predominant use of a student survey/questionnaire to assess outcomes, a finding consistent with previous studies revealing that most who assess academic advising use student satisfaction surveys (Carlstrom & Miller, 2013; Habley, 2004; Macaruso, 2007). Student perceptions of the advising process can be an effective element of assessment, but they should not be the sole measure used (Robbins, 2009, 2011, 2013). Student surveys that assess outcome achievement (e.g., self-reports of learning) measure learning more effectively than those based on satisfaction (Robbins, 2009, 2011, 2013).

Participants indicated that student work and portfolios are seldom used to measure achievement of learning outcomes. This finding is surprising in light of the usefulness of these tools for tracking and demonstrating SLOs of academic advising interactions (Chen & Black, 2010). In addition, few participants reported the use of rubrics for assessing outcome achievement. According to Hurt (2007), use of rubrics to assess student work or performance promotes a holistic assessment of student learning.

The use of three or more measures to assess SLOs constitutes a best practice in assessment (Campbell, 2005b; Cuseo, 2008; Huba & Freed, 2000; Maki, 2004; Palomba & Banta, 1999; Robbins, 2009, 2011, 2013; Suskie, 2009). To capture the complexity of student learning gained as a result of academic advising, those assessing advising need to employ multiple measures. The results show that only 7.8% of participants reported use of three or more measures to assess student learning, suggesting that advising units may not be collecting sufficient information to provide evidence of SLO achievement (Creamer & Scott, 2000; Robbins, 2009, 2011, 2013).

More participants from situations employing only professional (and not faculty) advisors reported assessments of outcome achievement. Professional advisors likely shoulder fewer research demands and lighter teaching loads than faculty advisors, which leaves more time for assessment efforts. However, more participants from situations in which only faculty members advise, or in which both faculty members and professionals advise, reported use of three or more measures than did those from situations employing only professional advisors. Perhaps faculty experience with conducting assessment explains this finding. A collaborative environment in which both professional and faculty advisors work together on assessment efforts appears to provide the optimal results.

Assessment information proves useful for enhancing advising performance in ways that lead to improved practices and SLO achievement (Ewell, 2009). Over one half of the participants in this study who reported identification of SLOs also indicated use of the results. They reported utilization in the following descending order: changing the advising process/delivery outcomes, evaluating the advising unit, and revising advising pedagogy and curriculum. However, because student surveys are the most frequently reported measure, the assessment information may have resulted in changes that increased satisfaction but did not necessarily enhance outcome achievement.

More participants reported use of assessment information than reported use of outcome measures. Informal assessments made during sessions with students inform practice only if advisors directly observe an expected performance level based on set criteria. Mere speculation that outcomes have been achieved likely results in inconsistent and unreliable data, which in turn may not lead to needed enhancements in advising delivery or student learning.

Limitations

Participants were solicited through their membership in NACADA. They indicated work with assessment at their institutions and volunteered to take part in the survey. As a result, study findings may not generalize to other advisors or administrators who work in academic advising at all institutions.

Recommendations for Practice

Leaders of advising programs need to determine their mission to students to guide the identification of relevant SLOs (AAHE, 1996; Campbell & Nutt, 2008; CAS, 2008; Maki, 2004; Martin, 2007; Robbins, 2009). Advisors should increase assessment efforts to provide evidence that students are learning from the advising relationship and program (AAHE, 1996; Angelo, 1995; Appleby, 2007; Ewell, 2009; Maki, 2004; White, 2006). The data help determine which advising practices work well and which need enhancement to positively influence student learning. Assessment efforts must include multiple measures (e.g., exams, assignments, rubrics to measure student work/portfolios, direct observations of student performance, and reflective essays) to provide sufficient data in support of achieved learning outcomes (Creamer & Scott, 2000; Maki, 2004; Palomba, 2002a; Robbins, 2009, 2011, 2013; Suskie, 2009). Education and professional development on the use of multiple measures and of the resulting information must be prioritized. Finally, administrators must make better use of valid assessment results to improve advising practices and increase student learning (AAHE, 1996; Ewell, 2009; Palomba, 2002a).

Recommendations for Research

Based on the results of this study, we recommend additional research. For example, a study that determines the most effective measurement methods would inform assessment practice, and a qualitative study that shows the impact of the advising process on student learning would inform advising practice. Sharing the results of research on advising programs that measure SLO achievement, and the actions taken based on assessment information, would benefit others (Palomba, 2002a). In addition, a longitudinal study designed to assess the entire educational experience through an advising program and to show student progress through the academic career, even as the desired outcomes evolve, would contribute much to the field. Much could be learned from programs that have goals and objectives in place for assessing student development over time (CAS, 2008; Ewell, 2009). Finally, this study could be replicated with a sample of academic advising personnel who are not NACADA members. Those not affiliated with NACADA may have implemented sound assessment practices that could provide new information.

References

Aiken-Wisniewski, S., Campbell, S., Nutt, C., Robbins, R., Kirk-Kuwaye, M., & Higa, L. (2010). Guide to assessment in academic advising (2nd ed.) (Monograph No. 23). Manhattan, KS: National Academic Advising Association.
American Association for Higher Education (AAHE). (1996). Principles for good practice for assessing student learning. Washington, DC: Author.
Angelo, T. (1995). Reassessing and defining assessment. AAHE Bulletin, 48(3), 49–51.
Appleby, D. C. (2007). [The contents of online advising syllabi]. Unpublished raw data.
Appleby, D. C. (2008). Advising as teaching and learning. In V. N. Gordon, W. R. Habley, & T. J. Grites (Eds.), Academic advising: A comprehensive handbook (2nd ed., pp. 85–102). San Francisco, CA: Jossey-Bass.
Campbell, S. (2005a). Why do assessment of academic advising? Part I. Academic Advising Today, 28(3), 1, 8.
Campbell, S. (2005b). Why do assessment of academic advising? Part II. Academic Advising Today, 28(4), 13–14.
Campbell, S. M. (2008). Vision, mission, goals, and programmatic objectives for academic advising programs. In V. N. Gordon, W. R. Habley, & T. J. Grites (Eds.), Academic advising: A comprehensive handbook (2nd ed., pp. 229–243). San Francisco, CA: Jossey-Bass.
Campbell, S. M., & Nutt, C. L. (2008). Academic advising in the new global century: Supporting student engagement and learning outcomes achievement. Peer Review, 10(1), 4–7.
Carlstrom, A., & Miller, M. A. (Eds.). (2013). NACADA national survey of academic advising (Monograph No. 25). Manhattan, KS: National Academic Advising Association.
Carnegie Foundation for the Advancement of Teaching. (2010). Basic classification.
Chen, H. L., & Black, T. C. (2010). Using e-portfolios to support an undergraduate learning career: An experiment with academic advising. EDUCAUSE Quarterly, 33(4).
Council for the Advancement of Standards in Higher Education (CAS). (2008). Academic advising programs: CAS standards and guidelines.
Creamer, D. G. (2000). Use of theory in academic advising. In V. N. Gordon & W. R. Habley (Eds.), Academic advising: A comprehensive handbook (pp. 18–34). San Francisco, CA: Jossey-Bass.
Creamer, E. G., & Scott, D. W. (2000). Assessing individual advisor effectiveness. In V. N. Gordon & W. R. Habley (Eds.), Academic advising: A comprehensive handbook (pp. 339–348). San Francisco, CA: Jossey-Bass.
Crookston, B. B. (2009). A developmental view of academic advising as teaching. NACADA Journal, 29(1), 78–82. (Reprinted from Journal of College Student Personnel, 13, 1972, pp. 12–17; NACADA Journal, 14[2], 1994, pp. 5–9)
Cuseo, J. (2008). Assessing advisor effectiveness. In V. N. Gordon, W. R. Habley, & T. J. Grites (Eds.), Academic advising: A comprehensive handbook (2nd ed., pp. 369–385). San Francisco, CA: Jossey-Bass.
Cuseo, J. (2012). Academic advising and student retention: Empirical connections & systemic interventions.
Erlich, R. J., & Russ-Eft, D. (2011). Applying social cognitive theory to academic advising to assess student learning outcomes. NACADA Journal, 31(2), 5–15.
Ewell, P. (2009, November). Assessment, accountability, and improvement: Revisiting the tension (NILOA Occasional Paper No. 1). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.
Field, A. (2009). Discovering statistics using SPSS (3rd ed.). London, England: SAGE.
Habley, W. R. (2004). The status of academic advising: Findings from the ACT sixth national survey (Monograph No. 10). Manhattan, KS: National Academic Advising Association.
Habley, W. R. (2005). Developing a mission statement for the academic advising program.
Hemwall, M. K., & Trachte, K. C. (2005). Academic advising as learning: 10 organizing principles. NACADA Journal, 25(2), 74–83.
Huba, M. J., & Freed, J. E. (2000). Learner centered assessment on college campuses: Shifting the focus from teaching to learning. Needham Heights, MA: Allyn & Bacon.
Hurt, R. L. (2007). Advising as teaching: Establishing outcomes, developing tools, and assessing student learning. NACADA Journal, 27(2), 36–40.
Ikenberry, S., & Kuh, G. (2009). [National survey of provosts and chief academic officers on campus assessment activities]. Bloomington, IN: National Institute for Learning Outcomes Assessment.
Krathwohl, D. R. (2002). A revision of Bloom's taxonomy: An overview. Theory into Practice, 41, 212–218.
Lowenstein, M. (2005). If advising is teaching, what do advisors teach? NACADA Journal, 25(2), 65–73.
Macaruso, V. (2007). From the co-editors: Brief report on the NACADA Commission on the Assessment of Advising 2004 results. NACADA Journal, 27(2), 3–8.
Maki, P. (2004). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus.
Martin, H. (2007). Constructing learning objectives for academic advising.
McClellan, J. L. (2011). Beyond student learning outcomes: Developing comprehensive, strategic assessment plans for advising programs. Journal of Higher Education Policy and Management, 33, 641–652.
McMillan, J. H. (2012). Educational research: Fundamentals for the consumer (6th ed.). Boston, MA: Pearson.
Melander, E. R. (2005). Advising as educating: A framework for organizing advising systems. NACADA Journal, 25(2), 84–91.
National Academic Advising Association (NACADA). (2006). NACADA concept of academic advising.
NACADA. (2011). Sample academic advising syllabi from a variety of colleges and resources.
NACADA. (2012). [NACADA member demographic information as of February 28, 2012]. Unpublished raw data. Manhattan, KS: Author.
NACADA. (2014). Regions.
Palomba, C. A. (2002a). Characteristics of effective outcomes assessment: Foundations and examples. In T. W. Banta (Ed.), Building a scholarship of assessment (pp. 261–283). San Francisco, CA: Jossey-Bass.
Palomba, C. A. (2002b). Scholarly assessment of student learning in the major and general education. In T. W. Banta (Ed.), Building a scholarship of assessment (pp. 201–222). San Francisco, CA: Jossey-Bass.
Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco, CA: Jossey-Bass.
Robbins, R. (2009). Evaluation and assessment in career advising. In K. F. Hughey, D. Nelson, J. K. Damminger, & B. McCalla-Wriggins (Eds.), The handbook of career advising (pp. 266–292). San Francisco, CA: Jossey-Bass.
Robbins, R. (2011). Assessment and accountability of academic advising. In J. Joslin & N. Markee (Eds.), Academic advising administration: Essential knowledge and skills for the 21st century (Monograph No. 22) (pp. 53–64). Manhattan, KS: National Academic Advising Association.
Robbins, R. (2013). Assessment of peer advising. In H. Koring & D. Zahorik (Eds.), Peer advising and mentoring: A guide for advising practitioners (2nd ed., pp. 129–140). Manhattan, KS: National Academic Advising Association.
Robbins, R., & Zarges, K. M. (2011). Assessment of academic advising: A summary of the process.
Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). Bolton, MA: Anker.
Texas Gen. Laws 61, § 077. Academic Advising Assessment. 2011 Texas Acts Education Code. 31 March 2011.
White, E. R. (2006). Using CAS standards for self-assessment and improvement.

Appendix.

Student learning outcomes presented in survey on academic advising assessment

Cognitive outcomes

Student knows

  • the degree requirements of the college/department.

  • department/college policies (e.g., late withdrawal from courses, grade replacement, late adding of a course).

  • about academic majors available.

  • how to schedule an advising appointment.

  • how to compute his/her GPA.

  • where to locate resources on campus (e.g., tutoring, career services, financial assistance).

Behavioral outcomes

Student is able to

  • demonstrate effective decision-making skills.

  • develop long-term plans to meet education goals.

  • use an educational plan to manage progress toward degree completion.

  • engage with appropriate resources to meet individual need for academic success.

  • interpret a degree audit report for educational planning.

  • prepare questions for an advising appointment.

  • use the online registration system to enroll in classes.

  • access academic advising in a timely manner.

Affective outcomes

Student values/appreciates

  • the benefits of the general education requirements (a liberal education).

  • how personal values relate to life goals.

  • how his/her academic major reflects personal interests.

  • having a sense of ownership of one's educational experience.

  • how academic advising has contributed to his or her educational experience.

  • the role of internships as part of his/her undergraduate experience.

  • the importance of interacting with faculty members.

Author notes

Keith L. Powers, PhD, is Director of the Advising and Resource Center in the Pott College of Science, Engineering, and Education at the University of Southern Indiana. He earned his doctorate in counseling and student development–student affairs in higher education from Kansas State University. He has been a member of NACADA since 2004. This research was revised from his dissertation. Correspondence should be addressed to him at kpowers2@usi.edu.

Aaron H. Carlstrom, PhD, is a clinical assistant professor in the Psychology Department at the University of Wisconsin–Parkside in Kenosha, Wisconsin. Dr. Carlstrom earned his PhD in counseling psychology from the University of Wisconsin–Milwaukee. His research interests focus on the career development of students in middle school through college. He was the editor of the 2011 NACADA National Survey of Academic Advising. He has been a member of NACADA since 2009. Dr. Carlstrom can be reached at Aaron.Carlstrom@uwp.edu.

Kenneth F. Hughey, PhD, is professor and Chair of the Department of Special Education, Counseling, and Student Affairs in the College of Education at Kansas State University. He serves as program director for the graduate programs in academic advising offered through the Division of Continuing Education at Kansas State and serves on the Editorial Board of the NACADA Journal.