Many institutions of higher education struggle with low retention rates. One state liberal arts college addressed this concern by assigning an academic case manager to higher-risk students. This project evaluated the effectiveness of the case manager on student credit hours and retention using a randomized controlled trial. The case manager contacted assigned students regularly, meeting with students and helping them navigate college and their classes. We found that students randomly assigned to the case manager earned higher grades, completed more credits, and were more likely to return to campus for the second semester of the academic year.

Colleges and universities strive to graduate larger percentages of their students. In 2018–2019, 63 percent of first-time, full-time (FTFT) undergraduates at four-year degree-granting institutions graduated within 6 years (Irwin et al., 2021). A first step in increasing graduation rates is retaining first-year students. Yet in fall 2019, only 81 percent of the previous fall's FTFT undergraduates at four-year degree-granting institutions returned to campus (Irwin et al., 2021). One promising intervention to help improve retention, and ultimately graduation, is increased support in advising, coaching, and other nonfinancial areas. Quality advising can increase students' self-efficacy (Erlich & Russ-Eft, 2013) and loyalty to their university (Vianden & Barlow, 2015). Support services are also an increasingly prominent part of the student experience. However, to date, evidence demonstrating their causal role in improving student outcomes is scarce, especially when compared to studies of advising.

Much of the research evaluating advising in higher education has focused on student satisfaction, not student success (Young-Jones et al., 2013). Some studies explored how advising affects student success, suggesting that students who met more often with their advisors knew more about the institution and its policies (Smith & Allen, 2014), earned higher GPAs (Mu & Fosnacht, 2019; Young-Jones et al., 2013), and were less likely to stop out of the institution (Swecker et al., 2013). In each of these cases, however, advising measures relied on students taking advantage of available opportunities. Students who were already more likely to be successful may have interacted more profitably or more often with their advisors, thereby biasing estimates toward finding positive results from advising.

As higher education institutions invest more in student support services, it is increasingly important to understand the effectiveness of these services. The purpose of this study was to evaluate, using causal methods, the effectiveness of additional advising services provided to FTFT undergraduate students at a public liberal arts college. We used a randomized controlled trial (RCT), assigning treated students to an academic case manager (ACM), and then empirically tested how outcomes for students assigned to the ACM differed from those of students receiving only traditional advising services.

Current literature evaluating the effectiveness of advising is often either descriptive or based on observational studies, which are subject to bias. Randomly assigned advising interventions, however, can mitigate this bias and provide plausibly causal estimates. To date, at least three studies have used random assignment to evaluate the effectiveness of advising, and collectively they suggest that female college students, and possibly male students, benefit from more intensive advising. Bettinger and Baker (2014) found that students randomly assigned to receive coaching services had comparatively higher retention and college completion rates. Angrist et al. (2009) randomly assigned Canadian college students to four groups: one with access to an array of services, one with financial incentives to earn a minimum grade point average (GPA), one with both treatments, and a control group. Women who received both treatments used more services and earned higher GPAs than women in the services-only group or the control group. Treatment did not affect men's use of services or short-term outcomes. Evans et al. (2017) found similar results for Texas community college students: women who received comprehensive case management and emergency financial assistance completed their degrees significantly more often than the control group.

Other research has estimated plausibly causal effects of providing additional academic, mental health, and/or financial support to college students; some studies estimate the combined effects of academic and financial support. Scrivener and Weiss (2009), using an RCT, found that enhanced counseling and the introduction of a small stipend increased registration rates in subsequent semesters and increased credit hour attainment among community college students. Scrivener et al. (2015) evaluated the City University of New York system's Accelerated Study in Associate Programs (ASAP), which provided financial and academic support to community college students. This RCT found increased persistence, improved credit accumulation, and higher rates of associate degree completion among treated students. Sommo and Ratledge (2016) estimated comparable results for ASAP in Ohio.

Similar programs targeting high schoolers have found comparable results. Academic support services and financial incentives raised high school graduation rates and college enrollment (Oreopoulos et al., 2017), increased adult earnings and employment, and reduced welfare receipt (Lavecchia et al., 2020).

One study used causal methods to estimate the separate effects of academic support, generous financial support, or both. Clotfelter et al. (2018) used regression discontinuity and difference-in-differences to estimate the effects of these supports on low-income students' college success. Simply receiving financial assistance did not improve students' outcomes; when the program added academic support (faculty and peer mentoring, workshops, etc.), students earned more credits, improved their grades, and possibly graduated at higher rates.

This study examined a public liberal arts college of fewer than 4,000 students that primarily follows a faculty advising model. Faculty advisors are assigned to students in their specific major to provide academic, professional, and course registration advising. First-year students take a first-year seminar, which is topical and tied to the discipline of the faculty member teaching it; the instructor of that seminar is their advisor of record during their first semester. In their second semester, students are assigned a faculty advisor in their intended major, whether or not they have formally declared a major. Undecided students are assigned one of three professional staff advisors in the Academic Success Center.

The Office of Academic Advising employs three full-time advisors who work with undecided students on course registration, withdrawals, and academic success issues such as tutoring. They also advise students on a host of secondary issues that affect student success, such as financial issues, health and counseling, and available social services. First-year retention was a particular concern at the institution: retention stood at 78 percent in 2016 and fell to 72.5 percent in 2019, well below the national average of 81 percent (Irwin et al., 2021).

The study targeted below-average students who were not eligible for existing targeted advising programs. We identified all entering FTFT students in fall 2019 and fall 2020 with SAT or ACT scores below the institutional median. We then excluded two groups of students who already received additional advising services: college athletes, who receive additional support from the assistant athletics director for Student-Athlete Success, and students enrolled in a specialized, high-touch program that provides more intensive support for first-generation students. The resulting target population consisted of students with below-average test scores who were not already receiving more intensive advising services.

For this study, the college hired an ACM. The ACM had a background in counseling and reported to the associate director of Academic Advising; she met weekly with the other advisors and her supervisor. In October 2019, the ACM attended NACADA's annual conference and spent several weeks reviewing the literature on the case management model. She followed a five-step model of relationship building in her work with students, developed an intake form to understand each student's contextual situation, and generated case notes during each student visit.

After students arrived on campus in fall 2019, the ACM emailed her assigned students to set up appointments. Subsequent communications included texting students, meeting them in the dining hall for casual conversations, and providing comfortable meeting space in her campus office. She helped students identify barriers to their success and matched them to relevant campus resources; some resources, such as on-campus success workshop programming, were available to all students. In contrast to the typical advising students received, the ACM reached out to her assigned students, encouraged them to meet with her, and made contact at least once every 2 weeks. Students receiving typical advising were served during course registration and when they reached out to an advisor.

The COVID-19 pandemic shifted the institution to online learning in the middle of spring 2020. Advising similarly shifted to virtual delivery. The ACM continued to reach out to students online and meet with students virtually; the shift in modality led to availability across a wider range of hours than before. Conversations with students included concerns about food insecurity, domestic violence, and financial stress; she also helped her assigned students access community resources. The entering fall 2020 cohort received communications and met with the ACM virtually.

This study used a randomized controlled trial, randomly assigning FTFT students to an ACM. The sample population comprised FTFT students who scored below the institution's median SAT or ACT score. We excluded students participating in existing high-touch advising programs, including student-athletes. In short, we focused not on the most at-risk students but on students with below-median test scores who were not part of other special programs geared toward retention.

Treated students received additional advising services from a full-time advisor, the academic case manager (ACM), who had training in counseling. There were two cohorts: fall 2019 and fall 2020. In fall 2019, 252 students met the population requirements; we randomly assigned 100 of them to the ACM, with the rest receiving typical advising services. This caseload of 100 students is significantly smaller than the average caseload of 260 students per full-time advisor at public bachelor's institutions (Carlstrom & Miller, 2013, Table 6.28). In fall 2020, 200 students met the population requirements, and we randomly assigned another 100 students to the ACM. For each cohort, we used a random number generator to assign a number to each student in the target population and assigned the 100 students with the lowest numbers to the treatment.
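
As a concrete illustration, the assignment step can be reproduced in a few lines of code. This is a minimal sketch of the procedure described above, not the authors' actual script; the student ID format and the seed are illustrative assumptions.

```python
import random

def assign_treatment(student_ids, n_treated=100, seed=2019):
    """Draw a random number for each student and treat the n_treated lowest draws.

    Mirrors the procedure described in the text; the seed is illustrative.
    """
    rng = random.Random(seed)
    draws = {sid: rng.random() for sid in student_ids}
    ranked = sorted(student_ids, key=draws.get)
    treated = set(ranked[:n_treated])
    return {sid: int(sid in treated) for sid in student_ids}

# Fall 2019 target population: 252 eligible students, 100 assigned to the ACM.
assignment = assign_treatment([f"S{i:03d}" for i in range(252)])
print(sum(assignment.values()))  # 100 treated, 152 control
```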

An RCT provides a relatively straightforward empirical analysis. Because the treatment is randomized, a comparison between the average outcomes of the treated group and the control group provides an estimate of the effect of the treatment. The randomization, in effect, controls for other student characteristics that may affect student outcomes.

We considered a variety of outcome variables: credit hours registered, credit hours completed, retention, and GPA. First-year retention, credit accumulation, and grades high enough to remain in good academic standing are all early predictors of on-time graduation. Additionally, we used Ordinary Least Squares to estimate multivariate regressions; adding controls to the analysis can increase the precision of estimates while controlling for any student characteristics left unbalanced across treatment status. For student i, we estimated:

Outcomei = β0 + β1Treatedi + γXi + εi

The coefficient of interest is β1, the average difference in outcomes between treated and control students, accounting for student characteristics. We controlled for a vector of student characteristics, Xi, that included the student's SAT score and indicators for whether the student was a Pell Grant recipient, a first-generation college student, and male. The institution has test-optional admissions, although enrolled students are required to provide their ACT or SAT scores; for students submitting only ACT scores, we converted ACT scores to SAT scores using concordance tables (College Board & ACT, 2018). Although the ACM met only with treated students, many students learn from their peers (National Survey of Student Engagement, 2014); any spillover of the ACM's services to control students would bias the estimates against finding a positive effect of the ACM on student success. The error term, εi, reflects idiosyncratic differences among students.
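
A minimal sketch of this specification using Python's statsmodels, shown for one outcome (first-semester GPA). The file name and column names are illustrative assumptions, since the student records are not published with the article, and the robust-standard-error choice is not stated in the text.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level file; column names are assumptions, not the
# authors' actual variable names.
df = pd.read_csv("students_fall2019.csv")

# OLS of the outcome on the treatment indicator plus the controls named in
# the text: SAT score, Pell Grant receipt, first-generation status, and male.
model = smf.ols(
    "fall_gpa ~ treated + sat_score + pell + first_gen + male",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust SEs; an assumption, not stated in the article

print(model.params["treated"])  # estimate of β1, the intent-to-treat effect
```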

Table 1 presents cohort-specific summary statistics for the sample. Sample sizes are somewhat smaller than the initial population because some students failed to enroll and, in later semesters, some stopped out of the institution. In the fall 2019 cohort, treated students were somewhat more likely to be Pell Grant recipients, first-generation college students, and female. Treated students also averaged slightly lower SAT scores than control group students, a difference that is statistically significant at the 10 percent level. In the fall 2020 cohort, treated students were somewhat less likely to be Pell Grant recipients or first-generation students, had slightly lower SAT scores, and were significantly more likely to be female.

Table 1

Checking for Balance in Treatment and Control Groups: Student Characteristics


The RCT design implies that mean differences between groups estimate average intent-to-treat effects. Table 2 presents these means and t-tests for both cohorts. We consider outcomes in the order in which students experience them, first for the fall 2019 cohort and then for the fall 2020 cohort.
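
Each comparison in Table 2 amounts to a two-sample t-test of an outcome by treatment status. A minimal sketch follows, using the same assumed file and column names as the regression example above; the article does not state whether equal-variance or Welch tests were used, so the Welch variant here is an assumption.

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("students_fall2019.csv")  # hypothetical file, as above

def itt_difference(df, outcome):
    """Mean difference and t-test for one outcome by treatment status."""
    treated = df.loc[df["treated"] == 1, outcome].dropna()
    control = df.loc[df["treated"] == 0, outcome].dropna()
    t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)  # Welch's t-test
    return treated.mean() - control.mean(), t_stat, p_value

diff, t_stat, p_value = itt_difference(df, "fall_hours_completed")
```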

Table 2

Differences in Student Success Measures


For the fall 2019 cohort, we began with fall 2019 outcomes, such as credit hours completed and first-semester GPA. Treated students completed significantly more credit hours (almost one hour more on average) and earned higher fall 2019 GPAs; these differences are statistically significant at the 10 percent and 1 percent levels, respectively, showing that students randomly assigned to the ACM performed better in their first semester of college. We then considered outcomes in spring 2020: whether students were retained for the second semester and returned to campus, how many credit hours they registered for and completed, and their GPA. Treated students were more likely to return to campus for spring 2020 and registered for more credits; completed credit hours and GPA did not differ between groups in spring 2020. We then considered outcomes for the fall 2019 cohort during their second year of college. We observed no other statistically significant differences between groups for the fall 2020 or spring 2021 outcomes, although the point estimates generally suggest that treated students performed better.

We also evaluated the cohort's outcomes at the end of 2 years. To retain as many students in the sample as possible for the GPA variable, we assigned the last known GPA to students who had left the university. We observed higher retention rates, more completed hours, and higher GPAs among treated students, although the difference in completed hours was not statistically significant.

We then analyzed outcomes for the second cohort, which entered in fall 2020 during the pandemic. For this cohort, only one of the t-tests indicated a statistically significant difference between the treated and control groups: treated students registered for more credit hours in their second semester (spring 2021) than control students. Mean outcomes were generally higher for treated students, but almost none of these differences were statistically significant.

These t-tests estimated the effect of being assigned to the ACM and receiving case management from a trained counselor in addition to standard advising. However, because one student characteristic in each cohort differed significantly across the random assignment, we also estimated multivariate regressions. We used Ordinary Least Squares to estimate the effects of the treatment on student success, controlling for SAT score as well as indicators for whether the student was a Pell Grant recipient, a first-generation college student, and male.

The results, as expected, were similar to the t-tests presented in Table 2 and are available upon request. For the fall 2019 cohort, we continued to observe that treated students performed better in fall 2019, with significantly higher GPAs and earned credit hours, and that they registered for significantly more credit hours in their second semester. Treated students were also more likely to return in spring 2020. These effects were reasonably large: treated students earned GPAs almost a third of a letter grade higher and completed almost one more credit hour than control students. They were also about 7 percentage points more likely to be retained, roughly consistent with the mean difference in Table 2, where treated students had 97 percent retention and control students 91 percent. We observed no significant difference in cumulative hours earned or GPA by the end of spring 2020. In their second year, we observed no statistically significant differences between the groups. At the end of the sample period, treated students had stayed longer, completed more hours, and earned higher GPAs, although the effect on GPA was the only statistically significant estimate. We estimated similar regressions for the fall 2020 cohort. Consistent with the t-tests, we observed a general pattern in which treated students registered for and earned more credit hours and earned higher GPAs, although these differences were mostly not statistically significant.

Angrist et al. (2009) and Evans et al. (2017) found stronger effects of their interventions for women than for men. We allowed the effect of treatment to differ by gender; however, in results not presented here, the estimated effects for treated women and treated men were not statistically different. The coefficients on the interaction term with female tended to be positive but not statistically different from zero. We found similar results when pooling the two cohorts.
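
Allowing the treatment effect to differ by gender simply adds an interaction term to the specification above. A sketch follows, again with the assumed file and column names from the earlier examples.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students_fall2019.csv")  # hypothetical file, as above

# treated * female expands to treated + female + treated:female; the
# coefficient on treated:female is the additional treatment effect for women.
interaction = smf.ols(
    "fall_gpa ~ treated * female + sat_score + pell + first_gen",
    data=df,
).fit()
print(interaction.params["treated:female"])
```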

Although an RCT provides cleaner identification than many other empirical strategies, this study has some limitations. Ultimately, whether students graduate is the outcome of greatest interest; as more time elapses, future research can examine graduation directly. The size of the institution prevents large sample sizes, limiting the power of the analysis. In addition, the timing of the pandemic changed the implementation of the ACM's work from in-person to virtual. The fall 2019 cohort received a different treatment than the fall 2020 cohort, in that students entering college in fall 2019 received in-person services for most of their first academic year; the fall 2020 cohort benefited less from in-person treatment than did the pre-pandemic FTFT students. Interestingly, Peters et al. (2023) observed little difference between virtual and in-person advising; more research comparing the modalities would be useful.

The treatment assigned students to one person providing academic case management; the RCT compares students who received this person’s services to those who did not. The ACM provided a range of services, but we are unable to separate out which parts of her work increased retention. Detailed, causal analysis of different approaches to advising would be beneficial future research.

This quantitative study found that students receiving additional advising services from an ACM benefited, at least while the university operated fully in person before the pandemic. The sample population comprised FTFT students with below-median test scores at a public liberal arts college. Using a randomized controlled trial, the plausibly causal estimates show that treated students earned higher first-semester GPAs, earned more credit hours in their first semester, and were more likely to return for the spring semester. Women may have experienced somewhat greater benefits from the ACM, although the differences by gender were not statistically significant.

The benefits of the ACM outweigh the cost of the additional personnel. The annual cost of salary, benefits, and office equipment for the ACM was less than $55,000. Considering only the increase in fall-to-spring retention of roughly 7 percentage points, her hundred-student caseload implies 7 more students enrolling and paying tuition in the spring. First-time, first-year students must live on campus and purchase a meal plan, so the expected average annual revenue per student for tuition, fees, housing, and dining services was $8,445. Retaining 7 more students one semester longer thus retains $59,115 in revenue. Even with no other improvements in student success, increased retention pays for employing one more ACM. Although unretained students can be replaced with transfer students, transfers into this institution in the middle of the academic year are uncommon.
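
The back-of-the-envelope calculation above can be written out directly; all figures come from the text, and the script is simply an arithmetic check.

```python
# Break-even check on the ACM position, using figures reported in the text.
acm_annual_cost = 55_000        # salary, benefits, and office equipment (upper bound)
caseload = 100                  # students assigned to the ACM
retention_gain = 0.07           # ~7 percentage-point fall-to-spring retention gain
revenue_per_student = 8_445     # tuition, fees, housing, and dining per retained student

extra_retained = round(retention_gain * caseload)        # about 7 students
retained_revenue = extra_retained * revenue_per_student  # 7 * 8,445 = 59,115
print(retained_revenue > acm_annual_cost)                # True: the position pays for itself
```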

Results from this study support increased institutional investment in case management. The ACM was particularly effective while students remained on campus. The benefits of the ACM may stem from her ability to form relationships with her advisees; relationships are more easily built and maintained in person.

Angrist, J., Lang, D., & Oreopoulos, P. (2009). Incentives and services for college achievement: Evidence from a randomized trial. American Economic Journal: Applied Economics, 1(1), 136–163. https://doi.org/10.1257/app.1.1.136
Bettinger, E. P., & Baker, R. B. (2014). The effects of student coaching: An evaluation of a randomized experiment in student advising. Educational Evaluation and Policy Analysis, 36(1), 3–19. https://doi.org/10.3102/0162373713500523
Carlstrom, A. H., & Miller, M. A. (Eds.). (2013). 2011 NACADA national survey of academic advising (Monograph No. 25). National Academic Advising Association. https://nacada.ksu.edu/Resources/Clearinghouse/View-Articles/2011-NACADA-National-Survey.aspx
Clotfelter, C. T., Hemelt, S. W., & Ladd, H. F. (2018). Multifaceted aid for low-income students and college outcomes: Evidence from North Carolina. Economic Inquiry, 56(1), 278–303. https://doi.org/10.1111/ecin.12486
College Board, & ACT, Inc. (2018). Guide to the 2018 ACT/SAT concordance. https://collegereadiness.collegeboard.org/pdf/guide-2018-act-sat-concordance.pdf
Erlich, R. J., & Russ-Eft, D. F. (2013). Assessing student learning in academic advising using social cognitive theory. NACADA Journal, 33(1), 16–33. https://doi.org/10.12930/NACADA-13-135
Evans, W. N., Kearney, M. S., Perry, B. C., & Sullivan, J. X. (2017). Increasing community college completion rates among low-income students: Evidence from a randomized controlled trial evaluation of a case management intervention (Report No. 24150). National Bureau of Economic Research. https://doi.org/10.3386/w24150
Irwin, V., Zhang, J., Wang, X., Hein, S., Wang, K., Roberts, A., York, C., Barmer, A., Bullock Mann, F., Dilig, R., & Parker, S. (2021). Report on the condition of education 2021 (NCES 2021-144). U.S. Department of Education, National Center for Education Statistics. https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2021144
Lavecchia, A. M., Oreopoulos, P., & Brown, R. S. (2020). Long-run effects from comprehensive student support: Evidence from Pathways to Education. American Economic Review: Insights, 2(2), 209–224. https://doi.org/10.1257/aeri.20190114
Mu, L., & Fosnacht, K. (2019). Effective advising: How academic advising influences student learning outcomes in different institutional contexts. The Review of Higher Education, 42(4), 1283–1307. https://doi.org/10.1353/rhe.2019.0066
National Survey of Student Engagement (NSSE). (2014). Bringing the institution into focus: Annual results 2014. Indiana University Center for Postsecondary Research.
Oreopoulos, P., Brown, R. S., & Lavecchia, A. M. (2017). Pathways to Education: An integrated approach to helping at-risk high school students. Journal of Political Economy, 125(4), 947–984. https://doi.org/10.1086/692713
Peters, B., Burton, D., & Rich, S. (2023). Post COVID-19: A comparative assessment of in-person and virtual academic advising. NACADA Review, 4(1), 2–15. https://doi.org/10.12930/NACR-D-22-10
Scrivener, S., & Weiss, M. J. (2009). More guidance, better results: Three-year effects of an enhanced student services program at two community colleges. MDRC. https://www.mdrc.org/publication/more-guidance-better-results
Scrivener, S., Weiss, M. J., Ratledge, A., Rudd, T., Sommo, C., & Fresques, H. (2015). Doubling graduation rates: Three-year effects of CUNY's Accelerated Study in Associate Programs (ASAP) for developmental education students. MDRC. https://www.mdrc.org/work/publications/doubling-graduation-rates
Smith, C. L., & Allen, J. M. (2014). Does contact with advisors predict judgments and attitudes consistent with student success? A multi-institutional study. NACADA Journal, 34(1), 50–63. https://doi.org/10.12930/NACADA-13-019
Sommo, C., & Ratledge, A. (2016). Bringing CUNY Accelerated Study in Associate Programs (ASAP) to Ohio: Early findings from a demonstration in three community colleges. MDRC. https://www.mdrc.org/work/publications/bringing-cuny-accelerated-study-associate-programs-asap-ohio
Swecker, H. K., Fifolt, M., & Searby, L. (2013). Academic advising and first-generation college students: A quantitative study on student retention. NACADA Journal, 33(1), 46–53. https://doi.org/10.12930/NACADA-13-192
Vianden, J., & Barlow, P. J. (2015). Strengthen the bond: Relationships between academic advising quality and undergraduate student loyalty. NACADA Journal, 35(2), 15–27. https://doi.org/10.12930/NACADA-15-026
Young-Jones, A. D., Burt, T. D., Dixon, S., & Hawthorne, J. M. (2013). Academic advising: Does it really impact student success? Quality Assurance in Education, 21(1), 7–19. https://doi.org/10.1108/09684881311293034

Author notes

This project was funded by the University of North Carolina System’s Student Success Innovation Lab with support from the Laura and John Arnold Foundation and ECMC Foundation. We thank Shun Robertson, Tonya Walton, and Andrew Kelly for their assistance. Registration of the randomized control trial can be found at OSF at doi: 10.17605/OSF.IO/8QCS3

Angela Dills is the Gimelstob-Landry Distinguished Professor at Western Carolina University. She earned her B.A. in economics and Spanish at the University of Virginia and an MA and PhD in economics at Boston University. Her research focuses on education policy at all levels. Dr. Dills may be reached at [email protected]

Deaver Traywick is the Director of Institutional Planning and Accreditation Support at UNC Asheville. He earned his B.A. in East Asian Studies at Davidson College and an MFA in Creative Writing at the University of South Carolina. Prior to moving into institutional research and planning, he served as senior director of student success at UNC Asheville, where he supervised the offices of Advising, Tutoring, Accessibility, Career Services, and Study Abroad.