Using Predictive Analytics to Improve Retention

Identifying at-risk students and boosting student success

Predictive analytics has proven to be a crucial element for managing enrollment at many institutions. However, there is often a lack of clarity about how best to utilize these tools to boost retention.

In this web seminar, presenters discussed using predictive analytics to improve retention modeling, and how to distribute this information to end users to ensure the data created is actionable. The director of institutional research and effectiveness at Bellarmine University in Kentucky outlined how the institution is using predictive analytics to identify at-risk students and boost retention. Bellarmine received the 2018 ACPA Award for Innovative Academic Support Initiative for this work, and now has over 70 faculty and staff involved in the process.


Drew Thiemann
Director of Institutional Research and Effectiveness
Bellarmine University (Ky.)

Jon MacMillan
Senior Data Analyst
Rapid Insight

Drew Thiemann: There’s been a great shift—maybe you’d even say a sea change or paradigm shift—in the populations we serve. We have a different call now in terms of who our students and potential students may be. Predictive modeling is one of many tools we can use to ascertain where we are in this space and to help students succeed.

It’s not as simple as just putting students on a path and then peeling back the curtain and saying, “Go toward this outcome.” The outcome may be obvious, but we have to understand that experiences and behaviors will be constantly changing in many dimensions.

Jon MacMillan: In a survey about how widespread data use should be on campus, nearly 90% of respondents thought it should be widespread or very widespread across the board. But in reality, only 36% said it currently is. So where is the disconnect? Is it the tools that are in place? Is it the people who are creating the data? What Drew implemented in the early stages of building a predictive model was involving people from all different departments and getting a good conversation going about what data would be beneficial to them. That helped to create a lot of buy-in; people understood they were a part of the investment.

Drew Thiemann: Why did we move to predictive modeling? There are a lot of ins and outs and a lot of on-ramps and off-ramps, you might say. Predictive modeling simply helps us answer: What do we know about the cohorts we’ve brought into Bellarmine in the past that leads to a very simple binary outcome, one well suited to logistic regression? Did they stay or did they go?
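The binary stay-or-go outcome Thiemann describes is what makes logistic regression a natural fit. A minimal sketch of that idea follows; the feature names and synthetic data are illustrative assumptions, not Bellarmine's actual model inputs.

```python
# Minimal sketch: fit a logistic regression on historical cohort data,
# then score a new student. Features here are hypothetical examples,
# not the variables Bellarmine actually uses.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "past cohorts": high-school GPA, first-term credit hours,
# and a lives-on-campus flag.
n = 500
X = np.column_stack([
    rng.normal(3.0, 0.5, n),   # high-school GPA
    rng.integers(12, 19, n),   # first-term credit hours
    rng.integers(0, 2, n),     # lives on campus (0/1)
])

# Synthetic binary outcome: 1 = retained, 0 = left.
logit = -6 + 2.0 * X[:, 0] + 0.1 * X[:, 1] + 0.5 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(X, y)

# Score a new student: a retention probability between 0 and 1.
new_student = np.array([[2.4, 12, 0]])
prob_retained = model.predict_proba(new_student)[0, 1]
```

The fitted coefficients also answer the "what do we know about past cohorts" question: each one indicates how strongly a factor has historically been associated with staying or leaving.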

Jon MacMillan: When we’re predicting outcomes such as student retention, we’re not going to build a model that perfectly predicts every student outcome. You’re going to have instances in which students leave for a wide array of reasons that we can’t necessarily predict. So it’s not necessarily turning students into a score, but instead it’s identifying those significant factors that have historically proven to indicate attrition or retention, and using that knowledge to your advantage. Use all the tools and resources available to you to be proactive in your outreach.

Drew Thiemann: Predictive modeling gives us the ability to tailor interventions in a more nuanced and strong way. That allows us to make a difference for certain types of students in a way we hadn’t before, while also opening up pathways to work with other types of students.

We wanted to understand students in all of their complexities. We’re trying to move up from just information we share descriptively about students to a place where we can say, “Here’s the real, important takeaway from this. Here’s the upshot.” We believe that individual students, their experiences and their behavior all matter, so we need to try to find a way not to be post-positivist, but to understand things in a more constructivist way.

Here’s how we actually used Veera: We started with an idea that we would pump one, two and three years’ worth of demographic data about the cohorts into the system and train a model based on that, and then iterate on it. Then, we would score our first checkpoint, and if a student scores in a certain way, we would use FIRE, our case management system, to deliver soft communications through our strategic plan. This is all part of a unified approach involving dozens of people in student success planning and programming. Phase two was done on census day, phase three was done after fall midterms and phase four was done after fall grades.
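The checkpoint workflow Thiemann outlines — score every student at each phase, route low scorers to case management — can be sketched in outline. The threshold value and the stand-in scoring function below are assumptions for illustration; the hand-off to a system like FIRE would replace the returned list.

```python
# Sketch of phased checkpoint scoring. The 0.7 threshold and the toy
# scoring function are illustrative assumptions, not Bellarmine's
# actual process or model.

def score_cohort(students, predict_retention, checkpoint, threshold=0.7):
    """Return (id, score) pairs for students whose predicted retention
    falls below the threshold at this checkpoint, so soft outreach can
    be triggered through a case management system."""
    flagged = []
    for student in students:
        p = predict_retention(student, checkpoint)
        if p < threshold:
            flagged.append((student["id"], round(p, 2)))
    return flagged

# Toy usage with a stand-in scoring function.
def toy_predict(student, checkpoint):
    return student["gpa"] / 4.0

students = [{"id": "A1", "gpa": 2.0}, {"id": "A2", "gpa": 3.8}]
flagged = score_cohort(students, toy_predict, "census day")
```

Re-running the same routine after each phase (census day, fall midterms, fall grades) keeps the flagged list current as new behavioral data arrives.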

Rapid Insight software pushes the envelope for us. It allowed us to make quick work of the datasets so that we could move on quickly to phases two, three and four. We rarely had to spend hours and hours getting these things together.

To watch this web seminar in its entirety, please visit
