Analyze student data for real progress in student success

5 signs that a data analytics mindset has infiltrated campus culture

There’s no doubt that higher ed institutions have access to tons of student data these days, but what separates actionable insights from analytics overload? Colleges and universities that crack the code can ground their student success initiatives in data-based decision-making. Others struggle and get lost in the numbers.

“Most institutions are data rich and information poor,” says Darren Catalano, who was vice president of analytics at the University of Maryland University College from 2011 to 2015. “They have data and different teams working in different systems, but they’re not looking at it holistically,” adds Catalano, now CEO of HelioCampus, a data analytics company.

Overcoming the challenge, however, often results in real progress in student success. While there is no one formula for data success, data-empowered institutions share five key attributes for deciphering information in a meaningful way to create a culture of analytics across campus.


Data is used to optimize resources.

College and university leaders would love to have the resources for one-on-one conversations with every student to discuss pathways to success. “But that’s not the reality,” says Bryan Terry, vice chancellor for enrollment management at The University of North Carolina at Greensboro.

His school works with Rapid Insight, whose analytics software produces predictive modeling scores that rate how prepared a student is—academically and socially—to persist and ultimately graduate.

“We take a much more intrusive approach by looking at students who are going to need help the most, and have specific interactions for those students,” says Terry.

For instance, UNCG has counselors cover financial planning, time management skills and other topics with low-income, first-generation students. The university is also beginning to track student ID card swipes.

“If we start to see a student has stopped going for meals, for example, that might be a flag to say this student is withdrawn,” Terry says. “Is it because of a tough course schedule? We can do specific outreach to them to find out.”
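The kind of flag Terry describes can be sketched as a simple comparison of a student's recent dining swipes against their own baseline. This is a hypothetical illustration, not UNCG's actual system; the function name and the drop threshold are assumptions.

```python
def flag_possible_withdrawal(baseline_weekly_swipes, recent_weekly_swipes, drop_ratio=0.5):
    """Flag a student whose recent meal-swipe activity falls below a
    fraction of their own baseline, suggesting disengagement."""
    if baseline_weekly_swipes == 0:
        return False  # no baseline to compare against
    return recent_weekly_swipes < drop_ratio * baseline_weekly_swipes

# A student who averaged 14 meal swipes a week but logged only 3 last week:
print(flag_possible_withdrawal(14, 3))   # True -> trigger outreach
print(flag_possible_withdrawal(14, 10))  # False -> no flag
```

The key design choice is comparing each student to their own history rather than to a campus-wide average, so a student who never ate on campus doesn't generate false alarms.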

At SUNY Empire State College, data analysis in the LMS showed administrators that the first two weeks of a semester are critical for predicting whether a student will successfully complete a course.

“Things like the number of log-ins and discussion posts are proxies for engagement,” says Mitchell S. Nesler, vice president for decision support. “Those who engaged early are more likely to finish.”

In the past, student services might have stepped in after a month, but this insight proved the need for earlier support.
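Nesler's two proxies suggest a simple early-warning rule. The sketch below is illustrative only—the weighting of posts over log-ins and the outreach threshold are assumptions, not Empire State's actual model.

```python
def early_engagement_score(logins, discussion_posts):
    """Combine two first-two-weeks LMS proxies into one engagement score.
    Posts are weighted higher than passive log-ins (an assumed weighting)."""
    return logins + 2 * discussion_posts

def needs_early_outreach(logins, discussion_posts, threshold=10):
    """Flag students whose early engagement falls below a set threshold."""
    return early_engagement_score(logins, discussion_posts) < threshold

print(needs_early_outreach(3, 1))  # True: 3 log-ins, 1 post in weeks 1-2
print(needs_early_outreach(8, 4))  # False: actively engaged
```

In practice an institution would fit the weights and threshold against historical completion data rather than setting them by hand, but the structure—score early activity, intervene below a cutoff—is the same.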


Administrators look at the right numbers.

Big data overwhelms some institutions because there’s no directive as to what analysts should be tracking. “Analytics is best when you’re trying to answer specific questions,” says Catalano of HelioCampus.

Simply scanning data for trends is a mistake.

Clint McElroy, dean of retention services at Central Piedmont Community College in North Carolina, says, “We recognize issues, and then have a conversation about the types of data that might inform us. What are the things we need to know to find out what’s causing the problem? That’s what has really changed our culture.”

For example, his team tried to figure out why about 20 percent of students were registering for classes but not attending. As it turns out, many of those students had begun but not completed federal financial aid paperwork.

This happened to be around the time that a data breach forced FAFSA to shut down its Data Retrieval Tool, which allows families to input their tax return information. That insight prompted campus advisors to reach out and offer assistance.

Keeping robust student files is simpler because the college built a centralized analytics system in-house. “It’s not unusual for multiple people to work with one student,” says McElroy.

“Data can keep track if they’ve been referred to tutoring, if they followed through. With more individualized information, we can give a more personalized experience.”

CPCC also uses Blackboard Intelligence to present aggregated data on students, course sections, enrollment, persistence and other areas via automatically populated dashboards to specific audiences across campus. The dashboards update daily—and staff don’t need to be taught how to run reports, says McElroy. “That’s really helpful.”


Officials break down big data, and take action.

Student success goals become more attainable when data is broken down to a level that’s relatable to each person’s specific role.

“We talk a lot about grad rates and retention rates, but that’s not actionable data—those students are gone already,” says Melissa Irvin, assistant vice president for student success at Tennessee Technological University.

Instead, she advises prioritizing information that helps instructors and others improve their performance. For example, professors can benefit from knowing which students in class have attempted the course before.

“Once people start seeing that they can get answers to the questions they’ve always had, they can immediately start working on outreach,” says Irvin.

At The University of Arizona, analyzing at-risk behavior indicators and then reporting them to academic colleges inspires everyone to action, says Melissa Vito, senior vice provost for academic initiatives and student success.

For example, using Civitas Learning’s Illume solution revealed that students passing English 101 with a C had a 15 percent lower overall graduation rate than those earning an A or B, even though the C grade had no major effect on year-to-year retention.

“That’s something we wouldn’t have seen without a handle on data,” Vito says. The English and writing departments are now working to improve not just grades, but the level of competency needed to finish college.

Putting the data into perspective is helpful. Vito’s team told academic advisors that keeping just two more students enrolled would raise retention rates by four points. “All of a sudden, the data didn’t seem so cold and distant,” she says.

Administrators at Maryville University in St. Louis had a similar aha! moment. “Back when we had only 300 first-year students, we would say that three students is a percentage point,” says Jennifer McCluskey, vice president for student success.
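The arithmetic behind both anecdotes is the same: in a cohort of N students, one percentage point of retention corresponds to N / 100 students. A quick check, with figures taken from the examples above:

```python
def students_per_point(cohort_size):
    """How many students equal one percentage point of retention."""
    return cohort_size / 100

# Maryville's 300 first-year students: 3 students = 1 point
print(students_per_point(300))  # 3.0

# Arizona's advisors: if 2 more retained students move retention by
# 4 points, the implied per-advisor caseload is 2 / 0.04 students
print(2 / (4 / 100))  # 50.0
```

Framing retention goals in whole students rather than rate deltas is exactly what made the data feel less "cold and distant" to advisors.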


The role of data analysts is respected.

Whether a college outsources its data solutions or has a campus research office, it’s important to have analysts in-house who also understand the institution’s student success goals.

The relationship between the analytical and operational teams is key at Indiana University Bloomington, says Doug Anderson, director of strategic planning and research. “It’s very possible to make a pretty report that doesn’t meet needs or answer questions,” he says.

“If the people creating the report are not well-connected to the decision makers, you can swim through the data all day and not learn much.”

His team makes a point to meet with end users for feedback and requests. They might learn people are comparing two data points that happen to be on different slides and have to click back and forth.

With that knowledge, the research department can generate a side-by-side chart for them next time, using the institution’s data analytics platform, Tableau.


Leaders aren’t afraid to take action or change direction.

Before Indiana U Bloomington had the tools allowing staff to dive into data independently, there was a lag time between meetings and decision-making, says Anderson.

“We used to print out a 20-page report for one meeting, and we’d leave with a ton of follow-up questions. The discussion would hit a dead end.” Now administrators can pull up data on a conference room screen—answering any questions and prompting discussion. “A policy decision can be made right then and there,” he says.

Data can help disprove certain assumptions, such as why one cohort of students is more likely to complete a course, says McElroy. “Being proven wrong can be just as helpful as being right. You might have an idea of why students are not doing well, but data may show that it’s another factor. That’s a good thing because it allows students to be more successful.”

With institutions judged on everything from retention to graduation rates to indebtedness—both by the federal government and in various rankings—it’s imperative to look for new ways to raise performance, says Vito at Arizona. “Staying intellectually curious, and going back to data to see things that we may not have seen initially, is the only way to improve.”

SUNY Empire State’s Nesler agrees. “Given that we operate in an area with scarce and valuable resources, think of institutional data as strategic,” he says. “Just like Amazon leverages the intelligence based on purchases, higher education could be doing the same thing.”

Dawn Papandrea is a Staten Island, New York-based writer and frequent contributor to UB.

