Rise of the machines on college campuses

Artificial intelligence—led by text-based chatbots—has infiltrated campus life, helping institutions improve communication, compliance and retention
By Ray Bendici | Issue: October 2018
September 21, 2018

The challenge: Reduce summer melt for incoming freshmen at Georgia State University.

The method: Use an artificially intelligent chatbot to help students complete pre-enrollment tasks, such as submitting transcripts and taking care of immunization requirements.

The results: Summer melt for 2016-17 was 21 percent less than in the prior year.

The hero: Pounce, a text-based chatbot serving as a Siri-like virtual assistant.

In practical terms, the chatbot’s role in reducing summer melt meant 116 more students than average followed through from acceptance to matriculation, says Lindsay Page, co-author of a research report on Georgia State’s efforts and assistant professor at the University of Pittsburgh School of Education.


“Students could text into the system anytime, day or night, and ask a question,” says Page. “They didn’t have to wait for an overburdened university administrator to reply.”

Immediate answers to common questions kept students moving forward and on task, and freed admissions, registration, counseling and financial aid staff to focus on the questions and scenarios that were too challenging or complex for AI.

By its second year, Pounce was exchanging nearly 2 million texts with students. Georgia State estimates hiring an additional 10 to 15 staff members would be necessary to handle that messaging volume.


Other colleges are experiencing similar success, mainly with text-based (rather than web-based) chatbots. Here’s what the buzz is about with AI experimentation and some insight into launching the technology.

Winning with Winston

Text-based chatbots take a proactive approach. Rather than waiting for a student to stumble upon the service online, a text-based chatbot reaches out directly to students via their phones.

For example, targeted students may get a message like, “Hey, I see you haven’t finished your FAFSA yet. Do you want some help to walk through it; yes or no?” Anyone replying “yes” is led through the process, either through the chatbot or a website.
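The nudge-and-branch flow described above can be sketched as a simple check on a student record followed by a yes/no routing step. The field names and the `send_text` helper below are hypothetical stand-ins, not any vendor's actual API:

```python
# Illustrative sketch of a proactive FAFSA nudge (all names are hypothetical).

def send_text(phone, message):
    """Stand-in for an SMS gateway call; here it just prints."""
    print(f"to {phone}: {message}")

def nudge_fafsa(student):
    """Text a student who hasn't finished the FAFSA; return the prompt sent,
    or None if no nudge was needed."""
    if not student["fafsa_complete"]:
        msg = ("Hey, I see you haven't finished your FAFSA yet. "
               "Do you want some help to walk through it; yes or no?")
        send_text(student["phone"], msg)
        return msg
    return None

def handle_reply(reply):
    """Route a yes/no reply: 'yes' starts the walkthrough, anything else ends it."""
    if reply.strip().lower().startswith("yes"):
        return "walkthrough"   # step the student through the form or link out
    return "done"
```

A real deployment would send the walkthrough link through the chatbot or a website, as the article notes; this sketch only shows the branching logic.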


Students accepted by Winston-Salem State University in North Carolina receive an introductory text message from Winston, the institution’s AI chatbot. Winston first congratulates them and then supports them through the onboarding process, says Joel Lee, assistant vice chancellor for enrollment management.

“We wanted a better way to meet students and communicate in the ways they are communicating. Unlike email or traditional mail, texts have basically a 100 percent open rate.”

Introduced a little more than a year ago, Winston can now answer 75 percent of inquiries—or 3,000 to 4,000 questions—from incoming students. This past June, the peak period for incoming student questions, Winston engaged with 2,000 students, sending more than 42,000 texts.

“That sounds crazy, but we make sure that students don’t receive more than one or two initial texts a week from us,” says Lee. “We’re only sending texts if something is missing.”

In most of the texts, Winston is responding to questions and follow-ups. Inquiries that require more involved responses are forwarded to staff members.

Although the institution has also implemented a few other new approaches to student readiness, freshman bill payment was up 150 percent this year over the 2017-18 academic year, while immunization compliance was up 86 percent.

Launch time

Implementation of a text-based chatbot usually begins by targeting an area of concern, such as reducing summer melt or increasing retention. Once an AI vendor is selected, a knowledge base is developed.

At Winston-Salem, the cross-departmental implementation team included financial aid, housing and wellness representatives, who vetted questions and information for the initial version of the chatbot’s knowledge base. The process usually takes six to eight weeks.

California State University, Northridge, launched a text-based chatbot for incoming students this past August to help reduce summer melt, and administrators hope it will also improve retention and increase equity.

The institution currently has a limited web-based chatbot, but the new text-based service already drives more activity, says Elizabeth Adams, the associate vice president for undergraduate studies who oversees admissions, registrar, curriculum and policy.

“For implementing a tech product, I was surprised at how easy a lift it’s been,” Adams adds.

Staff helped build the knowledge base and created student lists, while a vendor did most of the initial chatbot training.

Since launch, staff have been further training it, using an intuitive, web-based interface that allows them to track usage and view the knowledge base and individual conversations. This involves monitoring responses and providing clearer answers.

The 40,000-student university signed a one-year contract for just the incoming freshman class of nearly 5,000 students. Including initial setup, the cost was in the $100,000 range, says Adams. Ongoing maintenance costs are lower.

Providers, such as AdmitHub, Ivy.ai, Signal Vine and CourseQ, offer text-based chatbot platforms for higher ed, with service pricing depending on targets, goals and audience. Most contracts involve a flat rate that can range from $6 to $20 per student annually, which increases if customization or innovation is needed.

An integrated approach

Any AI system will make mistakes or not know the best answers to questions at first, so having an individual dedicated to monitoring conversations is necessary for updating and expanding the knowledge base.

Most systems start with a few hundred questions and answers, and quickly expand into the thousands. Some chatbots redirect inquiries to websites where information is updated by individual departments.
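At its simplest, such a knowledge base can be modeled as stored question-answer pairs matched against incoming texts. The keyword-overlap scoring below is an illustrative simplification with made-up entries; production platforms use trained language models, not this heuristic:

```python
# Illustrative knowledge base: match an incoming text to the best stored
# answer by keyword overlap (example Q&A pairs are hypothetical).
import re

KNOWLEDGE_BASE = {
    "when is tuition due": "Fall tuition is due August 1.",
    "how do i submit my transcript": "Upload transcripts via the student portal.",
    "where do i send immunization records": "Email records to the wellness center.",
}

def best_answer(text, threshold=2):
    """Return the answer whose stored question shares the most words with
    the incoming text, or None when no question overlaps enough."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    scored = [(len(words & set(q.split())), a) for q, a in KNOWLEDGE_BASE.items()]
    score, answer = max(scored)
    return answer if score >= threshold else None
```

When `best_answer` returns None, the message would fall through to the reprompt-or-escalate path described below; new questions spotted in the logs get added as fresh entries, which is how a base of a few hundred pairs grows into thousands.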

Ongoing tweaks also help the system decipher unclear texts.

For example, when Winston can’t understand a message, says Lee, “He’ll say, ‘I don’t understand. I do best with short questions. Can you ask in a different way?’” A text that’s still unclear is forwarded to an email box and answered by a staff member.
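The escalation path Lee describes, reprompt once and then hand off to a human, can be sketched as a small decision function. The confidence score and threshold here are assumptions standing in for whatever measure the vendor platform actually exposes:

```python
# Sketch of a two-step fallback: ask the student to rephrase once, then
# forward a still-unclear text to a staff email box (names are illustrative).

REPROMPT = ("I don't understand. I do best with short questions. "
            "Can you ask in a different way?")

def route_message(confidence, already_reprompted):
    """Decide what to do with an incoming text given the bot's confidence
    (0.0-1.0) in its best answer and whether we've already asked once."""
    if confidence >= 0.7:          # assumed threshold; vendors tune this
        return "answer"
    if not already_reprompted:
        return "reprompt"          # send REPROMPT and wait for a retry
    return "escalate"              # forward to the staff email box
```

The single-retry limit keeps the bot from looping on a confused student; after one failed rephrase, a person takes over.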

Implementing a text-based chatbot should start with a specific goal, and the tool should be part of a comprehensive communication strategy so it does not overlap with other platforms or repeat messages.

“This kind of system can alleviate some of the challenges and uncertainties that students are facing, but it doesn’t fix everything,” says researcher Page. “It’s another tool to use, but not the only tool.”


Ray Bendici is deputy editor of UB.