3 rules for using AI tools in campus business processes

How higher ed can move from strategy to practice with artificial intelligence

As enrollment at Penn State World Campus increased, academic advisors raced to keep up. They needed support to manage more than 50 processes, from assisting students with course selection to handling requests to change majors, nearly all of them handled manually.

Artificial intelligence, administrators determined, could be part of the solution.

“We decided to look at ways we could scale with increasing enrollments,” says Dawn Coder, director of academic advising and student disability services. “We started experimenting with AI to see how technology could support us and better service our students more individually, successfully and timely, while maintaining quality.”

AI sped up response time; now advisors spend less time crafting emails and looking up information in files and have more time to meet with students.

The market for AI in higher education is expected to grow by almost 48% by 2022, according to a 2018 report by market research company Technavio. While teaching and learning with AI-powered software has attracted the attention of higher ed administrators, they are also investing significant resources in chatbots, machine learning and biometrics to boost efficiencies in their business operations.

“The end-to-end change that needs to happen to use digital tools like AI to improve back-office functions and make processes flow a little bit better is not a simple thing to take on,” says Justin Klutka, vice president of product technology at Wiley Education Services, a tech services provider. “Schools have to figure out their goals and how AI fits into their strategy before implementing these tools.”

Here are three rules to help AI make a difference in business processes.

1. Emphasize efficiencies

The academic advising department at Penn State World Campus started experimenting with AI this summer to create boilerplate emails to respond to common questions in core areas such as deferring a semester, reenrolling or changing campuses. The “smart responses” help streamline operations, reducing the amount of administrative time spent answering recurring inquiries.

“To provide students with accurate information, we have to go into [our current] student information system and click on several different screens to gather the information we need,” Coder says.

What to do before making an investment in AI

Assess options. Artificial intelligence has the potential to address a number of business issues, so it’s important to establish specific goals. Dawn Coder, director of academic advising and student disability services for Penn State World Campus, believes that lacking a clear vision can make it harder to see the value of AI. “Part of the reason we’ve been so successful,” she says, “is because we did process mapping and identified areas where we thought things could be automated for us,” while also providing more positive student interactions.

Ask questions. Technology providers may not have experience working with higher ed institutions, and your team may lack sufficient expertise in AI to marry institutional needs and available technology options, Coder points out. “The biggest challenge for us was looking at what was available and connecting that to our business need, and then asking questions to overcome the unknown.”

Understand cultural issues. Rolling out AI tools for business processes can be challenging because faculty and staff may not be as comfortable with the tools as students are, says Curtis Carver, CIO of The University of Alabama at Birmingham. “We started with systems that were much simpler, much safer and had a much lower risk profile to augment what overworked faculty were already doing.”

The staff trained the AI tool to retrieve information from the database, analyze it and present a summary in the user interface, cutting the process time from 15 minutes for an academic advisor to just a few seconds. The tool is expected to launch departmentwide in 2020. In the meantime, much of the work is happening behind the scenes, with academic advisors conducting quality assurance reviews to ensure the AI’s responses are as accurate as their own (and more efficient).
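To make the lookup-and-summarize idea concrete, here is a minimal sketch of what such a “smart response” helper might look like, assuming a relational student-records database. The table and column names (student_records, enrollment_status and so on) are hypothetical and do not reflect Penn State’s actual student information system.

```python
# Hypothetical sketch of an advising "smart response" helper: gather the
# fields an advisor would otherwise click through several screens to find,
# then return a single summary string. Table and column names are invented.
import sqlite3

def summarize_student(db_path: str, student_id: str) -> str:
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            """
            SELECT name, enrollment_status, declared_major, credits_completed
            FROM student_records
            WHERE student_id = ?
            """,
            (student_id,),
        ).fetchone()
    finally:
        conn.close()

    if row is None:
        return f"No record found for student {student_id}."

    name, status, major, credits = row
    # Assemble the boilerplate an advisor can paste into an email reply.
    return (
        f"{name} is currently {status}, majoring in {major}, "
        f"with {credits} credits completed."
    )

# Example: print(summarize_student("students.db", "PSU-12345"))
```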

Boosting efficiencies was also top of mind when officials at The University of Alabama at Birmingham began embracing AI. The first initiatives, introduced in 2017, triggered automatic notifications to students who were not attending classes or were at risk of failing; the campus is now piloting a chatbot to respond to frequently asked student and staff questions.

Curtis Carver, vice president of information technology and chief information officer at UAB, explains that staff can now avoid navigating through multiple screens in the current administration system to access a pay stub, check available vacation time or perform other human resources functions. Instead, they can simply tell the chatbot: “Show me my pay stub.”
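As a rough illustration of how a chatbot can shortcut those multi-screen lookups, the sketch below routes a typed request to a hypothetical HR lookup. The trigger phrases and handler functions (get_pay_stub, get_vacation_balance) are invented for illustration and are not UAB’s actual system.

```python
# Hypothetical sketch of keyword-based intent routing for an HR chatbot.
# A production chatbot would use a natural-language platform; the HR
# lookup functions below are invented placeholders.

def get_pay_stub(user: str) -> str:
    return f"Latest pay stub for {user}: [link to document]"

def get_vacation_balance(user: str) -> str:
    return f"{user} has 12.5 vacation days available."

# Map trigger phrases to the handler that answers them.
INTENTS = {
    "pay stub": get_pay_stub,
    "vacation": get_vacation_balance,
}

def handle_message(user: str, message: str) -> str:
    text = message.lower()
    for phrase, handler in INTENTS.items():
        if phrase in text:
            return handler(user)
    return "Sorry, I don't know how to help with that yet."

# Example: print(handle_message("jsmith", "Show me my pay stub"))
```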


Also read: Curtis Carver on putting students and faculty first, including by using AI, in UB Tech® keynote


“It lets people solve problems and move forward much more efficiently,” says Carver, who covered AI as part of his keynote address at UB Tech® 2019. “We’re all so busy, and right now we’re not serving the information in the right format to help people do the right thing.”

2. Prioritize privacy

Allowing a chatbot (or other AI tool) to gather and analyze mass quantities of information—especially sensitive information such as student and staff records—raises privacy concerns. Almost half of U.S. adult respondents to a 2018 survey conducted by The Brookings Institution believed AI would reduce their personal privacy.

To address those concerns, UAB published new privacy standards that cover information on websites, data subject to the European Union’s General Data Protection Regulation (GDPR), and data related to children 13 and under (who attend camps or other campus activities). Carver expects to publish an additional privacy standard covering non-GDPR data later this year.


Also read: Chatbots infiltrate campus life


“We are, like many other universities, mapping where the data is in our systems,” he says. “We’ve made a good start in looking at privacy as it relates to the AI systems, [but] it’s going to take us two to three years to identify all of the IT systems that have privacy-related data and then create a mechanism to remove that data if the user requests it.”

A data protection officer was appointed to collaborate with the chief information security officer on implementing the privacy standards and mapping where that data resides in the university’s systems.

3. Plan for the future

Information security is a big concern for Dartmouth College officials. Mitchel Davis, vice president and CIO, has started exploring ways to use AI to make business processes more secure. In 2018, he deployed a cloud-based wireless AI solution that monitors all computers on the network for abnormalities, including malware, that could put information at risk. If problems are identified, the tool fixes them.

“AI can respond a million times faster than a person,” Davis says. “It solves problems for us before we even notice them. It makes our network much more dependable.”

The technology also has the potential to improve information security. Davis is in talks with providers about how their tools can monitor standard computer operations and flag irregularities. For example, if an admissions officer, who regularly uses a desktop computer to access test scores, enter information into student files, schedule tours and book travel, attempts to log in to a secure server containing financial data, a behavior-based AI tool could send a notification to a predetermined campus contact or shut down the computer.
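To make that scenario concrete, here is a minimal sketch of behavior-based access flagging, assuming each user has a recorded baseline of the systems they normally touch. Names such as typical_systems and notify_security are hypothetical and not Dartmouth’s actual tooling.

```python
# Minimal sketch of behavior-based access flagging: compare each access
# attempt against the resources a user normally touches and alert on
# anything outside that baseline. All names here are hypothetical.
from typing import Dict, Set

# Baseline of systems each user routinely accesses (in practice this
# would be learned from historical activity logs).
typical_systems: Dict[str, Set[str]] = {
    "admissions_officer_01": {
        "test_scores", "student_files", "tour_scheduler", "travel_booking",
    },
}

def notify_security(user: str, system: str) -> None:
    # Placeholder for alerting a predetermined campus contact.
    print(f"ALERT: {user} attempted unusual access to {system}")

def check_access(user: str, system: str) -> bool:
    """Return True if the access looks normal, False if it was flagged."""
    baseline = typical_systems.get(user, set())
    if system in baseline:
        return True
    notify_security(user, system)
    return False

# Example: an admissions officer trying to reach a financial-data server.
check_access("admissions_officer_01", "finance_server")
```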


Related: Higher ed tech providers on AI in business process



Davis believes next-generation AI tools that prioritize deterrents provide tremendous value. “AI and security go hand in hand,” he says.

At Penn State World Campus, Coder is also looking toward the future. As department staffers monitor key performance indicators to determine if their current AI tool is successful, she hopes additional tools can be deployed to take over more business functions. “If we gain confidence [in the tool], if it is accurate and if it shows us that we are really saving time for our academic advisors and freeing up time so that we can work more closely with students, we would consider it a success,” she says. And that success will lead to this question: “What else can we do with AI?”

Jodi Helmer, a North Carolina-based writer, is a frequent contributor to UB.