When evaluating academic operations, begin with better measures
In its DNA, higher education is familial and collaborative, and no members of university communities want to be in their current predicaments. As institutions look to the future, administrators face extremely tough decisions about how to best serve students and keep everyone safe as the pandemic persists.
Increasingly, trustees are asking senior staff to present budget-cut scenarios, some as drastic as 50 percent, but very few institutions have a model to evaluate operations in that way, especially programmatically. Provosts, deans and other university administrators have the unenviable task of navigating uncharted territory, but it needn’t be wholly unfamiliar. There is a fair and representative path forward, and it may even reveal areas in which to grow.
Institutions can apply the same rigor they bring to evaluating student learning outcomes to better understanding the sustainability of existing academic programs.
By adopting an “academic portfolio management” (APM) approach, which supplements existing review processes and draws disparate data together, colleges and universities can undertake this effort in a comprehensive, context-sensitive and data-informed manner. Among the advantages of a portfolio model are balancing risk and bundling assets so they can be managed collectively toward broader objectives. Institutions will secure the hard data identifying essential and exemplary programs in which to continue to invest, as well as a detailed grasp of programs they ought to consolidate or “teach out.” At least equally important, the data may suggest vital areas of need in which to expand and create new programs.
As with any data analysis you undertake, you need to manage your expectations. APM will be only as reliable as your inputs, and calculating each academic unit’s cost and yield gets complicated quickly. To disentangle the expenses and revenues associated with various courses and programs, you need to already be collecting essential data and storing and querying it appropriately. Undertaking this without those capabilities should be a nonstarter; this must be a data-informed exercise.
However, with enrollment downturns, colleges may be tempted to evaluate programs and academic operations via “program costing” (also known as “activity-based costing”), which looks at programming exclusively through a financial lens. Institutions may seek to identify the cost and revenue drivers of each academic program to understand each unit’s financial contribution. Programs that generate the least money become destined for the chopping block. I know it feels productive to take quick, decisive action, but this approach too often draws upon false premises. A misstep here can lead to worse financial outcomes. Institutions may sacrifice programs that are more valuable than they initially appear, ones that might prove to be key contributors after a thorough review.
As you lead your institutions through these trying circumstances, you can take comfort in the fact that APM mitigates the above risks. By taking a more holistic approach rather than simply evaluating dollars and cents, which has a way of obscuring true value, this exercise becomes meaningful and institutionally introspective. You’ll be emphasizing effectiveness and sustainability over mere cost reduction, all with a larger goal of self-improvement. As with student learning, APM centers on rubrics for program assessment. The one I use with institutions focuses on five key criteria and guides all conversations and calculations:
Mission: How does the program align with your mission? Higher education is mission-driven, so naturally, you must take this into consideration. Some programs that don’t generate money remain essential to your institutional mission, identity and priorities. Despite balance-sheet shortcomings, their absence may come at a greater cost than their operation. Another strong consideration is how your academic programs support diversity, equity and inclusion. Ensure that your academic portfolio includes programs that appeal to underrepresented minorities and provide pathways for social mobility.
Program demand: By looking at your application, enrollment and transfer trends, you’ll be able to assess the demand and competitive landscape for various programs. However, this data is more nuanced than simply a number of majors, since some programs—which don’t confer as many degrees or have as many majors—disproportionately contribute to students’ ability to graduate. A department may have declining declared majors, even as its course registrations, which other programs require, rise exponentially. Such departments often generate high numbers of credit hours.
Degree conferrals: How are programs contributing to degree attainment? Beyond program demand, you must weigh degree conferrals. The programs that drive demand don’t always produce degrees. Say 15 percent of students pursuing a high-profile, prestigious program change majors after the first year. That program has notable attrition but draws students and drives revenue, and students remain at the institution in other academic programs.
Career outcomes: Whether students who earn a degree can secure jobs is top of mind for students and parents as they evaluate program performance. Consider the connection between academic programs and careers. Try to tease out from your data whether programs lead to jobs immediately, as well as their outlook across careers. No single data source can provide this, so you have to triangulate the data with multiple approaches. One way to do that is to map your programs to federal, state and local Bureau of Labor Statistics data to forecast future earnings and growth potential for your graduates within their majors. You can also use data from surveys and from your advancement office or foundation to understand alumni career outcomes. Many states are developing data centers aligning educational and workforce longitudinal data, which will prove indispensable here.
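The mapping step above can be sketched in code. This is a minimal, hypothetical illustration: it assumes a program is identified by its CIP code and joined to occupations via a CIP-to-SOC crosswalk (such as the one NCES publishes). The crosswalk entries and growth figures below are hard-coded placeholders, not actual BLS data; in practice you would load the real crosswalk and the BLS employment projections files.

```python
# Illustrative sketch: joining academic programs to occupational outlook data.
# All mappings and figures below are placeholders for real crosswalk and
# BLS projections data.

# Hypothetical slice of a CIP-to-SOC crosswalk.
cip_to_soc = {
    "11.0701": ["15-1252"],  # Computer Science -> Software Developers
    "52.0301": ["13-2011"],  # Accounting -> Accountants and Auditors
}

# Hypothetical projected ten-year employment growth rate by SOC code.
projected_growth = {"15-1252": 0.25, "13-2011": 0.06}

def program_outlook(cip_code):
    """Average projected growth across the occupations a program feeds,
    or None if the program has no crosswalk entry."""
    socs = cip_to_soc.get(cip_code, [])
    if not socs:
        return None
    return sum(projected_growth.get(s, 0.0) for s in socs) / len(socs)

print(program_outlook("11.0701"))  # 0.25
```

A join like this is one leg of the triangulation; survey and advancement-office data would supply the others.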
Contribution margin analysis: What are the direct and indirect costs, and is the program in the black or the red? While this holistic model is about more than money, it is necessary to isolate each program’s contribution margin and include it in the equation. Disentangling revenue and expenses can be complicated. There are many ways to approach the effort, but at its core it is an understanding of a program’s direct revenues and costs, plus the indirect revenues and costs allocated to it. Whatever method you choose, knowing contribution margin is critical for success.
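At its simplest, the calculation described above is direct revenue minus direct costs, less an allocated share of overhead. The sketch below is one minimal way to express that; the figures and the flat allocation share are hypothetical, and real institutions use far more elaborate allocation rules.

```python
# Illustrative sketch of a program contribution margin calculation.
# Figures and the flat allocation share are hypothetical.

def contribution_margin(direct_revenue, direct_costs,
                        indirect_pool, allocation_share):
    """Direct revenue minus direct costs, less the program's
    allocated share of the institutional overhead pool."""
    allocated_overhead = indirect_pool * allocation_share
    return direct_revenue - direct_costs - allocated_overhead

# Hypothetical program: $1.2M tuition revenue, $800K direct costs,
# allocated 5% of a $4M institutional overhead pool.
margin = contribution_margin(1_200_000, 800_000, 4_000_000, 0.05)
print(f"Contribution margin: ${margin:,.0f}")  # Contribution margin: $200,000
```

The hard part in practice is not this arithmetic but deciding how the overhead pool is allocated, which is why the method chosen matters as much as the number produced.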
The key to rigorous program portfolio analysis is to consider this information in relation to all five of these criteria: mission, program demand, degree conferrals, career outcomes and contribution margin analysis. Using this rubric, say you evaluate four programs:
- Program A scores higher due to its relevancy to mission despite a lower contribution score. This is a program to sustain.
- Program B scores well on career outcomes, demand, and degree production but is neutral on mission and negative on contribution margin. This is a program in which to invest.
- Program C scores very well on contribution margin but does not tie to the institutional mission, and it lacks favorable program demand and career outcomes. You could sustain this program, or you might consider other options.
- Program D is neither accretive nor dilutive to the institution. Its applications and enrollments are limited, what demand exists is minimal, and graduates struggle to find jobs. This is a program that requires discussion.
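A rubric like the one above can be tallied very simply. The sketch below scores each criterion on a hypothetical −2 to +2 scale, with values chosen only to mirror the four example programs; a real rubric would use institution-specific scales and, likely, unequal weights.

```python
# Illustrative rubric tally. Scale (-2 to +2) and all scores are
# hypothetical, chosen to mirror the four example programs.

CRITERIA = ["mission", "demand", "conferrals", "careers", "margin"]

programs = {
    "A": {"mission": 2, "demand": 1, "conferrals": 1, "careers": 1, "margin": -1},
    "B": {"mission": 0, "demand": 2, "conferrals": 2, "careers": 2, "margin": -1},
    "C": {"mission": -1, "demand": -1, "conferrals": 0, "careers": -1, "margin": 2},
    "D": {"mission": 0, "demand": -1, "conferrals": -1, "careers": -2, "margin": 0},
}

def total_score(scores, weights=None):
    """Weighted sum across the five criteria (equal weights by default)."""
    weights = weights or {c: 1.0 for c in CRITERIA}
    return sum(weights[c] * scores[c] for c in CRITERIA)

# Rank programs from strongest to weakest overall score.
for name, scores in sorted(programs.items(),
                           key=lambda kv: total_score(kv[1]), reverse=True):
    print(f"Program {name}: {total_score(scores):+.1f}")
```

Note that the totals alone don’t dictate the decision; as the examples show, a program with a middling total can still be one to invest in once you look at which criteria drive its score.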
By collecting and organizing all of this information into a rubric, and ideally illustrating the results with clear charts and scatter plots, you will have transparency with your faculty and facilitate meaningful conversations about the path forward. It’s also firmly grounded in the data, which makes it a fair process.
Presidents and provosts are in the fraught position of having to make hard choices, and being on the receiving end of appeals from colleagues they care for personally and professionally makes an already difficult situation even more daunting. Taking a long-term view that sees this process as a purpose-driven endeavor and follows a clear, holistic and data-informed approach that evaluates all programs equally helps ease this burden. Further, by managing programs proactively, institutions will be able to not merely cut but grow by identifying and acting on opportunities to better serve their communities and move from a conversation about cost to a dialogue on institutional sustainability and success.
Darren Catalano is the former vice president of analytics at the University of Maryland Global Campus. He currently serves as CEO of HelioCampus, a data analytics company born out of the University System of Maryland that provides strategic decision support and insights to colleges and universities to help them grow resources, maximize returns of academic programs and optimize administrative spend.