On Friday, July 17, in an article on his website titled “USS University,” Scott Galloway, a professor of marketing at New York University, published a list of 89 colleges that would perish and 129 colleges that would struggle in the coming months. Galloway, whose post was summarized in a UB article, based his conclusions on a formula that attempts to measure 1) a college’s value for its cost and 2) its vulnerability to COVID-19-related factors. Within hours, the Twittersphere was hopping with comments on his new post. I shared his results on my LinkedIn page, and they received more views than anything I have posted in the last year.
We all like easy answers to complex problems. One of those problems is figuring out who and what will survive COVID. Most of our attention goes to the people who are most likely to be fine (healthy and under 65 years old) or not (over 65 with additional health issues). The survival of entire industries is also receiving serious attention: package delivery companies are getting rich while restaurants without takeout are barely making it. But what about universities?
Galloway plays on a recent analogy comparing colleges to cruise ships. Both venues bring large numbers of fun-loving people together to socialize and let loose for a period of time before returning to reality (a number of weeks in one case, years in the other). The CDC had a “no sail order” in place for cruise ships through July 24, 2020; we can imagine how that industry is doing.
Because the college “cruise ships” (I realize colleges are much more than this) have not left the dock yet, there is still much hope that the condition of their passengers and the ship rules will prevent any similar CDC order. Scott Galloway has made his best first attempt at identifying which ships prospective guests may want to avoid boarding.
At the risk of overwhelming readers, it is important to know what factors Galloway uses to make his predictions. Galloway describes his two big variables as I) value and II) vulnerability. A college’s value is based on three factors: A) Credential, B) Experience and C) Education.
The Credential factor is based on a) the college’s undergraduate admit rate, b) the number of times the college’s name is searched on Google, and c) the institution’s rank in U.S. News. It is worth noting that the U.S. News rank is itself based on 15 factors of varying weights.
The Experience factor is based primarily on millions of student responses to surveys from Niche.com on the culture of their campuses. The main factors in Niche.com’s rankings can be found in Appendix 1.
The Education factor is rooted in the return on investment of a degree 15 and 30 years out, expressed as net present value (i.e. how much a sum of money received in the future is worth today, accounting for costs, future earnings, and the length of time it would take to earn a certain amount over a fixed horizon). The third component of the Education factor is the amount of money the institution spends on instruction per full-time student.

To produce a value score, Galloway rescales each of the three factors (Credential, Experience, Education) to a 0–1 range, multiplies them together, and divides the product by the net price to students (i.e. the average price a student at the college actually pays, also rescaled to 0–1).
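The value calculation described above can be sketched in a few lines of code. This is a minimal illustration of the described arithmetic, not Galloway’s actual implementation: the min-max rescaling method and the example numbers are my assumptions.

```python
def rescale(values):
    """Min-max rescale a list of raw scores to the 0-1 range (assumed method)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def value_score(credential, experience, education, net_price):
    """Value = (Credential x Experience x Education) / net price,
    where each input has already been rescaled to 0-1."""
    return (credential * experience * education) / net_price

# Hypothetical college with mid-range normalized scores:
score = value_score(credential=0.6, experience=0.7, education=0.5, net_price=0.4)
print(round(score, 3))  # 0.525
```

Note how the multiplicative form punishes weakness in any single factor: a near-zero Credential score drags the whole value score toward zero regardless of Experience or Education.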
Galloway’s other measure of a university’s likely survival is its vulnerability to COVID. This measure is based on two numbers: endowment per student and the percentage of international students at the institution. He probably selected endowment per student because it is a respected measure of a college’s savings, from which it can draw during challenging financial times. He uses the percentage of international students because they often pay close to the full listed price of tuition, producing major revenue that would be at risk during COVID.
Galloway wisely suggests that he may have left out a number of variables that could improve his rating of colleges’ likelihood of survival. He mentions potential value coming from the percentage of commuting students or colleges with hospitals.
Galloway also acknowledges that one of his main goals is simply to get the discussion going about what is going to happen to colleges. He welcomed feedback.
What is the value of Galloway’s research?
I don’t think we can say that Scott Galloway is not making an informed attempt at what many of us are trying to figure out. Galloway knows the value of, and uses, the Integrated Postsecondary Education Data System (IPEDS), along with the fifteen factors that make up the U.S. News rankings and the ten factors in Niche.com’s rankings.
A number of the factors Galloway uses have previously been linked to institutional instability. Endowment per full-time student is a commonly used measure of institutional strength. Admissions selectivity, although recently removed from the U.S. News rankings, has been used in several reports to signal institutional instability. First-to-second-year student retention has a strong correlation with institutional success. Instructional expenses per full-time student demonstrate how much of an investment a college is making in the education of its students; lower amounts on this measure have been correlated with institutions that close.
Although the number of international students would in the past have been used only as a positive indicator of an institution’s strength, Galloway is one of the first to treat the percentage of international students as a risk to an institution.
We have struggled for centuries to figure out how to describe a student life culture in any meaningful quantitative manner. Galloway attempts to put some value on the culture of the college and surrounding community by using data from millions of student feedback points collected by Niche.com.
What should we question in Galloway’s analysis?
While Galloway is transparent in providing his methodology, he does not explain in much detail why he selected the items or assigned them their particular weights. I am not a skilled data scientist; I am more a fan and proponent of the new ways of intelligently using the information we have. Still, several of his selections deserve to be called into question.
I will probably be one of thousands of critics of Galloway’s decision to include student rankings of the party scene in the likelihood of an institution’s survival. While an active social life can be important to students, I am not aware of any correlation between the amount of alcohol consumed (the score is linked to the proximity of bars to campus) and the survival of a university. This is particularly evident when we examine fast-growing adult-focused institutions, which are growing faster than most institutions serving traditional-age college students. Does the almost entirely online Southern New Hampshire University’s party scene put it in trouble? I am not saying the party scene is unimportant to a segment of students, but making it a factor in a university’s potential demise seems excessive.
I am way out of my league in understanding search engine optimization and the use of Google’s keyword planner. Galloway uses the latter as a third of his score for a college’s Credential factor. I have never seen this measure used to evaluate colleges, though I am open to understanding it more. It strikes me as a creative quantitative proxy for an institution’s popularity, but it also concerns me that this is the first time I have ever seen the criterion used.
Galloway’s incorporation of the return on investment of colleges, labeled net present value, is welcome. These data were published for the first time in fall 2019 and have generated mostly positive attention. Their primary drawback is one most ranking systems struggle with: the inability to disaggregate students attending elite institutions from students attending less well-known colleges. Students without established privilege would likely not be as successful unless the high-return-on-investment institutions are intentionally structured to provide them additional assistance.
Last but not least, it would help if Galloway had explained why he weighted his various factors as he did. For example, admissions selectivity and Google search volume are each worth the same amount as the U.S. News ranking, which itself comprises fifteen different factors, including graduation rates. U.S. News had reduced admissions selectivity to less than 5% of an institution’s ranking before eliminating it entirely last year. Galloway not only brings admissions selectivity back, but makes it one-third of a college’s Credential score, equating that one measure with the sum of all fifteen U.S. News ranking factors.
What is missing from Galloway’s analysis?
Because the fall of a college is a rare phenomenon, the examples from the past several decades have been thoroughly studied in attempts to provide greater predictability looking forward. In fall 2019 I led an independent study on the topic of institutional instability. Kerri Bond, a hard-working doctoral student at Baylor, worked with me to create a list of all the factors that published articles and reports had linked to institutional closure. As already noted with regard to Galloway’s factors, many of the factors we identified were correlated with university closures but were rarely identified as causes of those closures.
In comparing the list we came up with last fall to Galloway’s factors, there was some overlap. For example, endowment per student and instructional costs per student were both commonly mentioned in many reports, along with student retention rates and net price.
However, many of the factors on our list were left out of Galloway’s analysis. Galloway might respond by indicating that many of these measures are not maintained in easily accessible locations on a national level. In most cases, that is probably true. On the other hand, making a prediction about a college’s closure is significant enough that a thorough review of the potential factors is warranted.
Examples of data that could have been located but were not included: institutional and student debt, fundraising and grants, budget cuts, leadership experience, online course offerings, changes in student enrollment, and institutional type.
Scott Galloway was recently described in the Chronicle of Higher Education as “Higher Education’s Prickliest Pundit.” His USS University article was true to that descriptor. Depending on how you look at it, Galloway has either catalyzed an important conversation or needlessly scared and offended hundreds of colleges. Both are probably true.
In 2015, Sweet Briar College’s board announced the college would close that summer. It still had $80 million in its endowment at the time and a beautiful campus in the hills of the Blue Ridge Mountains. Unfortunately, the college did not want to co-educate, it was geographically isolated, and its enrollment numbers and yield rates were declining. None of these factors is part of Galloway’s predictions.
However, when Sweet Briar made the announcement, its alumnae rose up in opposition and reversed the decision. The little women’s college still exists today, albeit small and struggling financially. One of the lessons taught to me by a former mentor, who happened to have served on Sweet Briar’s board and had a hand in encouraging the closure, is that it is almost impossible to kill a college. There are many examples of colleges that were expected to disappear (e.g. Antioch) but did not, because their alumni would not let it happen.
I share this because one thing Scott Galloway may deserve credit for is that the shock he just gave several hundred universities may spur a push for survival that would not have occurred without his conjecture. In short, I predict the colleges on the “Perish” list are now going to take more urgent and informed steps to prevent their demise. They may be the lucky ones who just received a shot of the virus in the form of a vaccine that will ultimately strengthen them for the coming months. Let’s certainly hope so.
Since stepping down from his role as Dean at Baylor University in Texas in 2019, Doyle has worked in Institutional Effectiveness (IE) at Baylor. In that role, he assists all the administrative and student support departments (>30) in their efforts to strategically plan, assess progress and take action. Doyle also teaches undergraduate and graduate courses on organizational behavior, leadership and higher education management. His blog is Deep Thoughts on Higher Education.