End Note

The Problem with Standardized Assessment
By David Eubanks

By now we've all heard that the Commission on the Future of Higher Education may recommend standardized tests as a way to compare and rank institutions. Such tests would likely attempt to measure general reasoning and communication skills. The commission's intention is undoubtedly good, but can such an endeavor be successful?

To create a valid test, one has to know what questions it will answer. Perhaps we want to measure critical thinking and effective writing, since those are essential job skills. But can you really evaluate the many varieties of critical thinking with a single instrument? Is writing a computer program similar enough to analyzing Homer that both can be measured in 30 minutes with a number-two pencil?

At Coker College (S.C.), we addressed this problem with a Faculty Assessment of Core Skills, which evaluates analytical and creative thinking, effective speaking, and writing. Assessing how well a student thinks is complex and subjective. So we aggregate the opinions of all course instructors. Validity is checked against grades, portfolios, and even library circulation history.

While this method works well for a small liberal arts college, it would not be a candidate for a national metric of achievement. It does, however, show that a single external definition of a skill like critical thinking is both undesirable and unnecessary. It's proper for the government to define "legally drunk," but not "legally intelligent."

Aside from the problematic specifics of a national test, its very existence would create trouble. Suppose this hypothetical institutional report card were used to allocate or restrict federal aid in the name of accountability. Institutions would try to maximize their scores by whatever clever means they could devise, with huge incentives for test preparation and outright cheating. As for students, the test would wind up measuring enthusiasm for a "not-for-a-grade-while-I-could-be-playing-my-Xbox" test.

Small colleges like Coker would have to figure out quickly how best to prepare and motivate students. We would be at a competitive disadvantage; richer institutions could simply buy the "solutions" that the testing industry would eagerly offer.

There are many more problems with the standardized test approach. What would a parent make of the artificial rankings that result? What most families care about is whether or not Johnny can get a good job after graduation. This echoes the commission's own rationale for wanting more accountability in higher ed. Why not skip the test and its validity problems and cut to the chase? State and federal officials could give a pretty good answer to Johnny's question by using existing data. They could:

Match up financial aid records (to identify institutions and students) with tax information (to find earnings history) and aggregate to create a comparison between institutions and the earning power of their graduates.

Publish the results by school, major, and selected demographics.

The demographics would allow re-norming by the socioeconomic group of the student or other variables. In this way, market forces would signal needed changes: better-informed consumers would create a more efficient higher-education marketplace without heavy-handed federal intervention.
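To make the proposal concrete, here is a minimal sketch of the kind of matching and aggregation described above, written in Python with the pandas library. The file names, column names, and the five-years-after-graduation window are purely illustrative assumptions, not actual federal data formats, and real aid and tax records would of course require privacy-preserving handling.

import pandas as pd

# Hypothetical extract of financial aid records: one row per graduate,
# identifying the institution and major (file and column names are made up).
aid = pd.read_csv("aid_records.csv")       # student_id, institution, major, grad_year

# Hypothetical earnings history drawn from tax data, keyed the same way.
earnings = pd.read_csv("earnings.csv")     # student_id, tax_year, wages

# Match graduates to their reported earnings, here five years after graduation.
matched = aid.merge(earnings, on="student_id")
five_years_out = matched[matched["tax_year"] == matched["grad_year"] + 5]

# Aggregate to compare institutions and majors by graduates' earning power.
report = (
    five_years_out
    .groupby(["institution", "major"])["wages"]
    .agg(median_wages="median", graduates="count")
    .reset_index()
    .sort_values("median_wages", ascending=False)
)

print(report.head())

A table like this one, published by school, major, and demographic group, is exactly what would let a family compare, say, the computer science programs in their state.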

Because institutions would not be involved in the assessment, they would have limited ability to artificially change the results. Moreover, the numbers could give meaningful answers to typical questions families have, such as:

"Are private school costs worthwhile?"

"Which is the most valuable computer science degree in my state?"

"If I major in dance, how long might it take me to repay my loans?"

"Is the rising cost of tuition compensated for by higher after-graduation salary?"

Imagine trying to answer those questions based on test score averages.

There's no need for the federal government to find a common denominator between FORTRAN and gray-eyed Athena. Officials already have all the historical data they need to answer questions about the relative value of higher ed. All they have to do is open the books.

David Eubanks is director of Planning, Assessment, and Information Services at Coker College. He blogs about higher education at highered.blogspot.com.

