Fifteen years ago, the Obama administration and philanthropic foundations encouraged more Americans to get a college degree. Remedial classes were a big barrier. Two-thirds of community college students and 40% of four-year college students weren’t academically prepared for college-level work and were forced to take prerequisite “developmental” courses that didn’t earn them college credits. Many never progressed to college-level courses; they racked up student loan debt and dropped out. Press reports, including my own, called it a “remedial ed trap.”
One controversial but popular solution was to eliminate these prerequisite classes and let weaker students proceed straight to college-level courses. These are called “corequisite courses” because remedial support is delivered at the same time, alongside the college-level material. In recent years, more than 20 states, from California to Florida, have either replaced remedial classes at their public colleges with corequisites or given students a choice between the two.
In 2015, Tennessee’s public colleges were among the first higher education institutions to eliminate stand-alone remedial courses. Researchers at the University of Delaware conducted a 10-year analysis of how almost 100,000 students fared before and after the new policy, and their draft paper was made public earlier this year. It has not yet been published in a peer-reviewed journal and may still be revised, but it is the first longer-term study to track college degree completion for tens of thousands of students who have taken corequisites, and it found that the new supports haven’t worked as well as many hoped, especially for lower-achieving students.
Read more at KQED.