Getting the SENSE of Students

New survey addresses crucial first weeks

Community college leaders already know to expect that some students will drop out in the first few weeks of the semester. The Survey of Entering Student Engagement (SENSE) can help them figure out why.

“We lose nearly 50 percent of community college students nationwide before their second year,” says Angela Oriano-Darnall, assistant director for the SENSE survey, which is in its second year and is administered by the Community College Leadership Program (CCLP) at The University of Texas at Austin. “It is imperative that we learn what is happening early in their career.” 

SENSE is a paper survey completed by students during the fourth and fifth weeks of the fall semester, when they are still getting their feet wet, as opposed to the Community College Survey of Student Engagement (CCSSE), which is administered in the spring after students have already shown some persistence. The CCLP launched CCSSE six years ago.

The new survey grew out of the CCLP’s work on the established survey as well as from Achieving the Dream, a national multiyear initiative to help more community college students succeed. “We know that engagement is related to student success,” explains Director Kay McClenney. “We also know that certain student groups are ‘high risk’ because they have a low likelihood of persisting.” Research has shown that those high-risk students, often first-generation or minority students, are more engaged than other groups, and it is these highly engaged, high-risk students who make up the survey pool for SENSE, which focuses on developmental courses.

People are “stunned” when they find out high-risk students are making such an effort, says Arleen Arnsparger, the project manager for the MetLife Foundation’s Initiative on Student Success. The MetLife Foundation provides funding for the survey, along with the Houston Endowment and the Lumina Foundation for Education.

Colleges have programs in place ranging from student success courses to mandatory advising that are meant to address the many reasons students drop out. The survey can help administrators assess the effectiveness of an established program, while also highlighting intangibles, such as first impressions of campus. The data can reveal differences between programs and how students perceive them. “Oftentimes colleges think they have policies in place for assessment and placement, but in reality it is very different,” Oriano-Darnall says.

A key finding so far, which is supported by CCSSE data, is the importance of making connections with other students and college personnel early in a student’s career. Institutions encourage faculty to brainstorm ideas about connecting better on the first day of class. “Another area that emerged very clearly is that students don’t know what they don’t know,” Arnsparger says. This is especially a pain point for first-generation students who might not have anyone to help them navigate a college system. “Even people who work with community colleges fall into [this assumption],” says Oriano-Darnall.

Administrators at Durham Technical Community College (N.C.) participated in both the pilot and the field test. “I asked Angela if we could get our data early,” says Tom Jaynes, dean of student development. His team made significant changes to the orientation program based on data from last year’s survey, and they are eager to see its impact.

When the 2007 pilot results revealed that only 20 percent of students either went to orientation or knew it existed, Jaynes was surprised. “We were putting a lot of energy into creating the orientation program.” Officials already knew attendance was low, but the SENSE data highlighted the problem and also prompted changes.

Oriano-Darnall often sees this happening at participating schools. “We do know what students need, but we need the facts to support moving in a different direction,” she says. “Sometimes the anecdotal evidence is not good enough.”

Since orientation is not required at DTCC, administrators made it more appealing by promising to have students’ applications processed by the end of the 90-minute program, something that normally takes two weeks. The program content remains the same, but now the first 15 minutes are spent filling out the application, after which students learn about topics such as financial aid and take a campus tour. When they return from the tour, they receive their student IDs and are ready for placement testing.

Jaynes calls the attendance change “dramatic,” from about 200 students to 743. An entering class can include up to 900 students, but orientation is geared toward brand-new students (those submitting an application for the first time).

The program also scored high for learning outcomes such as knowing about the college and being able to calculate the cost of attendance. “Now the pressure on SENSE is to see our numbers from orientation after the changes,” he says.

Another area Jaynes identified for improvement is advising. All students are required to see an advisor before registering for classes, but students don’t perceive those meetings as advising. Administrators are investigating different models to address the issue.

Jaynes says DTCC will probably participate in both CCSSE and SENSE this year. “As student development professionals we have lived in Oz for too long,” he says. “What I mean is, we do lots of great things, but we don’t know if they are effective. SENSE pulls back the curtain and lets you see what is really going on.” DTCC loses 30 percent of students after their first semester of enrollment, and the survey can help address the issue.

“We are providing colleges with critical information in those early areas of student experience and ... with actionable data,” Oriano-Darnall says.

McClenney says enrollment for the next CCSSE administration is close to last year’s level, and she is hearing of some colleges considering alternating participation in the two surveys. South Texas College is one of those institutions.

Brenda Cole, director of Research and Analytical Services, confirms that the college participated in the 2008 SENSE field test. “Within our own institution we’ve realized that there are things that need to be asked of students immediately in the first few weeks of school,” she says. STC administrators created their own Opening Weeks survey in 2006, which addressed many of the same issues as SENSE, so joining the larger program was a logical decision.

Administrators realized that many students who drop out in the first few weeks do so over issues, such as unanswered questions or financial arrangements, that staff might have been able to resolve had they been aware of them. “Your first option to resolve issues is to understand the big causes,” Cole points out. “This survey will help us see the smaller things too.” She adds that participating in CCSSE has also been useful. STC officials are now working on longitudinal trend analysis.

Both surveys offer support resources on utilizing data. “There is an incredible level of consistency across all groups. Students know what will help them succeed,” Arnsparger says.

In addition to the survey, Arnsparger has been conducting campus focus groups to obtain qualitative information to support the survey data. “Our greatest opportunity in helping the largest number of students succeed is engaging them at the front end,” she concludes. “That’s where our greatest attrition is, that’s where our greatest opportunity for success is.”
