ChatGPT, AI policies—and international students?

Jonathan Summers
Jonathan Summers is director of International Student Services at Cedar Crest College in Allentown, Pennsylvania.

ChatGPT and artificial intelligence are now part of the common vocabulary of higher education. Faculty can use them for lesson planning and quiz creation, and students, when they use the tools correctly, can draw on them for research and language development to support their success. I have heard from language teachers who use ChatGPT with their ESL students and report that it has had a positive impact on their learning.

On the flip side, admissions staff at colleges and universities across the country have seen countless ChatGPT-generated college essays, which at the very least compromises the legitimacy of the applicant’s story in their eyes. On many campuses, college writing assignments, research papers and all other forms of academic work are being flagged and submitted to faculty for review.

In my experience, international students are often more likely to be suspected of academic dishonesty, especially when they write well. That statement might sound controversial, and it is. But it’s what I have observed over the 18 years I have been involved in international higher education. Phrases such as “English is not their first language, so it can’t possibly be their work” and “This is just written too well” are not uncommon.

Sometimes these suspicions are correct, but more often they are not. That’s why it’s important to separate the genuine problem of students misusing ChatGPT from the assumption that international students are the ones doing it. That assumption is a generalization and, as generalizations often go, it tends to be applied both quickly and unfairly.

The problem is not the technology itself. It’s that many institutions of higher education do not have clear policies on ChatGPT and AI.

If faculty and staff are using the technology themselves, how can we police how students use it? Is this a gray area, and if so, how far does it extend? When does using ChatGPT cross the line into plagiarism? We also need to consider the short- and long-term effects of a student’s reliance on, or use of, this technology during their educational journey.


Many of us who work in higher ed recently read the widely shared article about a student who, during his graduation speech, thanked ChatGPT for helping him graduate. The audacity. Or was he simply using what was available to him?

Moving forward, we must start looking at how institutions can get ahead of ChatGPT. In my field of international student education, ChatGPT should be discussed as part of international student orientation. That’s what we’re doing at my college. The trick is to introduce the technology in a way that focuses on how to use it correctly.

I encourage others who work with international students to have these conversations on their campuses. But with the technology still so new, how do we do that effectively? From an academic standpoint, faculty should address ChatGPT and AI in their classes and on their syllabi. Open discussions among colleagues and with academic leadership, on campuses large and small, about establishing clear policies are essential.

ChatGPT workshops and educational resources need to be introduced throughout a student’s educational journey. After all, this is not a “one-and-done” arrangement.

Institutions of higher education have a wealth of technologies and platforms to aid learning. Using ChatGPT and AI for good is a priority. So it is essential that each institution put together a comprehensive approach to ChatGPT: how it is introduced to students, how it is promoted, how it is discouraged and what the ramifications for misuse are.

Institutions should determine the best way to establish an institutional policy on ChatGPT and AI so that individual faculty members don’t each have to create their own set of rules. Perhaps there can be some flexibility in that, but we don’t want faculty caught between a ChatGPT rock and a hard place.

The bottom line is that more conversations need to happen on this topic, even though it can be an intimidating one as we all seek to better understand the technology. If we don’t address it, it might get away from us—and isn’t that what so many in higher education and beyond are really afraid of?

