Canterbury University's bid to beat students cheating with AI

Photo: File image / Getty
By Kate Paterson

Canterbury University is looking at new ways to uncover and stop students from using artificial intelligence to cheat.

Jeanette King.
Associate Dean of Arts Jeanette King said one of the biggest challenges the university has been facing is students using AI in assessments.

“It is a bit tricky because we can’t ultimately prove that the student has used AI,” she said. 

Data obtained under the Official Information Act revealed 116 cases of academic dishonesty involving AI over the past two years in which the student could not defend against the allegation.

The university considers that number small, and the disciplinary action taken varied from case to case.

A student found to have committed academic dishonesty may be reprimanded, denied credit in any course, required to formally apologise, fined up to $500, or required to complete up to 40 hours of university community service.

Generative AI tools such as ChatGPT and other large language models (LLMs) are trained on vast amounts of data from across the internet, which they use to generate responses to prompts.

Because the technology produces an original response rather than copying an existing source, it is harder to detect in assessments than other forms of plagiarism.

But King said UC is exploring new ways to counter the misuse of AI by students.

“We have to have robust assessment procedures that try and mitigate against students being able to use it effectively.”

These measures include a return to in-person exams, requiring students to submit Word documents with tracked changes, and staff initiating open conversations with students about the technology.

A recent study in the International Journal of Educational Technology in Higher Education recommended a new model for universities when teaching with AI.

It includes providing AI literacy training and support for teachers, staff and students, all of which UC is doing.

“It has required a huge upskilling by our staff. And there’s quite a few different aspects, so it’s going to be an ongoing challenge and an aspect of our work for quite a while,” said King.

UC digital humanities lecturer Dr Geoffrey Ford said there are also telltale markers to look out for when assessing student work.

“The kinds of things I think about when I’m reading student work is sometimes the terminology. It’s generated from a model that is essentially being trained on the internet and will feature words and phrases that might be relevant internationally, but aren’t so relevant or used in the New Zealand context.”

Photo: University of Canterbury
UC’s position statement on AI use is vague, as guidelines for how it should be used differ between departments. 

It states: “UC recognises the importance of developing high-quality guidelines to mitigate risks in the areas of academic integrity, research quality, institutional reputation, and equity and inclusion for both staff and students.”

However, AI can be a helpful tool, and many students use it to aid their studies.

“We want students who can really make sense of the best use of the technology and who are making good decisions about the use of them,” said Ford.

UC student Anna (not her real name) uses generative AI to help her study and likes it because it is simple and effective to use.

“I use it in my study for things like paraphrasing sentences, to search up how I can find more information about things, for keywords, and I can use it as a grammar checker,” she told The Star.

Anna’s course has a zero-tolerance policy for AI use, but she said she’s never used the technology to cheat. 

“Knowing that it could potentially affect my grade makes me worried. But I have never used AI in a way that it would write a whole paragraph for me, and I would take it word for word and put it into my assignment,” she said.

Ford currently teaches two courses on generative AI, at first-year and master’s level.

“We’re getting students to engage with the technology directly, to learn about how to use it practically, what some of the limitations are, and what some of the issues are around using it,” he said. 

“Part of that conversation is pointing out some of the legitimate uses of this while they’re completing their studies, and so things like brainstorming, outlining, maybe putting complex ideas into more simple language to aid understanding.”

He and King agree that one of the biggest challenges when it comes to the technology is ensuring students leave university with the skills they came for.

“We want to be able to make sure that students are achieving those learning outcomes so that when they leave, they really are in the best spot for being able to use those critical thinking skills,” said King. 

Ford said working with students, rather than against them, is the best way forward.

“I think it is very difficult to start accusing students of using this. I’m trying to have open, honest conversations with students, and students are responding well to that,” he said.

“Students and educators are in this together. This is a kind of technology that is confronting to all of us, and I know students are concerned about this, as are educators.”