Schools abandon some homework after rise in AI cheating

Photo: Getty Images
By John Gerritsen of RNZ

Some schools are abandoning take-home essays and assignments because students are using artificial intelligence to cheat.

The Ministry of Education said the risk of AI misuse was one of the reasons reports were being dropped as a form of assessment for externally-assessed level one NCEA standards next year.

Auckland English teacher Kit Willett said his school made a big change after about 15-20 percent of students used AI to cheat on assessments early last year.

"We've reverted back to hand-writing for a lot of assessments and sometimes that's a hand-written first draft and then they can go and type and edit but we've still got access to that student's authentic work that we have verified as authentic because they've completed it under our supervision without access to technology," he said.

Willett said the change had a big impact, with detected plagiarism dropping to about five percent this year.

The ministry said other schools had done the same, giving up on take-home essays and assignments as a form of assessment or quizzing students to check they understood the work they submitted.

The head of science at Wellington's Saint Patrick's College, Doug Walker, said teachers were rethinking how to assess their students.

"It's really tricky to be able to do an essay now and I feel that within my own department we are jumping through hoops trying to create a situation where a student can research something in a controlled environment so that the we're not allowing them to use AI so they can't fall prey to plagiarism," he said.

"Anytime you bring it up there's a room full of groans when you're asking about AI, because teachers across the board are dealing with the same sorts of challenges. And while it is a very powerful tool, there's an opportunity to misuse it and that's where we're hitting problems."

Onslow College deputy principal Michael Bangma said that while some people criticised the NCEA qualification, most secondary schools were fortunate to use it because it allowed them to assess students in a wide range of ways.

"Why can't a student show understanding of something through a conversation or a slideshow or in a group setting, rather than just having to do a report or an exam or a test," he said.

"We've gone down the path of normalising assessments via conversation if that's what a student prefers. So if a student has to do a report about climate change and some of its impacts, then if they want to have a conversation for five minutes and I record that conversation, then that could be the evidence for the assessment."

Susana Tomaz, a Westlake Girls' High School teacher and fellow with Unesco's International Research Centre on Artificial Intelligence, said AI was advancing fast and schools had no choice but to move with the times.

"It is just getting better at a ridiculous pace. AI detectors do not work and there's a lot of false positives ... Second-language students are a big percentage of the false positives through AI detectors. So I guess the only way forward is really rewriting our modes of assessment," she said.

The ministry said the risk of AI cheating was one of the factors that prompted it and NZQA to drop reports as the method of assessment for 19 level one NCEA standards next year.

"Due to the administrative burden of submitted reports and concerns over authenticity, including the rapid advancement in AI tools, the ministry and NZQA have decided to discontinue their use as a method of external assessment in NCEA Level 1 from 2025," it said.

"The change in methods of assessment is intended to support schools and teachers in implementing NCEA Level 1 and address specific concerns over authenticity."

The ministry said schools were concerned about managing the use of generative AI for internal assessments.

"The ministry and NZQA recognise that this is a rapidly evolving environment which may require some re-thinking of assessment practice," it said.

"We are talking with schools about how assessment activities could be designed to minimise inappropriate use of generative AI, and alternative ways in which the authenticity of assessment responses could be verified."

It said NZQA had guidance for schools about the acceptable use of AI in assessments, including careful use of AI checkers or detectors.

"Be very mindful of false positives, and do not rely on this alone," it said.

It said schools should have clear assessment policies setting out those strategies and the processes to follow if a teacher suspected unauthorised use of generative AI.

The ministry said it was working with NZQA on further guidance for teachers and students on managing authenticity in NCEA and expected to engage with school leaders and teachers before the end of the year.