Two high-achieving students at separate high schools claim they were wrongly accused of cheating and flunked crucial assessments after teachers judged artificial intelligence had completed their work.
But both girls are adamant their work is their own, with one parent describing the use of AI by teachers in the grading process as “Russian roulette”.
Experts also warn the use of artificial intelligence to detect plagiarism is dangerous and has potential legal implications, while the New Zealand Qualifications Authority (NZQA) emphasises that human teacher judgment remains the best assessment tool.
ChatGPT, released in November last year, is an artificial intelligence chatbot that generates text.
The students failed written assessments after teachers at Cambridge High School and Pukekohe High School flagged an issue with their work and used AI tech for a second opinion.
The Year 13 student at Pukekohe High claims her teachers also dismissed her when she tried to show them the background work she had done to complete the assignment.
“[The teacher] said you’ve definitely used AI and then they left it at that, so my mum had to ask for the meeting,” the student told the New Zealand Herald.
In the meeting, as she tried to show her notes, pictures of the board, proofreading and other evidence, the student claimed the teacher repeatedly said, “AI can generate all of that too”.
“It just really put me down because they didn’t even want to hear me out or anything,” the student said. “I can’t lie, I cried a lot.”
The student, who wants to study medical imaging at university next year, is now concerned about how a failed grade because of alleged cheating might affect her chances of being accepted into the course.
Her mum said her daughter is heartbroken at the allegation.
“She is a good student, she’s never been in trouble, never been called into question ever for anything,” the mum said. “She studies really hard and she’s always had merits and excellence on all exams and internals.”
Pukekohe High School principal Richard Barnett said teachers speak to students about their level of understanding to see whether it matches the understanding shown in the finished assignment.
Barnett said his staff, who “know their students individually”, make judgments on which work needs additional scrutiny.
Cambridge High School principal Greg Thornton also said: “We review each student’s work using teacher judgment. Only the scripts that are not considered to potentially be a student’s own work are further investigated.”
Thornton said teacher judgment involved looking at the students’ previous work and their “overall performance” in class.
The Cambridge student’s mother told the Herald her daughter has never failed an NCEA assessment before.
“She got endorsed with excellence last year,” she said, questioning why her daughter’s work was flagged.
“People take [the software] as gospel and it’s not. It’s full of falsehood and made-up information.
“My issue with it all is that if this is how schools are checking work it’s just ridiculous when we know that ChatGPT - everything I’ve read about it says that it’s unreliable, it’s inconsistent.”
The mother described the use of AI technology to help grade school work as like “Russian roulette”.
Thornton said he wouldn’t comment on the specific case at his school, but said he was “engaging with [the student] in accordance with our concerns and complaints procedure”.
“We are in line with NZQA expectations to follow principles of natural justice when investigating possible cases of plagiarism,” Thornton said.
NZQA advised schools about the use of artificial intelligence in February - noting the arrival of ChatGPT was ringing alarm bells in the education sector.
“At worst, the chatbot can produce quality essays, reports, etc on any topic, which might escape detection by regular plagiarism checks and be passed off as the student’s work,” NZQA said.
NZQA lists three programmes schools could use to detect plagiarism - AI Writing Check, AICheatCheck and Turnitin - which have already been used in schools and universities.
But NZQA also said it could be “nigh on impossible” to stop students from using AI tech.
NZQA deputy chief executive of assessment Jann Marshall said the authority does not tell schools exactly how they should check whether work is authentic.
“While AI detection tools are developing quickly and can be useful, they are not infallible,” Marshall said.
Educational technologist Frances Valintine told the Herald the education sector is entering dangerous territory if education authorities begin accusing people of using AI to cheat when detection is not always accurate.
“There is no 100 per cent certainty on any tool whatsoever and nobody is claiming to have that magic tool,” Valintine said.
She said the accusations present a “significant legal challenge” in countries that are banning the technology completely, not just in schooling.
“There is no way of validating or verifying unless someone has literally seen it and is able to go back to your search and find the use of GPT unless you don’t clear your history,” Valintine said.
“But if you went in and there was no history and someone saying, ‘I didn’t do it’, you’d be a very brave person [to] turn around and say on a legal constraint or a legal kind of territory, ‘I think this is plagiarised and therefore you have not passed’.”
Principal investigator of the Research on Academic Integrity in New Zealand (Rainz) Project and University of Auckland Associate Professor Jason Stephens has researched cheating for two decades.
He said with AI tools now readily available, “the opportunity for cheating and all manner of misconduct is seemingly endless”.
He said that to combat the risk of inauthentic work in schools, students must document their work and their working process.
“The problem we face as educators is that AI tools seemed to have arrived overnight. This isn’t just a technological development, it’s a technological revolution,” Stephens said.
“In education, this means rethinking not only how we teach and assess learning, but what we are teaching and assessing for.”
In a recent survey of more than 5000 tertiary students at seven New Zealand universities, conducted by the Rainz project under Stephens, 13.9 per cent of students indicated they had used an artificial intelligence tool to complete academic work.
“Given that we conducted this survey last year, before the release of ChatGPT, we can expect this number to increase in the year ahead,” Stephens said.
Valintine and Stephens also spoke about harnessing AI as an ally for students and teachers alike, rather than working against it.
When the Herald asked ChatGPT directly if it could detect whether work was plagiarised, the chatbot offered this response: “I can certainly help detect plagiarism in a piece of writing.”
“However,” it continued, “it’s important to note that I don’t have access to every possible source of information, so my analysis might not be exhaustive. Additionally, while I can flag passages that match other sources, it’s ultimately up to a human evaluator to determine whether or not the similarities constitute plagiarism.”
Independent testing by Herald reporters - running personal essays, student-written essays and AI-generated essays through the detection tools - returned inconsistent results.
Many essays that were not AI-generated came back as “likely” to be bot-created, and it was relatively easy to trick the checkers into believing AI-written essays had not been produced by a bot.