
Opponents of a law change fear it will create a legal loophole allowing the use of artificial intelligence to cut benefits and impose sanctions on beneficiaries.
But the Ministry of Social Development (MSD) says it does not plan to use generative AI or automated decision making in that way.
A new clause in the Social Security Amendment Bill, which has passed its first reading, vastly expands the range of decisions that can be made by automated systems to include decisions about sanctions.
Cabinet has agreed to introduce a suite of new obligations and sanctions for job seekers this year.
The Salvation Army warns that if the door opens for AI to decide benefit sanctions, it cannot easily be closed.

"The bill will make things a lot worse for those that we support, it will push a lot more people into poverty. It will create a lot more barriers and difficulties for those that we support to be able to access the help that they need."
A clause expanding the use of AI opened the door for the ministry to automate decisions relating to sanctions, she said.
"If you open this legal loophole here, how far can you take it?
"AI is artificial, but it's not that intelligent - the intelligent part is still being worked out, but because of that there are decisions that could be made that are unfair, that does marginalise those who are on the fringes of society and makes things a lot worse."
The Salvation Army and Law Society have called for the clause to be scrapped in submissions to the select committee considering the bill.
The Salvation Army's submission stated that automatic decision-making "cannot account for the complexities we often see in the individuals we support", such as financial hardship, addictions, mental health issues or unstable living conditions.
"Automated systems risk making inappropriate decisions that could increase hardship for those we support," it said.
MSD said it planned to use basic AI to check whether people applying to renew their Jobseeker benefit had met their obligations, but automated decision-making would not be used to decline applications.
Under the proposed changes, people on this benefit would have to reapply every six months instead of annually.
Ika said that would increase the ministry's workload, but those decisions should be made by humans.
"We understand the argument around efficiency - but efficiency will come at a cost and for us that cost is increasing poverty, making things a lot more difficult for those that we support."
The Law Society wants the clause allowing the expanded use of AI to be dropped entirely.
"This raises significant concern about how the use of automated systems will apply where the sanctions provisions involve some form of evaluative judgement, for example those relating to money management and community work."
Its submission stated there were not enough safeguards in place and it is concerned a standard developed for MSD in 2022 would not be an effective safeguard under the new legislation.
"Such laxity is concerning where automated systems may now be used for the implementation of sanctions and punitive restrictions. It appears the standard may no longer operate as an adequate safeguard, if these amendments proceed."
MSD said it was reviewing the standard in consultation with the Privacy Commissioner. Deputy chief executive for organisational assurance and communications Melissa Gill said automation was used to improve the efficiency and consistency of decision-making in the welfare system.
One example was the Winter Energy Payment which, if processed manually, would take 600 staff two months, she said.
However, generative AI was not yet being used, she said.
"It is important to note that MSD's current automated decision making use does not include generative AI, a type of AI that includes tools such as ChatGPT. MSD doesn't use generative AI or automated decision making to make decisions around obligation failures or sanctions, and does not plan to."
MSD wanted to make sure it used such technology carefully and responsibly, she said. The Automated Decision Making Standard provided a framework for good practice and included specific safeguards, such as a requirement that beneficiaries receive their correct entitlement and are not discriminated against.
"We now make sure any new automated decision making process is designed in a way that is consistent with the standard, so that we can check to make sure it is operating as intended."
The select committee is due to report back to the House no later than April 22.