Taxpayers may have been surprised to learn, through RNZ’s reporting, that details of up to half a million of them were being shared with Facebook, Google and LinkedIn several times a month for targeted advertising campaigns.
The advertisements were aimed at people with income tax, Goods and Services Tax, or student loan debts due, and those needing a Working for Families update.
Although the information given to the social media sites was scrambled using a technique known as hashing, questions have been raised about how much protection this really provides.
What happened to the information once someone clicked on an advertisement also seemed murky. There were fears it could then be used by the social media companies to train artificial intelligence systems.
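The security question is easy to illustrate. Hashing is a one-way scramble, but when the inputs are predictable, such as email addresses, anyone with a list of candidate addresses can hash each one and match it against the "protected" data. A minimal sketch (the sample addresses are invented, and SHA-256 is assumed here as it is the hash commonly required by ad platforms for customer lists):

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalise an email address (trim, lowercase) and hash it with
    SHA-256, the preparation ad platforms typically ask for."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# What an agency might upload instead of a plain-text address.
uploaded = hash_email("Jane.Doe@example.com")

# But hashing the same address always gives the same result, so a
# party holding a list of candidate addresses can simply hash each
# one and compare -- recovering the original by brute force.
candidates = ["john.smith@example.com", "jane.doe@example.com"]
recovered = [e for e in candidates if hash_email(e) == uploaded]
print(recovered)  # the matching address is identified
```

This is why critics argue that hashed personal identifiers are pseudonymous at best, not anonymous: the protection depends entirely on the recipient not already holding, or being able to guess, the underlying data.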
It was good to hear Inland Revenue has paused the practice and is reviewing it after several complaints following the airing of the story.
Better late than never, but this after-the-event behaviour illustrates the need for agencies to make privacy a core focus.
Privacy commissioner Michael Webster, in his briefing to incoming justice minister Paul Goldsmith, suggested such focus should be on a par with that for finance or health and safety.
We have had privacy law since 1993, so it is hard to understand why this remains so difficult, particularly for government agencies.
Another belated action has come from the Ministry of Social Development over using fake profiles on social media sites to spy on beneficiaries in fraud investigations.
It knew this was dodgy in 2017, paused it in 2021 after a question about it in Parliament, but has only now brought in a policy banning it.
The Privacy Commissioner was kept in the dark about the fake personas too.
We have previously been critical of the police’s lax attitude to privacy when smartphones were introduced. Thousands of photographs of people were taken, often on the flimsiest of pretexts.
The storage of these images was so lackadaisical the police have had difficulty complying with the Privacy Commissioner’s order to delete them.
The police’s exploration of facial recognition technology (FRT) over several years has not been a master class in transparency either.
Only now have they produced a policy for its use, but it has already drawn criticism because compliance will be audited internally rather than externally.
It was not reassuring to learn, again through RNZ’s reporting, that some police had been accessing two facial recognition websites which are the subject of official complaints in Australia and the United Kingdom and are not authorised for police use, even if this was only to see what the sites did rather than to use them.
Mr Webster has yet to announce whether he will introduce a biometrics code after broad support for proposals in draft rules following a consultation earlier this year. This code would cover rules for the use of technologies such as FRT.
While the current Privacy Act is only four years old, it is based on policies agreed in 2013 and does not cover such things as biometrics and AI, or account for new risks to children’s privacy, he told Mr Goldsmith.
Mr Webster is concerned we are increasingly out of alignment with like-minded countries which have prioritised privacy reform.
He says falling behind global privacy regulatory approaches could impact on the country’s technology sector and place in the global data economy.
Whether there will be much appetite from the coalition government to modernise the Act is not clear, and Mr Webster’s call for tougher penalties to encourage greater compliance with existing law has yet to find favour. Multimillion-dollar fines can be imposed in Europe and Australia.
Mr Goldsmith said he does not have current plans to increase the maximum $10,000 penalty for breaches, but it might be considered in future.
He points out the Privacy Commissioner can refer a complaint to the Human Rights Review Tribunal, which can award damages up to $350,000.