Years later, ACC discloses its tool to target long-term claimants
The Accident Compensation Corporation (ACC) has admitted it analyses client data to target claimants who stay on its books for longer, despite a Government official saying agencies were not rushing to adopt such techniques.
The Government chief information officer Colin MacDonald, who oversees state agencies' use of technology, told Stuff in July that he was "personally not aware of areas [Government agencies] where artificial intelligence is being looked at."
He said the Government had learned its lesson since the Ministry of Social Development (MSD) tried to use predictive risk modelling to target vulnerable children who were likely to be abused and intervene accordingly.
MacDonald said the Government did not tend to be an "early adopter" of technology tools such as predictive risk modelling.
* Children 'not lab-rats' - Anne Tolley intervenes in child abuse experiment
* CYF software study raised same ethical dilemmas as medical trials, academic says
* Decision close on controversial predictive risk modelling to replace decile funding for schools
"We believe it is better to be a fast follower in these areas than a leader."
The ACC publicly announced on Friday that it had used predictive risk modelling since 2014.
The Department of Internal Affairs said MacDonald could not comment at the time of publication because he was overseas.
ACC spokesman Chris Ritchie said the tool compared the details of 1.9 million claims it received each year to 364,000 claims lodged between April 2007 and May 2013.
The client information compared includes the initial injury claimed, the diagnosis, age, occupation, injury history, whether the client sought treatment at a District Health Board, and any delay in lodging the claim.
Ritchie said clients provided that information when they agreed to its privacy notice. The notice mentions a claims database but does not say that the ACC uses predictive risk modelling internally.
He said ACC staff used the data to estimate each client's injury recovery time so a case manager could call a client "proactively" "if they need help".
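ACC has not published details of its model, but the comparison process Ritchie describes can be sketched in a few lines. Everything below is invented for illustration — the field names, figures, and threshold are assumptions, and this is a toy averaging model, not ACC's actual system:

```python
# Illustrative sketch only: ACC has not published its model. This toy
# version mimics the process described in the article -- comparing a new
# claim's details against historical claims to estimate recovery time,
# then flagging long estimates for "proactive" case-manager contact.
# All field names and figures here are invented.
from collections import defaultdict
from statistics import mean

# Hypothetical historical claims: (occupation, injury_type, recovery_days)
historical = [
    ("builder", "back strain", 90),
    ("builder", "back strain", 120),
    ("teacher", "wrist fracture", 40),
    ("teacher", "wrist fracture", 50),
]

# Group historical recovery times by the features being compared.
by_profile = defaultdict(list)
for occupation, injury, days in historical:
    by_profile[(occupation, injury)].append(days)

def estimate_recovery(occupation, injury, default=60):
    """Estimate recovery time as the mean of matching past claims."""
    past = by_profile.get((occupation, injury))
    return mean(past) if past else default

def flag_for_contact(occupation, injury, threshold=80):
    """Flag claims whose estimated recovery exceeds a threshold."""
    return estimate_recovery(occupation, injury) > threshold

print(estimate_recovery("builder", "back strain"))  # 105
print(flag_for_contact("builder", "back strain"))   # True
```

A model of this shape is what critics mean by "averaging out": an individual is scored against the historical average for people who look like them on paper.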
International Association for Computing and Philosophy executive director Steve McKinlay said the ACC would be using the tool to target people who used its system for longer than necessary.
Referring to the MSD case, McKinlay said this was another example of the Government targeting vulnerable people.
"What the algorithm is doing is basically averaging out, based on data, and picking individuals or target groups of people that are going to be on their books for longer," McKinlay said.
"They would harass and harangue someone to get off ACC sooner than they should because their computer told them they should be off it by now."
ACC minister Michael Woodhouse said the information was anonymous and it was used to inform decision-making, not determine a client's financial entitlements.
He said it provided better customer service and he would be disappointed if his agency was not using it to free up resources.
"It points to a signal when you do take longer and might need extra support … [So] ACC can invest more in you."
Woodhouse said the ACC using such a system was "hardly a revelation", but the privacy commissioner was only briefed on it this week.
Privacy commissioner John Edwards said ACC's use of the tool, despite it not being announced to the public for years, did not concern him.
"I have not seen anything from ACC to suggest that this is discriminatory or arbitrary," Edwards said.
He said it would only become a worry if the ACC became over-reliant on it and used it to replace human decision-making.
"The issue is not in the predictive risk model, but how you use it."
McKinlay disagreed with Edwards. He said algorithms like the one the ACC was using to determine outcomes for people were inherently biased and discriminatory.
"There is no way that you can build these systems without some kind of bias. It is almost impossible."
Woodhouse said any biases in the system existed because it treated individual clients differently, not as generalisations.
"You call it targeting, I say it is a good customer service," Woodhouse said.
In July, MacDonald stressed that the Government should approach using predictive risk modelling with caution because the relationship between the state and the public was complex and fragile.
"The complexity of the relationship between the state and the citizen is such, that it [predictive risk modelling] does raise much bigger issues than it does typically for an individual private company.
"Government agencies do look at technology and they implement technology as best they possibly can while managing those risks. We will occasionally get things wrong. And when we get things wrong, we need to fess up, we need to front up and we need to fix it and move on in a way that keeps citizens comfortable that we are doing the right thing."