A controversial child safety tool is under review in NSW, after a similar algorithm was found to be “racially biased” and scrapped in Queensland.
But the NSW government said it wouldn’t be scrapping the Structured Decision Making (SDM) risk assessment and insisted it was different to the model used north of the border.
Jenkins said such tools were “likely to have high rates of false positives for Indigenous children”, worsening already major problems with the overrepresentation of Indigenous children in out-of-home care.
He said based on what had been found in Queensland, the system in NSW was “very likely” to also be “racially biased”.
“My hope is that they will see what’s taking place here in Queensland, and might consider subjecting their own risk assessment instrument to exactly the same type of test as we’ve applied here in Queensland to determine whether it’s racially biased,” he said, calling for the department to “have a look at that as a matter of urgency”.
A NSW Department of Communities and Justice spokesperson said its tool was different to that used in Queensland but did not answer 9news.com.au’s questions about how it differed or whether it was “racially biased”.
“DCJ are already working to make improvements to deliver more contemporary, equitable, fair and culturally safe assessment tools that will improve decision making and support better outcomes for children and families in NSW,” they said.
“The SDM tools and casework assessment processes in NSW are currently under review and updated SDM tools are expected to commence from mid-2023.
“An Aboriginal Engagement Plan has been developed to guide this work.”
The DCJ spokesperson said the NSW government had committed to reducing the rate of Indigenous overrepresentation in out-of-home care by 45 per cent by 2031.
While stressing he hadn't been given access to the NSW data as he had in Queensland, Jenkins said the SDM tool appeared to be effectively "exactly the same" as the one used in Queensland and many other jurisdictions.
Jenkins' issues with this algorithmic approach to child safety go beyond individual tools and the racial bias they can exhibit depending on their make-up and calibration.
He compares the process of assessing children based on others in similar situations, as opposed to looking at each case purely on its merits, to the science fiction movie Minority Report, in which people are accused of crimes before they happen.
“In other areas of public policy and other areas of law, that’s not legal, you can’t do that,” he said.
"So you can't, for example, if you're looking at employing somebody, say, 'Ok, well, rather than run a criminal history check on this applicant for this job, I'm going to use a set of demographic characteristics to determine whether or not they are likely to be a criminal offender, using their age, using their marital status, using the number of people who live in their household, that sort of logic.'
"That sort of prejudicial logic is not allowed in other areas of law.
"(But in) child protection, that's exactly the kind of logic that we use."
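To make the logic Jenkins is describing concrete, the sketch below shows a toy actuarial checklist of the kind he criticises: it scores a case from weighted demographic items rather than from the individual facts of the case. Every item, weight, threshold and data point here is invented for illustration (none of it comes from the actual SDM instrument), but it shows how a checklist built on demographic markers can flag one group of families far more often than another, even when no harm is present in either group.

```python
# Hypothetical illustration of an actuarial, checklist-style risk score.
# All items, weights and the threshold are invented for this sketch and
# do not reflect the real SDM instrument used in NSW or Queensland.

def risk_score(case: dict) -> int:
    """Sum weighted demographic items, the way an actuarial checklist does."""
    score = 0
    if case["parent_age"] < 25:
        score += 2          # youth of parent treated as a risk marker
    if case["household_size"] >= 6:
        score += 1          # larger household treated as a risk marker
    if case["single_parent"]:
        score += 2          # family structure treated as a risk marker
    return score

HIGH_RISK_THRESHOLD = 3     # cases at or above this score are flagged

# Two hypothetical populations of families in which NO harm actually occurs.
# Group B's demographics happen to correlate with the checklist items, so it
# is flagged (a false positive) far more often despite identical outcomes.
group_a = [{"parent_age": 32, "household_size": 3, "single_parent": False}] * 100
group_b = [{"parent_age": 22, "household_size": 6, "single_parent": True}] * 100

def false_positive_rate(group: list) -> float:
    flagged = sum(risk_score(c) >= HIGH_RISK_THRESHOLD for c in group)
    return flagged / len(group)

print(false_positive_rate(group_a))  # 0.0: never flagged
print(false_positive_rate(group_b))  # 1.0: always flagged, with no harm present
```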