People have a right to privacy and freedom of assembly
Amnesty International Aotearoa New Zealand welcomes new report into the use of facial recognition technology.
Earlier this year the Police commissioned an independent report into the current and potential use of facial recognition technology (FRT) in policing in Aotearoa New Zealand. The report, by Dr Nessa Lynch and Dr Andrew Chen, has now been released, with its first recommendation calling for the continued pause on any consideration of live automated FRT.
Amnesty International Campaigns Director Lisa Woods says effective and robust regulation is crucial.
"The capability of this technology means there is significant potential to breach human rights, and examples from overseas clearly illustrate how it can threaten people’s rights to privacy and peaceful assembly. These are fundamental rights that everyone should be free to enjoy. There is also a significant risk that this technology will exacerbate systemic racism and disproportionately impact people of colour."
Lisa Woods, Campaigns Director, Amnesty International Aotearoa New Zealand
She says Amnesty International supports the report’s call to uphold Te Tiriti in partnership with Māori.
"We would like to see the Police partner with Māori to determine next steps following this review, and proactively seek input from communities likely to be disproportionately impacted by this technology. Concerns about FRT’s discriminatory impacts are well recognised, and the report states that Māori are likely to be the most affected by any expanded use of FRT or implementation of live automated FRT."
Woods adds that robust systems to scrutinise the use of facial recognition technology are urgently needed.
"The Police have stated they will now develop a Response Plan, which will include developing systems and protocols to analyse FRT’s privacy, ethical, security, legal and human rights implications. We, along with others, have been calling for improved scrutiny; it is urgently needed, and we hope it will happen in a swift and robust manner."
The report notes, "Live automated FRT/live biometric tracking is a high-risk activity which can have significant impacts on individual and societal interests. Its use is also likely to impact significantly on over-represented communities and vulnerable adults and youth".
The software has higher rates of false positives for people of colour, creating a greater risk of misidentification and, as a result, wrongful arrest. Even when it "works", it can exacerbate discriminatory policing and, by acting as a tool of mass surveillance, prevent the free and safe exercise of peaceful assembly.