
Why the Privacy Commissioner has done an about-turn on facial recognition technology

1 Dec 2022

Author: Reweti Kohere

The Privacy Commissioner says a “strong case” exists for tighter regulation of facial recognition technology amid concerns that the Privacy Act 2020 alone is insufficient to protect the public. This is a marked shift from an initial position paper dated October 2021, in which the commissioner considered the Privacy Act could adequately protect individuals’ privacy rights when biometric data is collected and used, although the regulator reserved the option to take additional action if necessary.

Facial recognition technology (FRT) is ubiquitous overseas, used in smartphones, at airports and supermarkets, and by Police. But privacy concerns are mounting as FRT gains a foothold in Aotearoa/New Zealand. Accompanying its increased presence are worries about past misuses of established surveillance technologies, creepage and algorithmic bias.

The Office of the Privacy Commissioner (OPC) is working through about 100 submissions from a range of public and private stakeholders on its August 2022 biometric consultation paper, which states a “strong case” exists for further regulatory action to ensure the use of biometric information falls within “appropriate privacy protections”. The regulator, however, says it hasn’t “jump[ed] to conclusions”.

At the same time, the OPC has urged one of the country’s largest supermarket chains to be cautious as it trials the “privacy intrusive” technology in-store, prompting Foodstuffs North Island to carefully consider whether using facial recognition technology is a “necessary, proportionate and effective” response to harmful behaviour.

Regulatory options being considered include additional guidance, standards and principles, directives for government agencies, a specific code of practice and legislative change. The publicly available submissions include tech association umbrella group NZ Tech, which believes neither further legislative reform nor an industry code of practice is needed; privacy rights advocate Privacy Foundation, which is calling for a privacy code because the use of biometric information poses a “high level of risk” to people and society; and artificial intelligence member organisation AI Forum, which has offered to convene a working group to better inform any potential industry code.

The OPC plans to outline its preferred approach by the end of 2022 – a deadline still in play, a spokesperson says. “We will be providing an overview of the responses to our public engagement on biometrics as part of our report back, so we can show how the feedback received has shaped our response.”

Counselling caution

The regulator accepts concern is mounting about the use of technologies that recognise individuals based on their face, fingerprints, voice, eyes and other biological or behavioural characteristics. For the purposes of the Act, biometric information is “personal information” because it helps identify and verify individuals.

In the case of FRT, individuals are verified or authenticated by algorithms that analyse their facial features and find probable – rather than certain – matches when compared with digital templates of their faces, or “face prints”. Facial images may be collected at a distance, without people’s knowledge, and in public. They are considered sensitive biometric information because they go to the heart of people’s sense of identity.
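
A minimal sketch helps explain why matches are probable rather than certain. The code below is purely illustrative – the function names and threshold value are hypothetical, not any FRT vendor’s actual API. It compares a probe image’s feature vector against enrolled “face prints” and declares a match only when a similarity score clears a tunable threshold.

```python
# Purely illustrative sketch: names and the threshold are hypothetical,
# not a real FRT system. It shows why FRT produces probable rather than
# certain matches: identification rests on a similarity score against
# stored "face prints" crossing a chosen cut-off.
from __future__ import annotations
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Higher means the two feature vectors ("face prints") are more alike.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe: np.ndarray, face_prints: dict[str, np.ndarray],
               threshold: float = 0.6) -> str | None:
    """Return the identity of the closest enrolled face print,
    or None if no score clears the threshold."""
    best_id, best_score = None, threshold
    for identity, template in face_prints.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    # Even the "best" match is only probable: lowering the threshold
    # increases false matches, raising it increases missed matches.
    return best_id
```

That threshold trade-off is where inaccuracy and profiling risks enter the picture: every deployment must decide how many false matches it is prepared to tolerate.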

Biometrics are being used in more – and more diverse – ways. iPhone owners are already familiar with using FRT to unlock their smartphones (earlier versions relied on their fingerprints as keys). Facebook has used FRT to suggest “tags” of people in photos shared on the social media platform. International airports use automated self-service border control eGates, which photograph people’s faces and match them with their passport pictures.

Banks can rely on biometric information in setting up accounts or ensuring compliance with anti-money laundering laws. Scottish school canteens have controversially used FRT to speed up payment for lunches while pig farmers in China have used the technology to help improve porcine welfare. At the same time, the Chinese Communist Party is embracing FRT and artificial intelligence to spy on more than a billion of its people.

New Zealand supermarkets are the recent focus of homegrown concern. Just last week, consumer rights advocate Consumer NZ confirmed Foodstuffs North Island, the owner of Pak’nSave, New World and Four Square, was the only major retailer nationwide using FRT in 29 of its stores. “We are seriously concerned that New Zealanders are having their sensitive biometric information collected and analysed while they go about their shopping,” Consumer NZ chief executive Jon Duffy said. “These shoppers may not know it is happening or understand the potential consequences of their data being collected in this way.”

In a statement, the OPC said it recognised the company must take steps to keep customers and staff safe. “However, it is not clear to our office how facial recognition technology is going to achieve this. As a result, we have been counselling caution given the privacy intrusive nature of facial recognition technology and the inaccuracy and profiling risks involved.”

Need to inform 

The OPC has asked Foodstuffs to provide details of the 29 stores that use FRT to determine whether their use complies with the Act. “Any store using facial recognition technology must clearly inform customers about its use. Customers who are concerned about what these stores may hold on them should ask for access to this information. Customers who are concerned that their privacy has been breached or that their request for their information has been inappropriately denied should make a complaint to our office,” it said. It’s not the first time Foodstuffs’ use of FRT has been made public. In August 2020, New World Papakura hit headlines when customers were asked to remove their masks to enable FRT to capture their faces – a policy it later reversed.


Two years earlier, Foodstuffs admitted using the technology in “some” of its North Island stores amid revelations a man was mistakenly identified as a shoplifter at a New World supermarket in Dunedin. While that store didn’t use FRT at the time, it was using the Auror security system, which relies on images from existing CCTV cameras. In all three instances, Foodstuffs justified its use of FRT as helping to keep staff and customers safe and stem offending.


Misuse 

Law enforcement’s use of biometric information – to identify suspects, detect and investigate crime, and bolster the evidence supporting prosecutions – has attracted greater scrutiny here and overseas. There are worries that biometric technologies may further entrench existing biases in the criminal justice system and that some groups may be falsely targeted because of inaccurate algorithms. However, greater convenience, efficiency gains and more scalable identification can flow from FRT and other crime-prevention technologies if they are well-designed and used appropriately.

A year ago, as part of a plan to adopt independent experts’ recommendations on its current and potential uses of FRT, the Police chose to continue not using live FRT, which can identify multiple people in large crowds in real-time, until they fully understood the security, privacy, legal and ethical impacts. “FRT is a subject that draws strong interest and sometimes distrust and controversy along with it,” said Police deputy chief executive Mark Evans. “However, with this technology’s fast-paced development, there are also opportunities for more effective policing. Getting this balance right is imperative, and the review has given us clear guidance on the legal and ethical use of this technology.”

Commissioned by the Police, researchers Dr Nessa Lynch and Dr Andrew Chen concluded that current or imminently planned use of FRT is “limited and relatively low risk”; that the Police collect facial images in a variety of contexts, under different legislative requirements and for a range of purposes; and that those images, varying vastly in age and quality, are retained in separate systems, with little to no capability to combine the databases for wider comparison.

While privacy is an embedded principle within the organisation, the researchers recommended Police consider other rights and interests when assessing FRT’s impact, including the potential chilling effect of monitoring protests on freedom of expression and peaceful assembly, and the risk to the presumption of innocence if facial comparison systems are expanded to collect images of those who haven’t been convicted or charged. And the Police have misused even more established technologies, such as smartphone photography, CCTV and automatic number plate recognition.

The Independent Police Conduct Authority and the OPC in September 2022 found Police lacked general awareness of their Privacy Act obligations, which led to their routine and unlawful taking, using and retaining of photographs of Māori youth. The watchdogs’ joint report upheld three complaints from Māori whānau about Police officers photographing rangatahi without justification, finding the photographs were not necessary for a lawful policing purpose, consent hadn’t been properly sought, explanations were inadequate and, in one incident, Police had wrongly threatened arrest if consent wasn’t given.

Also in September, Police admitted to a second case of misusing number plate-reading cameras, flagging a car as stolen to trigger camera tracking in a Counties Manukau homicide investigation in 2022 – even though the car wasn’t stolen. More notably, the Police falsely reported cars as stolen to gain access to powerful databases that record number plates when hunting for the women whose travel sparked the Northland covid-19 lockdown in October 2021. A month earlier, the Privacy Commissioner had warned Police to do better when it came to complying with their privacy obligations in using automated number plate recognition (ANPR).


Creepage 

The Privacy Act regulates how organisations and businesses can collect, store, use and share individuals’ personal information through 13 privacy principles. Unlike other parts of the world, New Zealand’s law doesn’t rely on consent as the primary authority – compliance largely turns on the holder of the personal information having a legitimate purpose.

Principles 10 and 11 of the Act limit use and disclosure, respectively. State agencies collecting personal information for one purpose shouldn’t use or disclose it for any other purpose unless an exception applies. Agencies must believe on reasonable grounds that one of a set of exceptions exists to justify straying from their stated purpose and using information for another reason.

According to the Police’s policy on ANPR, number plate information comes from cameras the Police own and operate, cameras owned by other government agencies, and third-party operators of the security platforms Auror and SaferCities (such as businesses, councils and government agencies). SaferCities hosts nearly 5,000 cameras across 246 sites that some 4,000 Police officers can access via smartphones. The Police’s use of other organisations’ collected information is governed by information-sharing agreements, which spell out, among other things, the purposes underpinning use and disclosure.

It’s incumbent on third-party ANPR operators to consider whether one of the exceptions under principle 11 – typically to avoid prejudicing investigations of offences or to prevent serious threats to the public or individuals – is met to justify disclosure to the Police. Primary oversight of the ANPR policy rests with the Police’s Organisational Capability Governance Group, which ensures compliance checks take place, outdated number plate information is automatically “purged”, staff are properly trained and the policy remains fit for purpose.

The current ANPR policy has extended the length of time Police can retain number plate information. Storage deadlines for Police-owned and generated information are set at 60 days, six months or one year, depending on the type of investigation. The more time that has elapsed since the offence occurred, and the more serious the offence, the higher the level of authorisation needed for access. In 2014, on announcing the expansion of the then five-year-old Police ANPR technology, the organisation said “all information” was deleted after 48 hours.

The “creep” evident in the longer storage timeframes of surveillance cameras mustn’t happen with FRT, says Source Legal Partners founder Amy Kingston-Turner. “When you’re talking about number plates, it’s not quite so scary – it’s only number plates. But when you’re talking about people’s faces and when they can then run those faces in some other software that says who you are, you’d have to be really clear about what the data was collected for and use it only for those purposes.”

Our history is no better

One criticism levelled at FRT overseas is that it can double down on existing racial and gender biases. Research over the last two decades has exposed divergent error rates across demographic groups in the US – women of colour are affected the most by some of the technology industry’s leading systems mistaking their identities. A US federal study found facial recognition algorithms worked poorly for people of colour, the elderly, women, and children, while middle-aged white men generally benefited from the highest accuracy rates.

No studies have specifically looked at the accuracy rates of algorithms on New Zealand’s population, according to Lynch and Chen’s independent review. However, they considered Māori are likely to be the most affected by any expanded use of the technology, and that Police should consult further. “Given the challenges that FRT has faced with ‘darker skin tones’ globally, it is logical that these systems may produce more errors for Māori and Pasifika faces too,” they said.

Gallagher & Co managing partner Lloyd Gallagher, who convenes ADLS’s technology and law committee, says algorithms created overseas must be scrutinised here if they are to have relevance, although they are only as good as they are programmed. “We have to start training it from 2022, not from 1978 or earlier, because all of that skewed statistical bias is going to come into it. There’s no way to avoid it so it has to be clean.” AI must be trained as neutrally as possible, he says. “There’s a real tension and difficulty because every single programmer has a preconception about how the maths needs to be done. If we try to neutralise that by saying ‘OK, we’re not going to programme it, we’re going to let itself learn from our history’, then our history is no better. Our history is just as biased as what we could programme.”

Protect the future

Contemporary data, however, might be just as unhelpful. Gallagher says fear – “especially after covid” – is breeding racial bias in society. “We can see that in America, everyone became panicked. They were worried about gun violence, they were worried about going to school because they might get shot, they can’t go to a movie theatre without being shot… If you’ve got a fear around violence, then that’s going to translate into your fear about who you think is committing that violence.”

While a perfectly neutral AI isn’t possible, a system full of checks and balances might be the next best thing, he says. One algorithm could collect the data and flag issues, for instance, while a second AI could review them to filter errors. Starting from scratch might take time, but “maybe that’s the correct approach”.
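
As a conceptual sketch only – the two models below are hypothetical stand-ins, not anything Gallagher or the Police have built – the checks-and-balances idea amounts to requiring two independently trained systems to agree before a flag is acted on:

```python
# Conceptual sketch of the "checks and balances" idea described above.
# Both models are hypothetical stand-ins; nothing here reflects a real system.
from __future__ import annotations
from typing import Callable

def two_stage_review(records: list[dict],
                     flagger: Callable[[dict], bool],
                     reviewer: Callable[[dict], bool]) -> list[dict]:
    """Surface a record only when an independent second model agrees
    with the first, filtering out single-model errors."""
    flagged = [r for r in records if flagger(r)]
    return [r for r in flagged if reviewer(r)]
```

The design point is independence: if the two models share the same training data and the same biases, their agreement adds little protection.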

The idea of new technologies being used to suppress people is “terrifying”, Kingston-Turner adds, and highlights the necessity of a robust privacy regime. “It is definitely a space that needs careful consideration by our government now in order to protect our population in the future.” ■

 
