Time for a Face-Off: The Privacy Implications of Facial Recognition Technology in Supermarkets

By Emma Hooton

Introduction

Facial recognition technology remains a contentious subject across jurisdictions, as it forces a difficult balance between efficient identification and the invasion of privacy rights. The issue has reared its head following an announcement from Foodstuffs North Island that, from 8 February 2024, it would begin trialling facial recognition technology (FRT) in 25 of its stores.[1] The decision was a response to a consistent increase in retail crime, with 4719 incidents recorded in the October–December 2023 period.[2]


This piece will discuss what FRT is and the risks associated with its use. It will then critically examine what the Privacy Commissioner plans to do in response, and how New Zealand's position compares with other jurisdictions.


What is FRT?

Facial recognition technology forms part of a broad range of “biometric technologies”. These systems process individual bodily characteristics, such as a person's face, fingerprints, voice, or gait, to verify or establish their identity.[3] They are pervasive in many areas of our lives: common examples include using your face or fingerprint to unlock your phone, as well as applications in border security and policing.[4] Given their speed and claimed accuracy, these technologies have provided convenience, particularly in security-sensitive settings. However, a balance must be struck between convenience and privacy.
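
To make the mechanics concrete, the sketch below illustrates the basic one-to-many matching step at the core of most FRT systems: a captured “probe” face is reduced to a numeric embedding and compared against a watchlist of enrolled embeddings. This is a minimal illustrative sketch only, not any vendor's actual system; the function names, the 128-dimension embeddings, and the 0.6 threshold are all assumptions for illustration.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two face embeddings (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return the enrolled identity whose embedding best matches the
    probe, or None if no comparison clears the decision threshold."""
    best_id, best_score = None, threshold
    for identity, enrolled in watchlist.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# Toy usage: real embeddings come from a trained neural network,
# not random vectors; these stand in purely for demonstration.
rng = np.random.default_rng(0)
watchlist = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
probe = watchlist["person_a"] + rng.normal(scale=0.1, size=128)  # a noisy re-capture
print(match_against_watchlist(probe, watchlist))  # expected: person_a
```

The threshold is where the privacy trade-off lives: lowering it catches more genuine matches, but it also produces more false matches against innocent shoppers.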


What are the risks associated with FRT use?

FRT has been adopted worldwide, and legislative developments have struggled to keep up. It is important to consider whether the use of this technology can be justified, given the particular sensitivity of the information, the fact that it would increase the level of surveillance the public is subject to, and its potential for discriminatory bias.


1)    Sensitivity

Biometric information is inherently sensitive as it is “unique to an individual and intimately connected to who you are.”[5] Capturing a person's likeness in a public setting for security purposes engages physical privacy, informational privacy, and surveillance privacy all at once, impacting a broad range of privacy interests simultaneously. Despite the heightened sensitivity of this information, the Privacy Act 2020 provides no additional legislative safeguards.[6] The Privacy Commissioner has, however, clearly stated that agencies holding sensitive information will be held to a higher standard of accountability.[7] Regardless of any such standard, a security breach could have high-stakes consequences for individuals, such as identity theft, stalking, or harassment.[8] A password can be changed in the event of a breach; a face cannot.[9]


2)    Surveillance

In a world of constant interconnectedness, the need for a sphere of private life is increasingly important. The concept of privacy has long emphasised that “the intensity and complexity of life” require an area to be preserved where people can retreat from the world, a so-called “right to be let alone”.[10] Increased surveillance threatens this ability to retreat from the world, arguably impedes personal dignity, and minimises control over personal information.


An additional issue is the potential for “function creep”, where a system's use moves beyond its original purpose in ways unforeseen by its developers, users, or the wider public.[11] For example, businesses may come to use FRT data for marketing rather than merely security.[12] Because this information has clear commercial value, and because function creep often occurs without the public's knowledge or consent, it is particularly concerning.


3)    Discriminatory Bias

Even the most accurate software systems are susceptible to error, and those errors are not evenly distributed. Research has shown that false matches in FRT systems disproportionately affect people of colour.[13] In a study by Buolamwini and Gebru, three large commercial gender classification systems were analysed, and all performed better on male faces than female faces and better on lighter faces than darker ones. Lighter-skinned male faces had error rates of 0.0–0.3 per cent, while darker-skinned female faces had error rates of 20.8–34.7 per cent.[14] Such a disparity is unjustifiable and can exacerbate existing inequalities. This is a particular problem where the information is used in a criminal prosecution capacity, as it may heighten the existing overrepresentation of certain demographic groups in the justice system.[15]
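
The arithmetic behind such disparities is worth spelling out, because an impressive headline accuracy can coexist with a subgroup that fails routinely. The sketch below uses invented counts (only the 0.3 and 34.7 per cent endpoints echo the Gender Shades figures; the group sizes and intermediate rates are assumptions for illustration) to show how a benchmark skewed toward lighter faces can report a low overall error rate while darker-skinned women are misclassified more than a third of the time.

```python
# Invented evaluation counts, skewed toward lighter faces as many
# historical benchmarks were; only the 0.3% and 34.7% endpoints
# echo the published Gender Shades error rates.
groups = {
    # group: (faces tested, faces misclassified)
    "lighter_male":   (3000, 9),    # 0.3% error
    "lighter_female": (2500, 75),   # 3.0% error
    "darker_male":    (400, 48),    # 12.0% error
    "darker_female":  (300, 104),   # 34.7% error
}

total_faces, total_errors = 0, 0
for group, (n, wrong) in groups.items():
    print(f"{group:>15}: error rate {wrong / n:.1%}")
    total_faces += n
    total_errors += wrong

# The aggregate figure looks respectable even though one subgroup
# is misclassified more than a third of the time.
print(f"{'overall':>15}: error rate {total_errors / total_faces:.1%}")
```

An audit that reported only the 3.8 per cent aggregate would hide exactly the failure mode the study exposed, which is why subgroup-level reporting matters in any procurement or compliance review.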


What does the Privacy Commissioner plan to do about it?

The Privacy Commissioner has been following the supermarket FRT trial closely and has launched an official inquiry under the Privacy Act 2020 to monitor how stores run the trial and whether it complies with the Act.[16] Furthermore, an exposure draft of a Biometrics Code was released for comment on 10 April 2024, asking people to consider when agencies should be allowed to use biometrics (proportionality), what people should be told when biometrics are collected (transparency), and what limits should apply to biometric use.[17] The Privacy Commissioner has noted that, given the “cross-cutting issues raised by biometrics” and the uniquely personal nature of the information, legislative amendment may also be needed.[18]


While the Code will offer benefits by providing a more prescriptive framework, its effectiveness relies on companies' willingness to comply.[19] Corporate actors often focus narrowly on legal compliance, but for discernible change, companies should aim to create a consistent environment in which a privacy relationship of goodwill is fostered between the company and the consumer.[20]


How does this compare with other jurisdictions?

Other jurisdictions have faced similar challenges with the increased use of FRT and minimal legislative protection for people subject to it.


In Australia, biometric information is included in the sensitive information category, providing greater protection.[21] Furthermore, unless a specific exception applies, public and private agencies can only collect biometric information where an individual consents and the information is reasonably necessary for the agency's functions or activities.[22] There has been considerable backlash regarding the use of FRT in commercial settings – particularly concerning Kmart and Bunnings in 2022.[23] The government has invested significantly in FRT infrastructure, but an enabling bill has yet to be introduced.[24]


In the United Kingdom, biometric data is defined as personal data resulting from specific technical processing that allows the unique identification of an individual.[25] There is a general prohibition on processing special categories of data, which include biometric data, unless a specific exception applies (limited to situations of consent, necessity, public interest, health, and safety).[26] The House of Lords Justice and Home Affairs Committee has recently questioned the legality of police use of Live Facial Recognition Technology, expressing particular concern that its use is being expanded “without proper scrutiny and accountability.”[27] In a letter to the Home Secretary, the Committee outlined the need for a legislative framework to deal with the current situation, as well as the need to future-proof that framework in light of rapid technological advancement.[28]


Conclusion

Privacy rights are fundamental, and any encroachment on them should be examined critically and thoroughly. The rising use of FRT is of particular concern given the sensitivity of the information, the increased surveillance it creates, and the potential for discriminatory bias. The lack of legislative protection leaves New Zealand exposed to gaps in the law as technology advances. However, the inquiry into the Foodstuffs FRT trial will serve as a valuable case study that will hopefully enhance awareness of the issue and encourage the development of a comprehensive biometrics code. At the end of the day, technology is a value-neutral tool; it gains moral value through how we as a society choose to use it.[29]


[1] Foodstuffs North Island “Foodstuffs North Island begins trialling facial recognition in select stores as part of its commitment to keep teams and customers safe by keeping previous offenders out” (8 February 2024) Foodstuffs North Island <https://www.foodstuffs.co.nz/news-room/2024/Foodstuffs-North-Island-begins-trialling-facial-recognition-in-select-stores>.

[2] Above n 1.

[3] Ana Valdivia, Júlia Corbera Serrajòrdia and Aneta Swianiewicz “There is an elephant in the room: towards a critique on the use of fairness in biometrics” (2023) 3 AI and Ethics 1407 at 1407.

[4] At 1407.

[5] Office of the Privacy Commissioner “Biometrics and Privacy” (23 November 2023) <https://www.privacy.org.nz/publications/guidance-resources/biometrics-and-privacy/>.

[6] Privacy Act 2020, s 3.

[7] Office of the Privacy Commissioner “Sensitive personal information and the Privacy Act 2020” (16 December 2021) <https://privacy.org.nz/assets/New-order/Your-responsibilities/Privacy-resources-for-organisations/Sensitive-Personal-Information-and-the-Privacy-Act-2020.pdf>.

[8] Hafiz Sheikh Adnan Ahmed “Facial Recognition Technology and Privacy Concerns” (21 December 2022) <https://www.isaca.org/resources/news-and-trends/newsletters/atisaca/2022/volume-51/facial-recognition-technology-and-privacy-concerns>.

[9] Gary KY Chan “Towards a calibrated trust-based approach to the use of facial recognition technology” (2021) 29(4) International Journal of Law and Information Technology 305 at 307.

[10] Samuel Warren and Louis Brandeis “The Right to Privacy” (1890) 4 Harv L Rev 193 at 195-6.

[11] Bert-Jaap Koops “The Concept of Function Creep” (2021) 13(1) Law, Innovation and Technology 29 at 37-38.

[12] Chan, above n 9, at 307.

[13] Nicol Turner Lee and Caitlin Chin “Police Surveillance and Facial Recognition: Why data privacy is imperative for communities of colour” (paper presented to American Bar Association Antitrust Spring Meeting, Washington DC, 8 April 2022).

[14] Joy Buolamwini and Timnit Gebru “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification” (paper presented to Conference on Fairness, Accountability and Transparency, New York, 2018) 1 at 8.

[15] Office of the Inspectorate The Lived Experience of Women in Prison (Department of Corrections, October 2021).

[16] Privacy Act 2020, s 17(1)(i).

[17] Office of the Privacy Commissioner “Kiwis asked to have their say on new draft rules using biometrics” (10 April 2024) <https://www.privacy.org.nz/publications/statements-media-releases/kiwis-asked-to-have-their-say-on-new-draft-rules-for-using-biometrics/>.

[18] Radio New Zealand “A ‘lack of consequences’ for managing personal information poorly – privacy commissioner” Radio New Zealand (online ed, Wellington, 3 April 2024).

[19] Interview with Joshua Yuvaraj, Senior Lecturer, University of Auckland (Emma Hooton, the author, 19 April 2024).

[20] Above n 19.

[21] Privacy Act 1988 (Cth), s 6.

[22] At sch 1, Australian Privacy Principle 3.3(a).

[23] Josh Taylor “Bunnings and Kmart halt the use of facial recognition technology in stores as privacy watchdog investigates” The Guardian (online ed, London, 25 July 2022).

[24] Christopher O’Neill “Disaster, facial recognition technology, and the problem of the corpse” (2024) 26(3) New Media & Society 1333 at 1335.

[25] Data Protection Act 2018, s 5(1).

[26] At s 10(1).

[27] Letter from Baroness Hamwee (Chair, House of Lords Justice and Home Affairs Committee) to Rt Hon James Cleverly MP (Home Secretary) regarding the use of Live Facial Recognition Technology by police forces in England and Wales (26 January 2024) at [5].

[28] At [5] and appendix.

[29] Interview with Joshua Yuvaraj, Senior Lecturer, University of Auckland (Emma Hooton, the author, 19 April 2024).

The views expressed in the posts and comments of this blog do not necessarily reflect those of the Equal Justice Project. They should be understood as the personal opinions of the author. No information on this blog will be understood as official. The Equal Justice Project makes no representations as to the accuracy or completeness of any information on this site or found by following any link on this site. The Equal Justice Project will not be liable for any errors or omissions in this information nor for the availability of this information.