Australian hardware giant Bunnings has won the battle to monitor customers with artificial intelligence facial recognition technology to combat crime in its retail stores and reduce employee abuse.
Australian Privacy Commissioner Carly Kind ruled in 2024 that Bunnings breached privacy laws by scanning the faces of hundreds of thousands of customers without proper consent.
A review of that decision by the Australian Administrative Review Tribunal has now reached the opposite conclusion.
Bunnings has been fighting for two years against a Privacy Commissioner’s ruling that its use of facial recognition technology broke the law. (Getty Images: Daniel Pocket)
The tribunal’s ruling said the retailer did not violate the law by scanning customers’ faces, but found it must improve its privacy policies and better notify customers about its use of AI-based facial recognition technology.
Bunnings managing director Mike Schneider said in a statement: “We welcome the Administrative Review Tribunal’s decision regarding Bunnings’ past trial of facial recognition technology.”
“Our aim in piloting this technology was to protect people from violence, abuse, serious criminal activity and organized retail crime.
“The tribunal recognized the need for practical, common sense steps to keep people safe. It also identified areas where we did not get everything right, including our signage, customer information, processes and privacy policy, and we are open to that feedback.”
The decision could provide a legal framework for other retailers to follow Bunnings’ lead and embrace the use of AI to reduce crime risk in their stores.
How Bunnings used AI to record customer identities
Bunnings’ facial recognition technology was first used in one store during a two-month trial in November 2018.
Between January 2019 and November 2021, the use of AI technology was expanded to 62 other stores in New South Wales and Victoria.
According to the tribunal’s ruling, the technology was designed by Japan’s Hitachi and provided to Bunnings through a third party.
The retailer was solely responsible for operating an “enrollment database” containing biometric markers derived from customers’ faces, extracted from images captured by the stores’ surveillance cameras.
According to the tribunal’s ruling, the image database was stored on hard drives on a central Bunnings server in Sydney, with copies of the images held in memory on local servers at each store.
The facial scans were matched against a list of “registrants” whom the retailer said had committed, or were suspected of committing, theft or refund fraud, or who had threatened store staff or other members of the public.
The list sometimes numbered in the hundreds, according to the tribunal’s ruling.
In her 2024 determination, which followed a two-year investigation into the technology’s use, Kind found that customers were likely unaware their faces were being scanned.
At the time, Bunnings told investigators that if a person’s face did not match a “registered individual”, the collected data was automatically deleted, on average within 4.17 milliseconds.
Schneider also said that around 70% of incidents in Bunnings stores in 2024 were caused by repeat offenders, which he said justified the use of the technology.
The practice came to the attention of the Australian Information Commissioner’s Office (OAIC) when consumer advocacy group Choice revealed in 2022 that Bunnings, Kmart and The Good Guys were using facial recognition technology.
Following Choice’s reporting, all three stores discontinued the practice.
“We find that during the relevant period Bunnings was entitled to use [facial recognition technology] for the limited purpose of combating very serious retail crime and protecting employees and customers from in-store violence, abuse and intimidation,” the tribunal’s ruling, handed down on Wednesday, said.
“Key factors in our decision include, firstly, the extent of retail crime faced by Bunnings staff and customers and, secondly, the technical characteristics of Bunnings’ [facial recognition technology] system, which minimized privacy intrusions by permanently deleting collected sensitive information and limiting susceptibility to cyber attacks.
“As implemented, the [facial recognition technology] system limited the impact on privacy so that it was not disproportionate to the benefits of providing a safer environment for staff and customers in Bunnings stores.”
The OAIC said the tribunal’s decision showed Bunnings had not taken sufficient steps to properly manage the personal information of its customers, including those not listed as “registered individuals”, or to warn them that they were being monitored.
“[The] decision confirms that privacy law contains strong protections for individual privacy that are also applicable in the context of emerging technologies,” the OAIC statement said.
“We particularly welcome that this decision reaffirms a number of important interpretive positions taken by the OAIC, including that even the temporary collection of personal information by sophisticated digital tools constitutes collection under privacy law.”
“Australian communities continue to take privacy very seriously and are increasingly concerned about the challenges in protecting their personal information.”
‘Common sense prevails’: retail experts weigh in
Queensland University of Technology retail and consumer behavior professor Gary Mortimer said he supported the administrative tribunal’s decision.
“Retailers have a duty not only to keep their employees safe, but also to keep other customers safe and their inventory safe from loss and theft,” he said.
“They need to look at other ways to do it, and using high-tech, innovative computer vision [and] AI systems is the way of the future.”
Professor Mortimer also said he hoped other major Australian retailers would follow Bunnings’ lead and introduce facial recognition technology into their stores to improve safety and quickly identify potential crime risks.
“This kind of technology will become commonplace,” he said.
“It’s not just retail, it’s having a wider impact. I think about civil servants in service jobs who regularly encounter aggressive behavior. I think about tram, train and bus drivers who encounter abuse.”
Gary Mortimer said the court ruling could lead to other major retailers also using facial recognition technology. (Four Corners: Nick Wiggins)
Mr Mortimer said artificial intelligence also offers retailers some convenience in reducing potential crime.
“Humans don’t have to sit there and wait for something to happen,” he said.
“The AI system identifies if someone is loitering near the restroom and identifies if someone has put something in their pocket or hidden a product.
“Obviously there needs to be clearer signage, but perhaps that is something that was overlooked in [Bunnings’] initial implementation of this technology.
“After a certain period of time, the images are deleted. The images are never reviewed by a human. I think there needs to be more clarity and communication around the technology.”
