Retail Facial Recognition Comes Of Age

Written by Mark Rasch
Attorney Mark D. Rasch is the former head of the U.S. Justice Department’s computer crime unit and today serves as Director of Cybersecurity and Privacy Consulting at CSC in Virginia.
Some years ago, I demoed an ATM that had no card, no chip, no PIN and only a limited keyboard. The ATM used facial recognition software to identify me (after registration), so I only had to walk up to the machine, type in $20 from checking and, voila! Money dispensed. Assuming that everything works as promised and that facial recognition software is close to 100 percent accurate and reliable (more on this later), retailers should consider the legal, privacy and compliance issues related to biometrics before rushing in. Like all innovative technologies (from credit cards to loss prevention devices), it’s not clear yet whether consumers will embrace or reject the new technology, or how regulators will ultimately react.
The legal issues for biometric technology arise at each phase of its implementation. Capture. Enrollment. Storage and protection. Sharing. Comparison. Use. De-enrollment and purging. And that says nothing about the technical issues.
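The phases above can be sketched as a minimal data flow. This is purely illustrative: the class, the hand-made "embedding" vectors and the similarity threshold are assumptions for the sketch, not any vendor's actual system, which would derive templates from a trained face-recognition model.

```python
import math
from typing import Dict, List, Optional

def _cosine(a: List[float], b: List[float]) -> float:
    # Similarity between two feature vectors; 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

class BiometricStore:
    """Toy lifecycle: enrollment, storage, comparison, de-enrollment/purging."""

    def __init__(self) -> None:
        self._templates: Dict[str, List[float]] = {}  # storage phase

    def enroll(self, person_id: str, embedding: List[float]) -> None:
        self._templates[person_id] = embedding        # enrollment phase

    def compare(self, probe: List[float], threshold: float = 0.9) -> Optional[str]:
        # Comparison phase: return the best match above the threshold, if any.
        best_id, best_score = None, threshold
        for pid, template in self._templates.items():
            score = _cosine(probe, template)
            if score > best_score:
                best_id, best_score = pid, score
        return best_id

    def purge(self, person_id: str) -> None:
        self._templates.pop(person_id, None)          # de-enrollment/purging phase
```

Note that the legal questions in this article attach to each of these method calls: what consent preceded `enroll`, who can query `compare`, and whether `purge` is ever actually invoked.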
How do you get the image you are going to use for facial recognition? Not an easy question. Sure, if it’s an ATM or payment-card replacement, the person can voluntarily sit down and consent to a picture being taken. But what about passive capture? Setting up a camera in a store or elsewhere and taking images of those who walk in? Benetton recently announced it was testing (but had not deployed) a technology called EyeSee, a camera and facial recognition software deployed inside mannequins. The technology captures shoppers at eye level, and it can be used for loss prevention, trend analysis (what kinds of people are doing what types of things in the store) and, ultimately, identification of customers by comparison with other databases.
This type of “passive capture” is particularly problematic from a legal perspective. Although we may have convinced the consuming public that they have no “right to privacy” in their images while they are in the store or mall (outside bathrooms or dressing rooms), the concept of creating a database of individual actions and movements based on facial recognition software takes that privacy expectation to a new level.
There’s a fundamental difference between monitoring traffic and monitoring individuals. Do people in your parking lot know they are consenting to your capture of their license plate numbers (and images showing the number, race, gender and age of the occupants of their vehicles)? Once you add the possibility of facial recognition to “ordinary” capture devices (like theft-prevention cameras), you have converted the data into personally identifiable information (PII). So how do you get the image that matters? If you have your picture taken at Costco for its membership card, are you consenting to the chain’s use of that image for facial recognition and tracking?
The next issue is enrollment. How do you link a captured image to a specific person? Again, people can voluntarily enroll—like those credit cards that have pictures on them. Or they can be forced to enroll—like a person who is arrested for shoplifting, has a picture taken and then is banned for life not only from the individual store but from all of the chain’s stores and affiliates. Stores use facial recognition software to create a nationwide database of such “banned” persons and to enforce the ban. Was consent required? Most likely not.
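The chain-wide ban scenario amounts to screening every entrant against a shared watchlist of face templates. A hedged sketch of that check follows; the incident identifier, the toy vectors and the distance threshold are all illustrative assumptions, not any real deployment.

```python
import math
from typing import Dict, List, Optional

# Hypothetical shared watchlist: templates enrolled without the subject's
# consent (e.g., from a shoplifting arrest photo), keyed by an incident id.
BANNED: Dict[str, List[float]] = {
    "incident-1042": [0.90, 0.10, 0.40],
}

def euclidean(a: List[float], b: List[float]) -> float:
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def screen_at_door(probe: List[float], max_distance: float = 0.15) -> Optional[str]:
    """Return the incident id of the closest banned template, or None.

    The threshold choice is the crux: too loose and innocent shoppers are
    falsely flagged; too tight and the ban is unenforceable.
    """
    incident, template = min(BANNED.items(), key=lambda kv: euclidean(probe, kv[1]))
    return incident if euclidean(probe, template) <= max_distance else None
```

A probe close to the enrolled template (say `[0.91, 0.12, 0.38]`) would trigger the ban, while an unrelated face would pass; the legal question is that the person flagged never consented to being in `BANNED` at all.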