
Retailers Secretively Using Face Recognition to Spot “Persons of Interest” — Including For the Government

A man holding a child's hand and walking down a colorful grocery store aisle
Are Wegmans and other retailers participating in the Trump war on immigrants?
Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
January 20, 2026


The grocery store chain Wegmans, among other retailers, is using face recognition on its customers — and scanning their faces for resemblance not only to accused shoplifters but also to people whose photos have been submitted to the company by law enforcement.

In response to press coverage of the company’s use of face recognition in some of its stores, Wegmans said it uses face recognition to try to find “persons of interest,” who are “determined by our asset protection team based on incidents occurring on our property” — but also, “on a case-by-case basis, by information from law enforcement.”

Face recognition is an enormously powerful surveillance and tracking technology that continues to lack broad public acceptance and legitimacy (especially when used “on” people rather than “by” people, for example to unlock their own phones). It is unreliable, disproportionately inaccurate in evaluating the faces of Black people and members of other groups, and has been misused by companies and law enforcement alike, with at least 10 publicly reported cases of people, nearly all Black, suffering false arrests based on face recognition errors.

And the incorporation of “BOLO” (“Be On the Look Out”) alerts by companies on behalf of law enforcement has the potential to become — and may already be becoming — a powerful nationwide government surveillance dragnet. If enough companies deploy this technology in enough stores, it could become a mass surveillance machine able to locate anyone who sets foot in a wide variety of establishments anywhere in the United States. Do we want our government to have that much power? Will we the public even have a say in the matter?

The government wants to use surveillance systems for deportation
At the current dark moment in our nation’s history, any corporate partnership with law enforcement in a mass surveillance scheme raises the question: are you facilitating the Trump Administration’s trigger-happy, smash-and-grab war on immigrants? Are Wegmans or other companies running BOLO scans they receive from law enforcement for people that are wanted by federal immigration agencies, or by some of the many local police departments that are cooperating with those agencies?

These are reasonable questions because we know of another corporate mass surveillance system that has definitely been used by the Trump immigration agencies: the driver-surveillance company Flock. Flock runs a network of thousands of automatic license plate reader (ALPR) cameras across the nation and makes most of that data available to any local police department that wants to search it. Local police departments across the nation have been discovered carrying out searches of the data on behalf of the federal immigration agencies, and many towns and police departments using Flock have been very unhappy to discover that virtually any police officer in the nation can secretly hand the data on their residents over to ICE.

Given the heat that Flock is taking around the country over its use in the ongoing deportation drive, a retailer like Wegmans would no doubt deny any involvement in that drive — but how would we know whether to believe such a claim, given that companies are not being at all honest and transparent with Americans about their deployments of this technology? As we have seen, even some of America’s biggest companies have bent to demands from this lawless and vengeful administration. It’s also possible that a retailer like Wegmans wouldn’t even know when law enforcement BOLOs are connected to the “Trump Terror” deportation drive.

Wegmans says that they use information from law enforcement “for criminal or missing persons cases,” and of course immigration enforcement is a civil, not criminal, matter. But in an environment where immigration enforcement is being intertwined with criminal investigations and prosecutions, can we be at all confident that distinction will matter?

We don’t know how many other retailers are using face recognition on their customers. Various indications, however, suggest the practice is at a minimum growing, and potentially already somewhat common. Those indications include claims by vendors, a handful of confirmed uses such as by Lowe’s hardware stores and Madison Square Garden, and conversations I’ve had with people knowledgeable about the retail sector. In New York City, where local law requires companies to post a public notice if they are conducting biometric surveillance, several retailers appear to be using the technology, including, most prominently, Macy’s, as well as some local grocery store companies. We don’t know whether Macy’s also does searches for the government.

The most detailed information we have about a retail face recognition system comes from a Federal Trade Commission investigation of Rite Aid drug stores, which found numerous problems with the stores’ program and led to a five-year ban on the company’s use of face recognition. Like Wegmans’, Rite Aid’s scanning for “persons of interest” incorporated law enforcement BOLO photographs as well as photos of people accused of shoplifting by store staff.

Any nationwide retailer-government BOLO infrastructure is unlikely to have yet attained the scope of Flock’s driver-surveillance network — though we don’t really know because of the companies’ secrecy. Nevertheless, the potential for these creeping, shadowy deployments to evolve into such a network is very real.

A lack of openness and honesty with customers
Face recognition is highly controversial and lacks public acceptance and legitimacy, with at least 20 cities and 15 states having banned or restricted its deployment by police. Perhaps that is why Wegmans, like other stores that have been asked about their use of this technology, refuses to provide much detail about its use.

  • We only know about Wegmans’ deployment because the retailer has stores in New York City, which has a disclosure law.
  • Gothamist, which first reported the story, writes that “Wegmans representatives did not reply to questions about how the data would be stored” or “if it would share the data with law enforcement.”
  • The company also refused to tell a reporter from a television station which of its stores have deployed face recognition or what its data retention period is.
  • Such secrecy appears to be in line with other companies’. In 2018, we wrote to a list of 20 top U.S. retailers asking if they were using face recognition on their customers, and all but two of them refused to even answer. Similarly, when Gothamist contacted nearly 50 major retailers recently to ask whether they use facial recognition in their New York City locations, most declined to respond. In 2023 the FTC investigation found that “Rite Aid specifically instructed employees not to reveal Rite Aid’s use of facial recognition technology to consumers or the media.”

Such secretiveness is especially problematic considering that face recognition is a powerful, relatively novel, and very controversial technology that our society is still grappling with. People need to know how it’s panning out. Wegmans issued a statement after the Gothamist story but left many questions unanswered. Among them: What vendors do you use? How accurate have those vendors’ algorithms been found to be in the government’s studies? How biased? What percentage of your alerts are false positives? Do you even know? What are your procedures for identifying and handling false positives?

Wegmans declared, “We understand concerns about fairness and bias in facial recognition systems. We employ a multitude of training and safety measures to help keep people safe.” That vague statement is worth little. What, exactly, are those “measures”? Are they at all effective? Does Wegmans even know? We know from thorough government studies that the tech has biases; has Wegmans somehow solved that problem? If so, they should share their breakthrough with the industry. If not, then it remains the case that despite these mysterious “measures,” Black people will be falsely matched to faces in the watchlist database more often than white people.

It was also grimly amusing that the company declared in its statement that face recognition serves as only “one investigative lead for us,” because despite similar promises to that effect being made routinely by law enforcement, the inaccurate technology has been used repeatedly by police not as a lead but as a basis for arrest. And what does Wegmans even mean — that people won’t be barred from stores if they are the subject of a match with an accused shoplifter?

False matches and due process
A central problem with these systems is that they make mistakes, and that when they do, there’s no guarantee that people will be given due process or treated with elemental fairness and decency.

  • In New Jersey, a man shopping at a regional grocery chain describes being caught up in a face recognition system after he made a mistake at a self-checkout counter (which are often tricky in the best of circumstances) due to confusion over an advertised sale not being reflected in his bill. On a subsequent visit he was accosted by store security and treated like a criminal. The store would not let him see the checkout video; a right to confront the evidence against you before you suffer adverse consequences is a core part of due process.
  • Among a number of reported errors in the UK, an error in a face recognition system led to a man being falsely accused of shoplifting. He was eventually cleared, but staff told him that the video that cleared him was scheduled to be deleted three days later; had it been, he said, he wouldn’t have been able to prove that he was innocent. He may have faced a ban — lifetime, for all we know — from that store (B&M) and its affiliates and partners. The vendor blamed “human error,” but we’ve heard that one before; these are all techno-human systems, and the precise source of an error matters little to the victim.
  • In Michigan, a Black teenager was summarily ejected from a roller skating rink after a facial recognition camera misidentified her as a match to someone previously barred from the premises. The business defended its action by claiming that the software flagged her as a “97 percent match.” That’s a gross misrepresentation of the reliability of this technology, but the teen had no chance to contest her ejection before she was put out on the sidewalk without a ride home.

The consequences of such errors are likely to grow over time. Wegmans says they “do not share facial recognition scan data with any third party.” That is cold comfort, however. First, they don’t need to retain or share your biometrics to ban you from shopping or call the police on you if they think you’re a wanted criminal. Second, how would we know if they did share that data? That promise is probably unenforceable given the secrecy involved, as well as how the Trump Administration has been generally gutting consumer protection regulations. It would be essentially costless for the company to pool its blacklists with other retailers.

Create it and it will expand
If such sharing networks emerge — much as blacklists of “troublemakers” (i.e., labor organizers) were shared among companies in the 20th century — someone who is falsely accused might find themselves unjustly banned from a significant number of retail stores. That would be especially harmful for the most vulnerable people, who might for example live in one of our nation’s all-too-common food deserts, lack transportation options, and find themselves effectively shut out from shopping for groceries (and perhaps hardware and other goods too).

Once a BOLO face recognition infrastructure is in place, it would also be costless for companies to start expanding the purposes for which it is used. Wegmans says it’s only used for “keeping our stores safe and secure” (read: limiting our losses from theft), but at Madison Square Garden we’ve already seen the technology used as an abusive means of control, and it’s easy to foresee its use against labor activists, undercover journalists, politically disfavored enemies of the state, and who-knows-who-else. (We’ve discussed the troubling history and present reality of private blacklists in several other contexts.) The Rite Aid “persons of interest” database grew to an alarmingly large size; it had “at least” tens of thousands of people in its database, the FTC found. Certainly after years of ACLU litigation on behalf of innocent people caught up by government watchlists, we know that those have become enormously bloated. Such bloat just increases the chances that any given person subject to face recognition blacklist checks will find themselves falsely accused and potentially unfairly banned.

All of this might drive up profits for a company even at the cost of condemning a certain number of innocent customers to various rotten consequences, which a big retailer might just see as a “cost of doing business.”

Corporate secrecy is a problem for due process too. Who within these companies is permitted to add a person to a list — any low-level clerk? Is there any review of such a placement? Standards for who is listed? Does a company allow people to appeal their placement on a list? If so, what process does such an appeal involve? Do these companies keep people on their lists forever? Will an error by a troubled adolescent result in a lifetime ban? If not, for how long are they listed, and how is that decided? We don’t know.

The bottom line is that Wegmans, Macy’s, and any other companies that use face recognition against their customers are exposing them to the constant risk of being mistaken for a shoplifter or other lawbreaker — and again, it’s also possible that the law enforcement agencies with whom they are cooperating are in turn sharing information with ICE. Companies want to save money by condemning a certain number of their customers to the experience of a false accusation (because false positives occur in every system), while at the same time hiding that fact from their customers as much as possible. This should stop, and we customers should demand that it does.
