Few Rules Govern Police Use of Facial-Recognition Technology

Groups call for Amazon to stop selling facial-recognition tech to police departments after documents reveal the practice.

They call Amazon the everything store, and on Tuesday the world learned about one of its lesser-known but more provocative products. Police departments pay the company for facial-recognition technology that Amazon says can “identify persons of interest against a collection of millions of faces in real-time.”

More than two dozen nonprofits wrote to Amazon CEO Jeff Bezos asking him to stop selling the technology to police, after the ACLU of Northern California published documents that shed light on the sales. The letter argues that the technology will inevitably be misused, accusing the company of providing “a powerful surveillance system readily available to violate rights and target communities of color.”

The revelation highlights a key question: What laws or regulations govern police use of facial-recognition technology? The answer: more or less none.

State and federal laws generally leave police departments free to, for example, search video or images collected from public cameras for particular faces. Cities and local departments can set their own policies and guidelines, but even some early adopters of the technology haven’t done so.

Documents released by the ACLU show that the city of Orlando, Florida, worked with Amazon to build a system that detects “persons of interest” in real time using eight public-security cameras. “Since this is a pilot program, a policy has not been written,” a city spokesperson said, when asked whether there are formal guidelines around the system’s use.

“This is a perfect example of technology outpacing the law,” says Jennifer Lynch, senior staff attorney at the Electronic Frontier Foundation. “There are no rules.”

Amazon is not the only company operating in this wide-open space. Massachusetts-based MorphoTrust provides facial-recognition technology to the FBI and also markets it to police departments. Detroit police bought similar technology from South Carolina’s Data Works Plus for a project that scans gas-station footage for violent offenders.

The documents released Tuesday provide details about how Orlando and the sheriff’s department of Washington County, Oregon, use Amazon’s facial-recognition technology. Both had previously provided testimonials about the technology for the company’s cloud division.

Orlando got free consulting from Amazon to build out its project, the documents show. In a prior testimonial, Orlando’s chief of police, John Mina, said that the system could improve public safety and “offer operational efficiency opportunities.” However, a city spokesperson told WIRED, “This is very early on, and we don’t have data to support that it does or does not work.” The system hasn’t yet been used in investigations, or on imagery of members of the public.

Washington County uses Amazon’s technology to help officers search a database of 300,000 mug shots, using either a desktop computer or a specially built mobile application. Documents obtained by the ACLU also show county employees raising concerns about the security of placing mug shots into Amazon’s cloud storage, and the project being perceived as “the government getting in bed with big data.”
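The documents don’t detail how the county’s system is wired together, but Amazon’s publicly documented Rekognition face-search API gives a sense of the mechanics. Below is a minimal sketch of a mug-shot lookup built on Rekognition’s SearchFacesByImage call; the collection name, similarity threshold, and helper function are hypothetical, and the county’s actual integration may differ.

```python
# Hypothetical sketch: searching a pre-indexed Rekognition face collection.
# Assumes AWS credentials are configured and a collection of mug shots
# (here named "mugshots") has already been created and indexed.
import boto3

rekognition = boto3.client("rekognition")

def search_mugshots(image_path, collection_id="mugshots", threshold=80.0):
    """Return (external_image_id, similarity) pairs for faces in the
    collection that resemble the face in the probe photo."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = rekognition.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=threshold,  # minimum similarity score to return
        MaxFaces=10,                   # cap on candidate matches returned
    )
    # Each match carries a similarity score and whatever ID was attached
    # to the face when it was indexed (e.g., a booking number).
    return [
        (match["Face"].get("ExternalImageId"), match["Similarity"])
        for match in response["FaceMatches"]
    ]
```

In a setup like this, the sensitive part is less the search call itself than the indexing step: the mug shots must first be uploaded and enrolled in the cloud-hosted collection, which is the arrangement county employees reportedly flagged as a security concern.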

There’s no mention of big data in the US Constitution. It doesn’t provide much protection against facial recognition either, says Jane Bambauer, a law professor at the University of Arizona. Surveillance technologies like wiretaps are covered by the Fourth Amendment’s protections against search and seizure, but most police interest in facial recognition lies in applying it to imagery gathered lawfully in public, or to mug shots.

State laws don’t generally have much to say about police use of facial recognition, either. Illinois and Texas are unusual in having biometric privacy laws that can require companies to obtain permission before collecting and sharing data such as fingerprints and facial data, but they make exceptions for law enforcement. Lynch of the EFF says hearings by the House Oversight Committee last year showed some bipartisan interest in setting limits on law enforcement use of the technology, but the energy dissipated after committee chair Jason Chaffetz resigned last May.

Nicole Ozer, technology and civil liberties director at the ACLU of Northern California, says that for now, the best hope for regulating facial recognition is pressuring companies like Amazon, police departments, and local communities to set their own limits on use of the technology. “The law moves slowly, but a lot needs to happen here now that this dangerous surveillance is being rolled out,” she says. She thinks Amazon should stop providing the technology to law enforcement altogether. Police departments should set firm rules in consultation with their communities, she says. In a statement, Amazon said all its customers are bound by terms requiring them to comply with the law and “be responsible.” The company does not have separate terms of service for law enforcement customers.

Some cities have moved to limit the use of surveillance. Berkeley, California, recently approved an ordinance requiring certain transparency and consultation steps when procuring or using surveillance technology, including facial recognition. The neighboring city of Oakland recently passed its own law to place oversight on local use of surveillance technology.

Washington County has drawn up guidelines for its use of facial recognition, which the department provided to WIRED. They include a requirement that officers obtain a person’s permission before taking a photo to check their identity, and that officers receive training on appropriate use of the technology before getting access to it. The guidelines also state that facial recognition may be used as an investigative tool on “suspects caught on camera.” Jeff Talbot, the deputy spokesperson for the Washington County Sheriff’s Office, said the department is not using the system for “public surveillance, mass surveillance, or for real-time surveillance.”

Ozer and others would like to see more detailed rules and disclosures. They’re worried about evidence that facial-recognition and analysis algorithms are less accurate for nonwhite faces, and have at times proved wildly inaccurate in law enforcement deployments. The FBI disclosed in 2017 that its chosen facial-recognition system had only an 85 percent chance of identifying a person within its 50 best guesses from a larger database. A system tested by South Wales Police in the UK during a soccer match last year was only 8 percent accurate.

Lynch of the EFF says she believes police departments should disclose accuracy figures for their facial-recognition systems, including how they perform on different ethnic groups. She also says that, despite the technology’s largely unexamined adoption by local police forces, there’s reason to believe today’s free-for-all won’t last.

Consider the Stingray devices that many police departments began to quietly use to collect data from cell phones. Following pressure from citizens, civic groups, and judges, the Department of Justice and many local departments changed their policies. Some states, such as California, passed laws to protect location information. Lynch believes there could soon be a similar pushback on facial recognition. “I think there is hope,” she says.

Louise Matsakis contributed to this article.

