Police body cameras are widely seen as a way to improve law enforcement’s transparency with the public. But when mixed with police use of facial-recognition tools, the prospect of continual surveillance comes with big risks to privacy.
Facial-recognition technology combined with police body cameras could “redefine the nature of public spaces,” Alvaro Bedoya, executive director of the Georgetown Law Center on Privacy & Technology, told the US House Oversight Committee at a hearing on March 22. It’s not a distant reality, and it threatens civil liberties, he warned.
Technologists already have tools, and are developing more, that allow police to recognize people in real time. Of the 38 manufacturers making 66 different body-camera products, at least nine already offer facial-recognition capabilities or have made accommodations to build them in, according to a 2016 Johns Hopkins University report on the body-worn-camera market, prepared for the US Department of Justice.
Rather than reviewing footage after the fact, officers with cameras and this technology can use algorithms to scan people as they pass and assess who they are, where they’ve been, and whether they are wanted for anything from murder to a traffic ticket. This, say legal experts, puts everyone—even law-abiding citizens—under perpetual surveillance and suspicion.
A 2016 report from the Georgetown Law Center on Privacy & Technology notes the free speech and privacy concerns this raises, and warns that citizens will become unwitting participants in an unending police procedure. From the report:
There is a knock on your door. It’s the police. There was a robbery in your neighborhood. They have a suspect in custody and an eyewitness. But they need your help: Will you come down to the station to stand in the line-up? Most people would probably answer “no.”
The researchers note that 16 US states already let the FBI use face-recognition technology to compare suspected criminals to their driver’s license or other ID photos, creating an algorithmically determined virtual lineup of residents. And state and local police departments are building their own face-recognition systems, too.
“Face recognition is a powerful technology that requires strict oversight. But those controls by and large don’t exist today,” said Clare Garvie, one of the report’s authors. “With only a few exceptions, there are no laws governing police use of the technology, no standards ensuring its accuracy, and no systems checking for bias. It’s a wild west.”
The interest in this technology extends internationally. NTechLab, which is located in Cyprus and in Russia, and which claims to make the world’s most accurate facial-recognition technology, has pilot projects in 20 countries, including the US, China, and Turkey. The company says it uses machine learning to “build software that makes the world a safer and more comfortable place.”