
The use of facial recognition by the Delhi police could make systemic biases against Muslims worse

A worker installs a security camera in front of the historic Red Fort on the eve of India's Independence Day celebrations in Delhi. (Reuters/Anindito Mukherjee)
All eyes on you.

By Ananya Bhattacharya

Tech reporter


The use of advanced technology by the police in India’s capital city may do more harm than good.

The Delhi police have been equipping their control room vehicles with facial recognition systems, and have already made at least 42 arrests with the help of this technology. However, recent research has shown that this technology could make Muslims more likely to be targeted. A key factor that makes the religious group vulnerable is the uneven distribution of police stations across Delhi.

The research was conducted by independent think tank Vidhi Centre for Legal Policy and is based on the populations covered by various police station jurisdictions in Delhi.

Muslim-dominated areas in Delhi have more police stations than others, the research revealed. Police stations are spread unevenly across the city, with central Delhi and old Delhi being the most heavily policed. Barring areas with low civilian populations—the places where important government or diplomatic buildings are located—nearly half of the police station jurisdictions in these localities have a significant Muslim presence. The study defines a "significant" Muslim population as one exceeding Delhi's city-wide average share of 12.86%.

“There are a few biases inherent in policing, including that policing disproportionately targets some groups of people. Such a bias creates a skewed spatial distribution of policing, which can intensify the disproportionate targeting,” Jai Vipra, senior resident fellow at Centre for Applied Law and Technology Research at Vidhi, noted. “This bias remains when new technology is applied to such a system. The victims of the shortcomings of policing technology will more likely be these disproportionately targeted groups.”

[Chart: Police stations spread unevenly in Delhi. Source: Vidhi Centre for Legal Policy]

This skew is especially threatening since facial recognition technology (FRT) is riddled with inherent flaws.

For one, it uses machine learning or other techniques to match or identify faces against a "training database," a compilation of large troves of face images. "If the training database of FRT has an over-representation of certain types of faces, the technology tends to be better at identifying such faces," Vipra wrote. "Even if it does not have a training bias, the technology is rarely completely accurate and can easily misidentify faces."

Additionally, the bias can be exacerbated by the technologists designing the systems, who may not think to correct for certain errors or may mislabel certain images.
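The over-representation problem Vipra describes can be illustrated with a toy simulation. The sketch below is entirely synthetic and hypothetical—identities are points on a number line, "face captures" are those points plus random noise, and matching is nearest-neighbour lookup—but it shows the core mechanism: a group with more reference images in the database gets identified more reliably than a group with fewer.

```python
import random

random.seed(42)

SIGMA = 1.0     # noise added to every synthetic "face capture"
SPACING = 2.0   # distance between identity templates on the number line
N_IDS = 10      # identities per group
TRIALS = 4000   # probe captures per group

def make_gallery(offset, samples_per_id):
    """Build a reference database: each identity contributes
    `samples_per_id` noisy captures of its template."""
    gallery = []
    for i in range(N_IDS):
        template = offset + i * SPACING
        for _ in range(samples_per_id):
            gallery.append((i, template + random.gauss(0, SIGMA)))
    return gallery

def accuracy(gallery, offset):
    """Identify noisy probes by nearest gallery capture; return hit rate."""
    hits = 0
    for _ in range(TRIALS):
        true_id = random.randrange(N_IDS)
        probe = offset + true_id * SPACING + random.gauss(0, SIGMA)
        pred_id = min(gallery, key=lambda g: abs(g[1] - probe))[0]
        hits += (pred_id == true_id)
    return hits / TRIALS

# Group A is over-represented in the database (10 captures per identity);
# group B is under-represented (1 capture per identity).  The large offset
# for group B just keeps the two groups from matching against each other.
gallery_a = make_gallery(0.0, samples_per_id=10)
gallery_b = make_gallery(1000.0, samples_per_id=1)

acc_a = accuracy(gallery_a, 0.0)
acc_b = accuracy(gallery_b, 1000.0)
print(f"accuracy, over-represented group A:  {acc_a:.2f}")
print(f"accuracy, under-represented group B: {acc_b:.2f}")
```

Real face-recognition systems operate on high-dimensional embeddings rather than points on a line, but the dynamic is the same: more reference data per group means the nearest match is more often the right person, so error rates concentrate on whoever the database under-represents.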

Facial recognition and CCTVs

The researchers aren't claiming that the placement of police stations in Delhi is intentionally designed to over-police Muslim areas. However, "given the fact that Muslims are represented more than the city average in the over-policed areas, and recognising historical systemic biases in policing Muslim communities in India in general and in Delhi in particular, we can reasonably state that any technological intervention that intensifies policing in Delhi will also aggravate this bias," Vipra noted.

There’s already some proof of this. In the aftermath of the February 2020 Delhi riots, the police force used and misused this tech.

“After a pogrom ravaged North-East Delhi in February 2020, a further victimisation of Muslims followed, with arbitrary arrests and police harassment,” Vipra wrote. “Delhi Police claimed that 137 of the 1800 arrests connected to this violence were made using FRT. They also claimed that ‘the accused were arrested mainly on the basis of CCTV footage and open-source videos,’ and that identification took place using FRT.”

The think tank tried to acquire CCTV placement data to look for additional signs of bias, since areas with more cameras are likely to be over-surveilled and over-policed, too. While it found CCTV distribution was uneven, it could draw no firmer conclusions: the data was limited, and many CCTV cameras are defunct, distorting whatever data was available.

[Chart: CCTVs in Delhi. Source: Vidhi Centre for Legal Policy]

India against facial recognition

The wider concern is that the divide created by tech-enabled policing will occur not just across religious lines but also on the basis of caste, homelessness, and sex work, among other factors.

And it's not just the national capital that's adopting the worrisome tech with little due diligence. Police forces in several cities across the country, including Delhi, Hyderabad, Lucknow, Mumbai, Coimbatore, and Patiala, have experimented with it but remain tight-lipped about how. When Vidhi filed Right to Information applications with some of these police departments, asking for data on CCTV cameras and, separately, about the procedure used to implement FRT systems, the organisation received either evasive replies or none at all.

“It is also not the case that an equal and unbiased deployment of FRT by the police will necessarily benefit the public,” Vipra wrote. “The use of FRT in policing can impact privacy and liberty of people independently of bias as well.” Critics have raised concerns over privacy, calling the use of FRT an “act of mass surveillance.”
