While facial recognition software can clearly add some value to policing, given the well-documented concerns regarding privacy and accuracy, governments and private firms must tread cautiously (Photo: Indian Express)

Don’t smile, you are being monitored

  • The street protester now has to contend with facial recognition systems. What can these systems do, and why should you care?
  • While facial recognition software can clearly add some value to policing, given the well-documented concerns regarding privacy and accuracy, governments and private firms must tread cautiously

New Delhi: On the morning of 22 December 2019, Delhi was preparing for an important speech by Prime Minister Narendra Modi. By end-December, several parts of India were in the grip of intense protests against newly codified changes to the citizenship law, and Modi was expected to directly address the fears that were being expressed by the people who were out on the streets.

The Delhi Police, however, was busy ensuring “habitual protesters" wouldn’t get into the venue. The tool of choice: facial recognition (FR). Everyone who walked in through the metal detectors that day was screened against a FR database. Unofficial estimates put the Delhi Police’s growing database of “rabble-rousers and miscreants" in the hundreds.

If the Delhi Police begins to tap into public databases of photos, then anything is fair game. For instance, that picture you posted on Facebook with the settings set to “public" is available to the police to track you.

Many aspects of the Delhi Police’s FR technology are still not publicly known. But Tarun Wig, co-founder of Innefu Labs, the Delhi-based security and research firm that supplied the software, confirmed that the system can use both video and static images as input to check a person’s credentials against a database.

The database of photos that the software has access to depends entirely on the user, he said. Wig claims he doesn’t know what powers the Delhi Police’s background database, but says that since the software runs on an artificial intelligence platform, it can improve over time. That is, the more faces it sees, the better it becomes at recognizing and matching them. Both Mandeep Singh Randhawa, public relations officer (PRO) at Delhi Police, and Anil Mittal, deputy PRO, declined to comment.
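Neither Innefu nor the Delhi Police has described the matching algorithm. But a typical modern FR pipeline converts every face, whether from a video frame or a static image, into a numeric “embedding" and then compares a probe face against all enrolled embeddings. The minimal sketch below illustrates only that comparison step; the tiny three-number embeddings, the names and the 0.8 threshold are illustrative assumptions, not details of Innefu’s software.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors (1.0 = identical direction).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, database, threshold=0.8):
    # Compare a probe embedding against every enrolled identity and return
    # the best-scoring identity, or None if nothing clears the threshold.
    best_id, best_score = None, threshold
    for identity, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Toy 3-dimensional embeddings for illustration; real systems use hundreds of
# dimensions produced by a neural network, and "improving over time" means
# retraining that network on more faces.
enrolled = {"person_A": [0.9, 0.1, 0.1], "person_B": [0.1, 0.9, 0.2]}
print(best_match([0.88, 0.12, 0.11], enrolled))  # prints person_A
```

The threshold is the policy lever: set it low and more innocent people are falsely flagged; set it high and more genuine matches are missed.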

But in an age of mass public protest where cameras have become cheap, anxiety runs deep. For example, when the National Crime Records Bureau (NCRB) issued a tender in mid-2019, proposing to build a “national-level searchable platform for facial images", advocacy body Internet Freedom Foundation immediately shot off a legal notice.

NCRB now insists that there is no proposal to link the automated FR software system to an existing public database, such as Aadhaar. But in the absence of an explicit privacy or data protection law—which remains stuck in Parliament—trust is hard to come by.

Most privacy activists and lawyers point to the Orwellian FR nightmare already unfolding in China, where an infamous social credit system is taking shape. It seeks to assign scores to citizens in order to determine their trustworthiness in the eyes of the government.

While aspects of the social credit system use existing mechanisms like bank transactions, it is also closely tied into China’s mass surveillance system that uses FR. In 2017, the Chinese government demonstrated to the BBC how it could track one of its reporters in seven minutes by using FR and surveillance cameras.

Whether India goes that way or not, the technology is out there. Many private Indian firms have also begun to tap into its potential. Cafe chain Chaayos, for example, recently introduced a face-based tool to track “loyal customers".

The rapid growth in FR technology is thus inevitably going to result in some hard questions: What is the balance between targeting criminals and a mass-surveillance regime? How much facial data should governments and private firms have access to? And how does an ordinary citizen exercise choice and “opt out"?

Targeted surveillance

While the Delhi Police’s use of FR software has caused alarm and raised questions about mass surveillance, FR has been used for policing in India for at least a couple of years.

Three years ago, the Punjab Police won the FICCI Smart Policing award for an app called Punjab Artificial Intelligence System (PAIS). The app helped the police digitize its records and analyse a crime (or the possibility of one) on the go, using FR and other emerging technologies. It was the first such system in Indian policing.

When a criminal gets sentenced and then jailed, the police feeds particulars such as personal information, crime details and known associates into PAIS. It also takes photos of the criminal’s face from the front, left and right. Once the information is in the database, the software can then identify the criminal with more than 99% accuracy.
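Based on that description, each PAIS entry could be modelled roughly as below. The field names and the three-angle check are sketched from this article; the actual PAIS schema has not been made public.

```python
from dataclasses import dataclass, field

@dataclass
class OffenderRecord:
    """One digitized record, shaped after the particulars described for PAIS.

    The field names here are illustrative assumptions, not the real schema.
    """
    name: str
    crime_details: str
    known_associates: list
    # Enrollment photos keyed by angle: "front", "left", "right".
    photos: dict = field(default_factory=dict)

    def is_enrolled(self):
        # A record is searchable only once all three face angles are on file.
        return {"front", "left", "right"} <= set(self.photos)

record = OffenderRecord("name withheld", "theft", [])
record.photos = {"front": b"...", "left": b"...", "right": b"..."}
```

Enrolling multiple angles is a common way to make matching more robust to head pose, which matters when the probe photo is taken in the field rather than in a studio.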

“Until very recently, there was no digital database for criminals," said Nilabh Kishore, who was the Inspector General (IG) at Punjab Police at the time. “We used to maintain files for crimes and criminals." The manual system dates back almost to the British era, when crime was mostly localized. But that system of manual filing had become obsolete as criminals began to frequently cross state borders, and several new types of crimes also emerged.

The fact that crime and criminals have evolved is why NCRB first made a push to develop a national Crime and Criminal Tracking Network and Systems (CCTNS) a decade ago. However, CCTNS is a record of only the criminal’s name and the crime. It also works only on desktop computers and on closed networks inside police stations. CCTNS doesn’t allow a police officer to check a person’s criminal profile on the go. With PAIS, a police officer in the field can simply take out a mobile phone, open the app and take a photo of a suspect. The officer can also use other search parameters, and the latest update even incorporates fingerprints.

If a person has a past criminal record, the app flags it in a matter of seconds. According to Kishore, nearly 90% of criminals are repeat offenders. But the critical element of PAIS is that the database is restricted to convicted offenders.

“We specifically made our software in a way that it will target criminals, not citizens," said Atul Rai, chief executive and co-founder of Staqu Technologies, the company behind PAIS. He said it is a targeted surveillance system that cannot be linked to citizen databases like Aadhaar. It works only with data the police digitizes and creates in-house.

When Staqu’s app is used to click a person’s photo, it doesn’t enter any database unless it matches an existing criminal profile. Kishore said that the photos a police officer takes are temporary on the app and enter a database only if the person has a criminal record.
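The retention rule Kishore and Rai describe (keep a field photo only if it matches an enrolled offender, discard it otherwise) can be sketched as follows. The `similarity` callback and the database shape are illustrative assumptions, not Staqu’s actual code.

```python
def screen_field_photo(probe, offender_db, similarity, threshold=0.8):
    """Screen a photo taken in the field against the convicted-offender database.

    Returns the matching offender record if the best similarity score clears
    the threshold; otherwise returns None, and the caller discards the probe
    so it never enters any database (targeted rather than mass surveillance).
    """
    best_record, best_score = None, 0.0
    for record in offender_db:
        score = similarity(probe, record["embedding"])
        if score > best_score:
            best_record, best_score = record, score
    return best_record if best_score >= threshold else None
```

The privacy guarantee lives entirely in the caller’s promise to discard unmatched probes, which is exactly why critics want that promise written into law rather than left to software vendors.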

Mass surveillance

“I’m not sure about what Delhi Police does," Staqu’s Rai said, adding, “But I’ve read in the newspapers that they’re doing face recognition in crowds as well. If they are, then they’re connecting it (FR) to citizen-centric records."

An FR system connected to a common citizen database such as Aadhaar or voter records is an example of mass surveillance. Such a system essentially allows the government or law enforcement to conduct surveillance on any and all citizens, with or without probable cause.

“The Chinese are supposed to be resorting to this sort of technology heavily where they’re looking for a fugitive and the man can be identified from a crowd," said N. Ramachandran, president and CEO of think tank Indian Police Foundation. “I don’t think we have reached there. But the frontiers of technology do lie in these applications, and like all good applications, they can be misused."

Indian citizens currently have little or no defence against such misuse because there is almost no regulation to guide the use of FR. “This (lack of any regulation) is what the police seem to be taking advantage of," said N.S. Nappinai, cyber law expert and a Supreme Court lawyer. According to Nappinai, even targeted systems that only screen against other databases are impermissible. “You are then, technically, treating all the citizenry as criminals unless proven otherwise," she added.

Furthermore, the facial tech adoption that is fast spreading across many state police units may be in violation of the Supreme Court’s right to privacy judgement. “The (right to privacy) judgement very clearly says that you need laws to allow the state to do it. It is not a matter of whether there are laws against it or not," said Nappinai.

Former IG Kishore said that the FR software they put in place cannot be used as proof in a court of law. He called FR an investigative tool that has to be combined with existing methods of investigation to build the whole case; on its own, it cannot establish guilt.

Checks and balances

But even if a few checks and balances eventually end up governing the use of FR, the debate is only going to become more and more contentious as the technology improves. Pam Dixon, founder and executive director of the World Privacy Forum, said, “We may be at the cusp of some ‘very problematic’ forms of identity theft. Your face may only unlock your phone right now, but with public biometric FR systems, it can be used for authentication for ration cards and voter records eventually."

According to reports, Indian Railways—one of the largest transporters in the world—is soon going to install FR systems to “fight crime".

“If the system is used broadly across the country (such as on the railway network), it means that there are a lot more uses for that data (for hackers and cybercriminals)," said Dixon.

Fraud and breaches aside, FR software has been found to be affected by demographics as well. An FR vendor test by the National Institute of Standards and Technology (NIST) in December 2019 found that “across demographics, false positive rates often vary by factors of 10 to beyond 100 times".

NIST also noted that false positives “are highest" in West/East African and East Asian people and lowest in Eastern European individuals. It also found that false negatives are “higher in Asian and American Indian individuals, with error rates above those in white and African American faces". Simply put, FR systems carry significant built-in racial bias.

Staqu’s and Innefu’s software used faces from various parts of India for training, but there is no one-size-fits-all when it comes to FR. While FR software can clearly add some value to policing, given the well-documented concerns regarding privacy and accuracy, governments and private firms must tread cautiously.

And the debate is not restricted to India. Several cities in the US, including San Francisco and Oakland in California, have banned or placed moratoriums on the use of FR software in public spaces. The European Union is also considering a five-year ban on FR, buying itself time to understand how to regulate such systems.

The United Nations special rapporteur on freedom of opinion and expression, David Kaye, called for an immediate moratorium “on the sale, transfer and use of surveillance technology until human rights-compliant regulatory frameworks are in place" in his June 2019 report to the UN Human Rights Council on the surveillance industry.

Law enforcement bodies in India, though, have largely ignored these concerns as a concerted push towards implementing FR systems got under way a couple of years ago. Interestingly, the proposed Personal Data Protection Bill, 2019, classifies face data as sensitive and explicitly states that “consent of data principal in respect of the processing of any sensitive personal data shall be explicitly obtained". When the government scans a citizen’s face, doesn’t that citizen become a data principal whose consent must be obtained?

The answer to that question and the legal complexities it throws up may determine whether FR systems show up more frequently at political rallies and public protests.
