By: Isheeta Sharma

The advances in digital technology have significantly redefined human dynamics. Not only have human-to-human interactions changed rapidly through social media and long-distance communication, but the relationship between nation-states and their citizens has also seen a major shift. Our physical bodies have become literal embodiments of our legal and political identities. Facial Recognition Technology (FRT) is one such form, using our bodies, our faces to be exact, as our identity cards. Nation-states across the world are rapidly deploying FRT in the public domain. However, its largely unregulated use continues to impinge on the right to privacy and remains a major human rights concern everywhere.

Facial Recognition Technology uses images, videos or real-time visuals to identify human beings. Everyday examples of FRT include photo tagging on social media platforms and unlocking our smartphones with our faces. However, FRT has been used for much more than this. In India, the National Crime Records Bureau (NCRB) is building an FRT system to identify and verify criminals for policing purposes, and similar systems have been rolled out by nation-states across the world. According to the AI Global Surveillance Index by the Carnegie Endowment for International Peace, 75 out of 176 countries use Artificial Intelligence (AI) technology for surveillance, and FRT is deployed in 64 of them. Notably, the index also shows that 51% of advanced democracies use AI technology for surveillance. This concentrates significant power in the hands of the state, tilting the scales in its favour and resulting in over-policing and social control.
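Under the hood, most FRT systems follow the same basic pipeline: detect a face, convert it into a numerical 'faceprint' (an embedding), and compare that embedding against stored faces. The sketch below illustrates this pattern using the open-source face_recognition Python library; the file names and the 0.6 distance threshold are illustrative assumptions, not details of any system mentioned in this article.

```python
# A minimal sketch of one-to-one face verification, assuming the open-source
# face_recognition library (pip install face_recognition). The file names
# and the 0.6 threshold are hypothetical, for illustration only.
import face_recognition

# Load a reference photo and a candidate photo (hypothetical files).
known_image = face_recognition.load_image_file("reference_photo.jpg")
probe_image = face_recognition.load_image_file("camera_frame.jpg")

# Encode each detected face as a 128-dimensional embedding ("faceprint").
known_encodings = face_recognition.face_encodings(known_image)
probe_encodings = face_recognition.face_encodings(probe_image)

if known_encodings and probe_encodings:
    # Compare embeddings by Euclidean distance; lower means more similar.
    distance = face_recognition.face_distance(
        [known_encodings[0]], probe_encodings[0]
    )[0]
    # 0.6 is the library's default matching threshold; real deployments
    # tune it, and that choice drives false-match and false-non-match rates.
    print(f"distance={distance:.3f}, match={distance <= 0.6}")
else:
    print("No face detected in one of the images.")
```

The same encode-and-compare step, run continuously over live camera feeds against large galleries of faces, is what turns a phone-unlock convenience into an instrument of mass surveillance.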

In her paper Adoption and Regulation of Facial Recognition Technologies in India: Why and Why Not?, Smriti Parsheera examines the key concerns around FRT being deployed by government agencies in India for surveillance purposes. FRT is being used in India without any law that protects citizens' privacy or delineates how the technology should be used ethically. The Information Technology Act, 2000 classifies biometric data as 'sensitive personal data', but it limits this classification to 'body corporates', leaving all government agencies outside its purview. A look at how FRT is being used for surveillance in India and other countries makes evident the gross abuse that follows from the lack of a proper, ethical law.

In Myanmar, the 'safe city' initiative has served as a guise for installing 335 surveillance cameras that scan faces and license plates. This technology has been used to curb protests against the military government and to amplify the state-led crackdown on voices of dissent. During the Hong Kong protests of 2019, protesters cut the wires of street lampposts and covered their faces to evade the FRT deployed by the state to track them down. FRT was used during the Black Lives Matter movement in the USA, prompting Amnesty International's Ban the Scan initiative. In India, FRT has been used for Aadhaar verification, is employed for attendance tracking, and operates at airports for security and check-in, not to mention its voluntary use by consumers in online applications and smart devices. It was also deployed to track 'miscreants' during the Farmers' Protest of 2020-21. Internet Freedom Foundation's Project Panoptic tracks FRT deployments in India; its data shows that 58 FRT systems are currently installed in the country.

Some of the primary concerns around the use of FRT are its impact on the Right to Privacy, the lack of any regulatory law or body, its questionable accuracy, and its use for racial profiling and abuse, among others. Recently, the Lucknow police announced that they would use FRT to track women's faces in public spaces and study their emotions for signs of distress in order to protect them. However, the 2018 AI Now Report shows that the science of reading emotions from faces is faulty. This reading of micro-emotions harks back to physiognomy, which is now considered pseudoscience and was associated with Nazi race science. The technology attempts to fix and bucket emotions and their facial expressions universally, without accounting for cultural, social, and contextual factors. Perhaps the most pervasive use of FRT is alongside CCTV cameras installed in public spaces, schools, colleges and workplaces, allowing real-time tracking of individuals. The Internet Freedom Foundation (IFF) found that CCTV cameras with FRT are being used in Delhi schools, putting underage students at risk and violating their sense of privacy and consent.

"Technologies are unstable things," wrote Brian Larkin in his book Signal and Noise. While FRT has the potential to be used productively, such as in searching for missing children (as it has been before), its current trajectory points towards a bleak future. With little public knowledge and limited public discussion of the state's FRT policies, this is a human rights violation unknown to most. Now is the time to question FRT's use as a state surveillance tool and to regulate its impact on everyday life. We need not shun technology in our public and private lives, but we must be aware of how deeply the technology (and the state along with it) penetrates them.

Isheeta is a features writer and a student of Gender Studies at Ambedkar University, Delhi. She enjoys dissecting popular culture through a gendered lens, adding new books to her overflowing book rack, and sipping coffee in quiet corners. She is presently interning with kractivist.org.