Posted by Anushka Jain
A few months ago, the Lucknow Police unveiled a plan to use Artificial Intelligence (AI) enabled cameras equipped with Facial Recognition Technology (FRT) that would generate alerts based on expressions of “distress” on a woman’s face even before she reports an incident. In reality, such surveillance perpetuates existing patriarchal norms by casting the woman as a helpless damsel in “distress” to be protected by a male authority figure.
As we reimagine the modern Indian city, the Smart City initiative, launched in 2015, stands as a parable not only of progress but also of surveillance and the erosion of privacy. It symbolises a broader trend of police use of digital tools that goes beyond ensuring legality and risks perpetuating architectures of inequality and subjugation. At its root is the unaccountable growth of facial recognition technologies.
The Smart City initiative, a Centrally Sponsored Scheme of the Central Government, covers 100 Indian cities and aims to harness technology to improve the quality of life in them. A “core infrastructure element” of this initiative is the “safety and security of citizens, particularly women”. The initiative has a proposed budget of Rs. 48,000 crores over five years, i.e., on average roughly Rs. 100 crore per city per year. One area in which utilisation of these funds has been consistent across cities is investment in building a robust surveillance infrastructure.
In addition, the “Safe City” initiative of the Ministry of Home Affairs (MHA) is being undertaken under the Nirbhaya Fund in eight Indian cities. It aims to “create a safe, secure and empowering environment for women in public places, to enable them to pursue all opportunities without the threat of gender-based violence and/or harassment” and has an estimated budget of Rs. 2,919.55 crores.
Both initiatives have contributed to increased spending on technology-based measures to combat crime against women, such as extensive CCTV and FRT systems. According to research undertaken over the past year by the Project Panoptic tracker, which maps FRT projects across India, more than 22 State Police departments have acquired FRT systems. Let us examine these systems more closely and test the reasons for their proliferation against objective, evidence-led reasoning.
Are technology measures effective in curbing crime?
According to the 2019 annual report of the National Crime Records Bureau (NCRB), crime against women increased by 7.3% compared to 2018. Indeed, crime against women in India has been on an upward trend for the past decade, with 2,28,650 incidents reported in 2011 as compared to 4,05,861 in 2019. It is important to note here that while incidents of rape and sexual assault garner the most public outrage and media attention, it is domestic violence that accounts for the largest share, about 30% of all crimes against women.
These numbers have continued to climb even as investment in surveillance infrastructure has grown. It would then be correct to surmise that the government is not only failing to adequately address the underlying problem but is also creating new obstacles for women by putting in place surveillance regimes that perpetuate existing patriarchal norms.
Constant surveillance has its roots in patriarchy
The excessive investment in surveillance technology such as facial recognition confirms that increasing reliance is being placed on it to solve security issues, specifically those related to women’s safety in public spaces. The resultant constant surveillance of women not only violates their privacy but also perpetuates existing patriarchal controls that limit women’s choices. Surveillance manifests itself as a protectionist police authority figure who will respond in real time to any distress captured by CCTV cameras. In a country where only 7.28% of the police force is female, this would invariably result in men, or flawed technology, determining how a woman should exist in a public space. Surveillance measures such as facial recognition would also allow these actors to intrude excessively into the privacy and autonomy of women, and would lead to a loss of the anonymity in public spaces that most women in our country need in order to fully exercise their freedoms.
Culturally, women in India are constantly surveilled at home, at their places of education and hostels, and at their workplaces. This surveillance usually serves to ensure that a woman behaves in compliance with societal norms. Women are constantly told not to dress a certain way, not to talk loudly, and not to express their opinions emphatically. Surveillance of women in public spaces is thus just another step by society towards maintaining complete control over women’s actions.
Facial Recognition Technology is not the solution
“Tech solutionism” is the term for the inherently flawed idea that every social problem can be solved through technology, on the assumption that technology is free of bias and inaccuracy. This is specifically the case for facial recognition, which has been hailed as the ultimate solution to crime across the world and in India. India, through a procurement led by the NCRB, is currently on course to build the world’s largest FRT system, which would be accessible to every Indian intelligence and security agency.
A 100% accurate FRT system has not yet been developed. Its use in police investigations could therefore result in the apprehension of an innocent person through a false match. The deeper danger of FRT, however, lies in its potential to violate the rights to privacy and to freedom of speech and expression, and to compound human rights violations by amplifying systemic bias against marginalised communities. Police use of the technology has been banned in various cities across the USA, and companies like Amazon, IBM and Microsoft have pledged not to sell their FRT systems to police in the USA.
Are we adopting a culture of surveillance through facial recognition?
Surveillance in India has been increasing, not just of our physical bodies but also of our thoughts and opinions. The MHA’s National Cyber Crime Volunteer Programme encourages citizens to register themselves as “unlawful content flaggers” and report online content they feel fulfils the programme’s vague criteria of unlawfulness. This shifts surveillance from a vertical structure, in which the State surveils the citizen, to a horizontal one, in which peers surveil one another. It is reminiscent of 1950s East Germany, where the Stasi, the state security agency, used a network of 2 lakh informants to spy on friends, colleagues and relatives.
This is in stark contrast to the ideal of fraternity enshrined in the Indian Constitution. Fraternity, according to Ambedkar, was “a sense of common brotherhood of all Indians”, a brotherhood of shared goals and aspirations that provides unity and solidarity to social life. Constant surveillance, whether carried out by peers or by faulty facial recognition technology, hampers this solidarity by casting a suspicion of criminality over a majority of the population.
Tenders for acquiring FRT are being released without any parliamentary or judicial oversight of how the technology may be used, and in the absence of a wider data protection regime in India. The Personal Data Protection Bill, 2019 fails to solve these problems: it provides wide exemptions to government authorities and misses the opportunity to put in place much-needed provisions on state surveillance. Provisions on surveillance were included in the private member’s bills on privacy filed by Dr. Shashi Tharoor and Dr. Ravikumar; however, it is unlikely that these bills will be passed into law. With a dismal outlook on surveillance reform and continued facial recognition deployments, technology will entrench patriarchy rather than dismantle it.
Anushka Jain works as the Associate Counsel (Transparency & Right to Information) at the Internet Freedom Foundation, an Indian digital liberties organization. Her work focuses on issues of privacy and surveillance surrounding disruptive technologies like facial recognition, artificial intelligence, and machine learning, as well as the need to ensure transparency and accountability from the government on its use of these technologies.
Featured Image Source: Shruthi Venkataraman via the ‘Gendering Surveillance’ website