I wanted to start this article by talking about how the Orwellian dystopia of state surveillance has finally come knocking at our doors – but who am I kidding, that ship sailed a long time ago.

The infamous UP Police (yes, the very same force that denied rape in the case of the Hathras gangrape victim) has come up with a rather ingenious way to ensure the safety of women in the state capital, Lucknow. Speaking at a press briefing under Mission Shakti, an initiative started by the UP government to ensure women’s safety, the Lucknow police announced that 200 hotspots prone to cases of harassment of women had been identified in the city. These hotspots are to be equipped with five AI-enabled facial recognition CCTV cameras each, designed to recognise signs of distress on a woman’s face and alert the police to a possible crime so they may reach the spot to help.

Allow that to sink in – a camera on the street is going to scan your face for expressions, decide if you’re in trouble and alert the cops to reach you.

Surveillance isn’t a New Word in our Dictionary

The age-old practice of ‘keeping an eye’ on women for their safety is not new in any of our lives. Most women have been policed and have had their freedom curtailed in the name of ‘security’ from the very beginning of the socialisation process. Over the years, even apparently ‘progressive’ households have justified this form of oppression – limiting women’s free movement and agency in the name of safety and security.

One question, however, always arises: for many consecutive years, over 90% of rape cases reported in our country have involved perpetrators known to the survivors – relatives, friends, acquaintances. Why, then, does the narrative of ‘danger on the streets’ get spoken about more than the danger in our homes and at the hands of people we know?

Why Facial Recognition Cameras are not the Answer

Madhya Pradesh Chief Minister Shivraj Singh Chouhan has announced a new system under which a woman leaving home for work will have to register herself at the police station, and the police will then track her movements for her safety. While this in itself is bizarre, the UP government’s proposal of using AI cameras takes things to another level.

The question that begs an answer at the moment is whether AI-based facial recognition cameras are a solution to any problem at all. We spoke to Anushka Jain from Project Panoptic, a project focused on mapping the facial recognition systems already functional in India (there are over 30), about the issue. She explained:

“The issue with facial recognition cameras is twofold. Firstly, they’re not accurate, so they generate false positives and false negatives. There is no 100% accurate facial recognition system. A false positive means someone is identified as someone they’re not, and a false negative means the system fails to recognise a person as themselves. The second issue is that these already-inaccurate technologies are being used for security and surveillance purposes, or for access to government schemes – this could lead to complete profiles of persons being created, with no knowledge of how this data is being used and no clarity on how it affects our privacy and fundamental rights.”
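To see why “not 100% accurate” matters so much at street scale, consider a back-of-the-envelope sketch. The numbers below are entirely hypothetical – not from any real deployment – but they illustrate a well-known statistical problem: when the thing being detected is rare, even a seemingly accurate system produces far more false alarms than real ones.

```python
# Illustrative sketch of false positives vs. false negatives at scale.
# All figures here are hypothetical assumptions, not real deployment data.

def expected_alerts(population, distress_rate, true_positive_rate, false_positive_rate):
    """Expected true and false alerts when a camera scans `population` faces."""
    in_distress = population * distress_rate
    not_in_distress = population - in_distress
    true_alerts = in_distress * true_positive_rate        # real distress, caught
    false_alerts = not_in_distress * false_positive_rate  # no distress, flagged anyway
    return true_alerts, false_alerts

# Suppose 10,000 faces pass a hotspot in a day, 1 in 1,000 is actually in
# distress, and the system is a (generous) 95% accurate in both directions.
true_alerts, false_alerts = expected_alerts(10_000, 0.001, 0.95, 0.05)
print(true_alerts, false_alerts)  # → 9.5 true alerts vs 499.5 false ones
```

Under these assumed numbers, the police would chase roughly fifty false alarms for every genuine one – and a stricter threshold that cuts false alarms would also miss more women who are actually in danger.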

Can the Camera read my Face?

Engineers around the world are on a mission to make AI more advanced with each passing day, and one major part of this intelligence is emotional intelligence – the ability to read and gauge human emotions and take action accordingly. But when the technology behind simple facial recognition is not fully developed or accurate, what can one infer about more advanced emotion mapping?

The belief that we can determine what a person is feeling by studying their expressions is controversial. Various studies across the globe have suggested that this technology is half-baked and can be dangerous. Speaking to The Verge, an online technology magazine, Lisa Feldman Barrett, a professor of psychology at Northeastern University, said, “They can detect a scowl, but that’s not the same thing as detecting anger.”


Also read: What Good Can Facial Surveillance Bring In A Fascist State?

Language here must also be taken into consideration. The UP Police said that the cameras are equipped to identify signs of distress – but distress is a vague term from which one can’t really draw a conclusion. I could be in distress because I’m being stalked by a stranger, or because of excessive flatulence caused by an oily meal, or I could be frowning in distress over a fight on a phone call. Is the police going to be alerted in every case? Must I show my distress through loud facial expressions? What if I freeze up in peak distress? What about my internal distress? Who’s answering these questions?

That this decision by the UP Police is among the many policies actively seeking to impinge upon a woman’s autonomy over her own body and decisions is not news to anyone. However, mass surveillance – where an individual has no information on where their private data is being stored, let alone any say or consent in how it will be used – is a far more glaring problem. Regulating technology is nearly impossible, because tech develops at the speed of light and the law can never catch up. Our data protection laws are not in place, and anyone can get access to any kind of data. We still don’t know who will be monitoring these cameras live, or what guarantee there is that this information will not be misused.

In a state that’s constantly trying to prove to the public that women have no agency, with concepts such as ‘Love Jihad’, is it a surprise that women’s consent is so often ignored? What’s needed, more than these blockades to our already limited freedom, is the understanding that women have autonomy and agency, as well as an active unlearning of years of patriarchal saviour complex.

Also read: State And Surveillance In Urban Spaces For ‘Protecting’ Women In Distress

One can only wait for the day when those in power understand that the solution to stalking can’t be… well, stalking. And while you wait, remember to smile – the UP Police is watching.


Author’s Note: To know more about how facial recognition cameras are currently operating in India, do check out Project Panoptic by the Internet Freedom Foundation. The project is currently mapping all existing facial recognition systems in the country and working to create accountability around them.

Featured Image Source: Telegraph India





About the author(s)

Nishtha is a former student of philosophy and enjoys discussions on ethics. She's currently a video journalist and wants to make films some day. When not working Nishtha can be found hoarding stationery, listening to ghazals and playing with her dog.
