They like to watch
By Invitation: Malavika Jayaram

If the state reads what we read and knows what we think, something essential is at stake: intellectual privacy.

A few weeks ago, the future arrived in Surat. A science fiction future, no less. It became the first Indian city to roll out an advanced automated surveillance system featuring, among other things, facial recognition. Part of the Surat Safe City Project, it was heralded as everything from the new age of policing to the dream of living in a safe place.

Procured from the Japanese company NEC Corporation, the system includes NeoFace Watch for live CCTV surveillance and NeoFace Reveal for forensic investigation. It matches video feeds from CCTV cameras located around the city against the databases it is connected to, "intelligently matching" faces in real time and generating alerts if a "person of interest" is detected. Koichiro Koide, Managing Director, NEC India, described it as "efficient enough to identify an offender even if he has got plastic surgery done".

So far, so exciting. Or is it? In all the media reportage of this event, you would struggle to locate the words “privacy” or “risks” within the techno-utopian hype and breathless excitement about world-class technology coming to our very own Surat. I know, because I tried.

The idea of being watched, recorded and tagged in a system while going about one’s daily business is intensely creepy to some, yet desirable to many others. Framed as a solution to crime and a guarantee of safety, technologies of surveillance depict and promise an ideal society, one which secures the wellbeing of its citizens, 24x7x365. That such systems track ordinary people not under suspicion of anything, and sweep them up in the policing gaze, is either a non-issue, or one that is accepted as a necessary trade-off in our security-obsessed times.

Reports of this launch were replete with glee (bordering on schadenfreude) about the system’s ability to ensure no offender could ever escape. That news coverage tends towards the reproduction of corporate propaganda and dissemination of police press statements is perhaps no surprise. The mainstream narrative is replete with naive expectations about the infallibility of data, the neutrality of algorithms and the perfection of the machine. And the desire for cleansing society of “bad elements”, the bloodlust that comes from seeing alleged transgression punished, and seeing society saved from itself, these speak to a larger compulsion to wipe out difference, discriminate against outliers and punish deviance.

Far from being mitigated, those desires are only fed by the growing hype about Big Data, Smart Cities and the Internet of Things. Their role in policing by numbers and policing by formulae cannot be overstated. Against this backdrop of security theatre and this vision of progress, concerns about privacy and autonomy are made to seem anachronistic and retrograde. In developing countries in particular, technology is seen as much more than just efficient, quotidian or fun; it is cast as a saviour that will transform ground realities, leapfrog stages of development and eradicate societal inequalities. That systems are only as good as the people who design them (encoding their own biases along the way), and that research shows the sheer magnitude of errors (false positives and false negatives when it comes to matches), do not frame the discussion: the possibility of redemption does.

When the private sphere is invaded in tangible ways, such as when physical property is searched and seized, or when Madh Island hotel rooms are raided and couples dragged out by the police, the sense of invasion is immediate. When biometric software on Facebook remembers faces in order to tag friends, when the Fast Track lane at an airport flashes you past after an iris scan, when your smart card for public transport permits cashless bus journeys, they are utilitarian, desirable. That they collate data about habits, movements and behaviours, however invasive, isn’t the sort of thing that gets people to sign petitions or march on the streets.

The Surat system also collects licence plate numbers, at a time when many other countries are protesting against their deployment. Just as with facial recognition, concerns are being raised about the use of surveillance technology to track innocent people, those not accused of any wrongdoing, let alone any crime. Moreover, both types of technology function at a distance, eliminating the physical interface that signals that collection is taking place. Automated facial recognition and licence plate scanning do not create any awareness and consent (explicit or implied) of the fact that your presence and location are being mapped and stored.

This is different from police officers entering your home to search it and seizing contraband that they might locate, or the pressing of your finger on a scanner to enter a top security government building only open to persons with a certain level of clearance. In those cases, the body is complicit in the act of surveillance, not an unknowing (and perhaps unwilling) co-conspirator.

When the state makes spatial incursions into the zone of the private, it is a certain kind of trespass. Incursions at a mental level are subtler, and therefore often invisible. Digital dragnets to flush out unlawful acts, such as child pornography, can proscribe or restrict perfectly legitimate activities such as researching sexual health concerns or viewing every other kind of (legal) adult entertainment. This too is an invasion of one’s autonomy to read the content of one’s choice, in private.

If the state watches what we watch, reads what we read and knows what we think (or thinks it does), something far bigger, far more essential is at stake: intellectual privacy. Whatever our politics about the role of the state in our lives, that it doesn’t include living inside our heads and our bodies must frame our thinking about and relationship with technology.

The author is a Fellow, Berkman Center for Internet & Society at Harvard University, researching privacy, identity, data ethics and the

http://www.mumbaimirror.com/others/sunday-read/They-like-to-watch/articleshow/48497655.cms
