The Conflicting Jurisprudence of Facial Recognition Technology and Privacy in a Democratic Setup

Sanah Javed


Abstract

Advancements in technology have inevitably equipped the government with new surveillance mechanisms, the most prominent being Facial Recognition Technology (FRT). The unfettered expansion of the State's use of FRT in India has raised multiple privacy concerns. FRT operates in line with Jeremy Bentham's panopticon prison; its use is justified by the maximum surveillance it provides with limited resources. However, this leads citizens to impose restrictions on themselves, often to the extent of curbing speech and dissent. The author argues that this surveillance mechanism, which finds its justification in Bentham's panopticon prison design, inevitably clashes with the moral autonomy argument for privacy rights. In a democratic State, it is essential that a balance be found between the privacy and related rights of citizens and the security interests of the State.


 

PRIVACY: IN THE CURRENT CONTEXT

The concept of privacy has developed over the years with changes in societal institutions and, further, with the advent of technology. Charles Fried, in his paper titled 'Privacy', stated that the right to privacy is not limited to information about us in the hands of others, but refers to the control we possess over that information. The role of privacy is defensive, protecting our personal liberty. This is a context-based argument: privacy is essential to protect other related goals. With the rapid advance of behaviour detection technology, this definition seems apt. Nissenbaum has explained that what individuals care about most is not restricting the flow of information but ensuring that the information flows in an appropriate manner and that there exists some degree of accountability towards the person whose information is in question.

Information technology has affected the way in which we perceive our right to privacy. Our conventional understanding of the right to privacy should now include the right to informational privacy. Surveillance is antithetical to informational privacy, as the primary goal of surveillance is to obtain information, often without the consent or knowledge of the subject.

MORAL AUTONOMY AND RELATED RIGHTS TO PRIVACY

The moral autonomy argument in favour of the right to privacy is as follows – an individual must be in control of, and able to shape, his moral identity and to make his moral choices without being subject to criticism or scrutiny by a third party. This is considered particularly important to a liberal conception of the self. One does not wish to be subjected to constant judgement and scrutiny by others in the socio-economic environment, and hence values one's right to privacy as a safeguard of moral autonomy.

With the advancement of technology, new and radical forms of surveillance and power have become possible. The imposition of identities on persons has become convenient for authoritarian powers, leading to a 'Big Brother'-like situation. This makes information protection a desideratum in the current scenario.

Privacy is not a secluded right; it is related to multiple other essential rights such as liberty, autonomy and selfhood. Privacy gains its essence and importance from the functional role it plays in protecting or enabling access to other rights, especially in a democracy – preserving human relations and furthering a free society. However, one must keep in mind that this right is not absolute and must be weighed against other interests present in society, for instance security. Disputes arise when the right balance is not achieved.

BIOMETRIC AND OTHER TECHNOLOGY: ENHANCEMENT OF SURVEILLANCE

Biometric technology assists in the identification of persons using their physical or behavioural characteristics. It may be classified as follows: first-generation biometric technology focuses on the identification of a particular person, addressing the question 'who are you?', whereas second-generation biometric technology, without knowing a person's individual identity, recognises the nature of the person in order to place him/her into a 'which type of person are you?' category.

In the present paper, I address only first-generation biometric technology and its impact on modern democratic governments. The technology enhances the way governments function: screening out criminals and terrorists, combating identity fraud, supplying conclusive evidence to prove a case before a court, and finding missing persons. However, these benefits come at a cost – privacy, autonomy and equal treatment.

Facial recognition is of key relevance in this context. At a rudimentary level, FRT captures the facial image of an individual, identifies the nodal points of the face and then compares them to an existing database. Hence, FRT can be used, with the help of surveillance cameras and existing databases, to identify repeat offenders, terrorists or even underage drinkers.
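The matching step described above can be sketched, at a purely conceptual level, as a nearest-neighbour search over numeric 'templates' derived from nodal-point measurements. The following minimal Python illustration is a hypothetical simplification – the vectors, names and threshold are invented for illustration, and real FRT systems use far richer feature extraction and statistical matching:

```python
import math

# Hypothetical simplification: each enrolled "face" is reduced to a short
# vector of measurements between nodal points (eyes, nose, jawline, etc.).
def euclidean_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, database, threshold=0.5):
    """Return the enrolled identity closest to the probe vector,
    provided the distance falls within the threshold; else None."""
    best_name, best_dist = None, float("inf")
    for name, template in database.items():
        d = euclidean_distance(probe, template)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Illustrative database of enrolled templates (values are made up).
database = {
    "person_a": [0.1, 0.4, 0.9],
    "person_b": [0.8, 0.2, 0.3],
}

print(identify([0.12, 0.41, 0.88], database))  # matches person_a
print(identify([0.5, 0.9, 0.1], database))     # no enrolled match
```

Even this toy sketch makes the privacy concern concrete: once a centralised database of templates exists, any camera feed reduced to the same measurements can be matched against it without the subject's knowledge.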

The fundamental issue with this technology is obvious – constant surveillance by the government of its citizens using centralised databases, capable of curbing individual freedom of expression and dissent. Balancing the requirement of security with the privacy of the individual is an onerous task – one that has not yet been achieved with success. In this model, the government curbs a key element of the democratic setup – individual autonomy and privacy – in order to provide security to the very people whose rights it has curbed.

FRT AND THE ‘PANOPTICON PRISON’ BY BENTHAM

Advanced biometric technology such as FRT creates a system resembling the panopticon prison design: the individual is under the continuous, synoptic view of the government. In the panopticon proposed by Jeremy Bentham, the inmates are under maximum visibility to the prison guards but are unable to determine when the guards are watching; the result is an internalised omniscience, whereby the prisoners discipline themselves because they believe they might be watched. Foucault built upon the panopticon to explain how, under asymmetric surveillance, a subject who is unaware of precisely when he is being watched exercises restraint and self-discipline. There is no need of an external authority, as the subject himself regulates his behaviour, speech and conduct under the unverifiable eyes of the government. FRT plays the role of the panopticon prison: one is not aware when one is under surveillance but assumes that the State is watching and that one is identifiable. Hence, dissent and opposition to the government's mechanisms and policies become sparse. This adversely affects the free-thinking society and individual autonomy that form the basis of a strong democracy. Pro-democracy protesters in Hong Kong were identified through the use of facial recognition technology, with the government adopting the archaic stance of banning face masks. Further, in India, protesters opposing the Citizenship Amendment Act, 2019 and the National Register of Citizens – on the ground that these measures are exclusionary and discriminatory, violating the fabric of the democratic setup – were identified using FRT initially developed and deployed to locate missing children.

CONCLUSION

Privacy is fundamental in a democracy and is interrelated with other essential rights, making it worthy of strong protection. Further, technology in a democratic setup must be used wisely so that the State does not deteriorate into an authoritarian regime.

Indian governmental bodies increasingly use FRT and other behaviour detection technology in day-to-day tasks such as maintaining law and order, criminal identification, etc. The Centre has advocated and is rapidly working towards an Automated Facial Recognition System (AFRS) to facilitate 'criminal identification, verification and dissemination.' The government justifies the legality of this system on the basis of a cabinet approval. First, this is not sufficient to meet the threshold of legality, as its basis lies in a cabinet note of 2009 and not in a statutory enactment. Further, even if it is assumed that the system is permitted by law, it cannot on that sole basis be considered valid. Privacy extends to liberty, dignity and individuality. To give legitimacy to State surveillance that breaches an individual's privacy merely on the ground that it is based in law still endangers the right.

Democratic discourse on the issue is the only way to achieve a better balance of rights between the State and the subject, and this is lacking in the current approach taken by the NCRB. Apart from the consent of the subject exposed to surveillance, a system of checks and balances must be established, holding the State accountable for its surveillance technology and drawing the line where it is not acceptable. Where that balance lies, however, is to be decided by the populace through discourse, time and again, by examining the requirements of society. The introduction of technology for security purposes must be deliberated upon through the mechanism of informed political debate. If FRT is used at the discretion of the governing bodies, the panopticon prison may become a reality, freezing speech and stifling dissent – both of colossal importance to a democratic society such as ours.


Sanah Javed is a 4th Year, BA LLB (Hons.) student at School of Law, Christ University.


 
