The Dialogue Response - FRT

The document provides feedback on a draft discussion paper regarding facial recognition technology in India. It recommends enacting new legislation to specifically regulate facial recognition, as the current legal framework may not adequately address privacy and proportionality issues. It also emphasizes the need to ensure user autonomy and informed consent when using facial recognition, especially for law enforcement purposes. Risks like data breaches, mission creep, and impacts on privacy principles must be properly addressed and mitigated in the legislation. The feedback aims to help roll out facial recognition in a safe and responsible manner that balances national security with individual rights.


Response

Discussion Paper ‘Responsible #AIforAll: Adopting the Framework – A use case approach on Facial Recognition Technology’
Dear Ma’am,

We are pleased to provide inputs on the draft discussion paper on facial recognition
technology (‘FRT’) released for public comments, titled “Responsible AI for All: Adopting
the Framework – A use case approach on Facial Recognition Technology”. We believe the
consultative approach taken by commissioning this report is the optimal way to roll out this
emerging technology. The Dialogue has published several technical policy briefs in this area,
particularly on the need to balance national security with consumer surplus, individual rights
and economic growth.1

With that context in mind, we are pleased to offer brief thematic remarks on the draft below.
Our focus in this response is on risk mitigation in technology policy for FRT. We offer these
inputs against the backdrop that much of this technology is already in use: exclusion and
data harms may be accruing to citizens in ways that lie beyond the remedy an AI
explainability regime can offer. We are committed to supporting the NITI Aayog’s continuing
efforts to bring about safe and responsible FRT usage across public and private sector
applications.

Best,
The Government Technology team at The Dialogue

1
Yashovardhan Azad et al. (2022, January 12). Analysing the National Security Implications of Weakening
Encryption. New Delhi: The Dialogue and DeepStrat; Arya T., Karthik V.V., & Trisha P. (2021, January).
Impact Study: Personal Data Protection Bill on the Start-up Ecosystem. New Delhi: The Dialogue; Shekar, K.,
& Mehta, S. (2022, February 17). The State of Surveillance in India: National Security at the Cost of Privacy?
ORF. Retrieved September 9, 2022, from
https://www.orfonline.org/expert-speak/the-state-of-surveillance-in-india/; Shekar, K. (2021). On an Uneven
Keel: Primer on Surveillance and Right to Privacy in India. The Grassroots Privacy Initiative - The Dialogue.
https://secureservercdn.net/160.153.138.178/3mv.9da.myftpupload.com/wp-content/uploads/2022/03/On-an-Uneven-Keel-Primer-on-Surveillance-and-Right-to-Privacy-in-India.pdf; Tiwari, P., & Tripathi, A. (2020,
December 31). To What Extent Does India's Surveillance Regime Affect Citizens' Privacy? The Bastion.
Retrieved September 9, 2022, from
https://thebastion.co.in/politics-and/to-what-extent-does-indias-surveillance-regime-affect-citizens-privacy/

Urgent Legislative Action

At present, the use of FRT in India operates in a legal vacuum. The rigorous, detailed data
protection legislation that the discussion paper envisages has not materialised in the latest
draft. It would be beneficial to enact a new law specifically regulating FRT, as the current
regime may not pass all prongs of the proportionality test, leaving it susceptible to judicial
scrutiny.2 A comprehensive law could also counter arbitrariness in the application of FRT
and act as an oversight mechanism over its use.3 The public consultation process inherent in
such a law would provide an opportunity to identify and mitigate the risks various
stakeholders may encounter. In finalising such a law, we recommend engaging with
professionals who can provide a realistic assessment of the state of FRT algorithms in both
the public and private ecosystems. It is pertinent to note that legislation elsewhere is
catching up with the use of facial recognition technology, in some cases imposing
moratoriums on certain use cases and in others laying down the process of rollout, such as in
the United States4 and China.5
In the Indian context, risk-mapping needs to be built into the legislation. Some of these
risks are:
(a) The integration of intermediaries like the Common Service Centres (CSCs) into the
ecosystem may reduce data quality and increase the chances of unauthorised use. There is
regulatory interest in deploying facial recognition6 as part of the government’s authentication
system for disbursing pensions; however, using CSCs for this task without suitable training
and accountability measures is not advisable. At this stage, an appropriate monitoring
mechanism must be deployed so that agencies which aid the government in collecting facial
recognition data are accountable (one of the seven principles laid down in the draft) for their
actions.

(b) the report acknowledges that individuals lose control over data at rest (when stored at the
government or with a private party). A major threat to the data at rest stage relates to data

2
Hickok, E. et al. (2021, August 31). Facial Recognition Technology in India. Centre for Internet & Society.
Retrieved from https://cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf
3
Jauhar, A., & Vipra, J. (2021, December 17). Procurement of Facial Recognition Technology for Law
Enforcement in India: Legal and Social Implications of the Private Sector’s Involvement. Vidhi Centre for
Legal Policy. Retrieved from
https://vidhilegalpolicy.in/research/procurement-of-facial-recognition-technology-for-law-enforcement-in-india-legal-and-social-implications-of-the-private-sectors-involvement/
4
Internet Freedom Foundation (2021, May 3). Facial Recognition Laws in the United States #ProjectPanoptic.
https://internetfreedom.in/facial-recognition-laws-in-the-united-states-projectpanoptic/
5
Internet Freedom Foundation (2021, June 3). Facial Recognition Laws in China #ProjectPanoptic.
https://internetfreedom.in/facial-recognition-laws-in-china/
6
Government launches 'unique' face recognition tech for pensioners. (2021, November 29). The Economic
Times. Retrieved September 8, 2022, from
https://economictimes.indiatimes.com/news/india/government-launches-unique-face-recognition-tech-for-pensioners/articleshow/87985779.cms?from=mdr

breaches through malicious hacking, leaks, etc. It is also important to ensure that individuals
have an option to seek deletion, re-collection or functional modification of their facial
recognition data if they believe it no longer serves its purpose.

(c) The usage of FRT can breach universal data protection principles like purpose limitation,
data minimisation and user control. Such breaches become particularly critical when FRT
solutions are deployed for law enforcement purposes at a mass level, where every
individual’s data is processed. This has implications for the proportionality principle laid
down by the Supreme Court in the Puttaswamy7 judgement. Specific protections must be
drafted and clarified to ensure this risk is mitigated.

Informational Autonomy of Users

In data protection frameworks, particularly those inspired by or based on the European
Union’s General Data Protection Regulation, the ‘notice and consent’ mechanism is the
central functioning component of the regulation.8 Consent is kept at the focal point to serve
two purposes: 1) respecting the user’s autonomy, and 2) identifying and imposing liability on
the entity to whom consent is granted, in case of any violation of the user’s privacy rights.
In the Indian legal context, the constitutional morality line (as discussed in the draft) provides
a vision for understanding the contours of executive action. This understanding dictates two
bright lines in the context of privacy rights: consent should be given in a manner which is
meaningful and informed on the part of the data principal, and it should not be automatically
assumed in cases where the final use is ambiguous or prone to change.

Most public FRT systems, particularly 1:n identification systems, do not work on the premise
of ‘informed consent’ and collect biometric data without one’s meaningful consent. Even
assuming that one has given implied consent, the purpose for which the biometric data was
collected in the first place should not be compromised. In the absence of specific protective
rights, the risk is high that FRT data will be combined with other databases, such as Aadhaar
or those already held by law enforcement departments.9 When it comes to law enforcement,
the proportionality requirement is to be analysed differently than in consumer applications,
as the draft report correctly states. The argument could be detailed further with evidence of
actual usage, to make the distinction between consumer applications, welfare distribution
and law enforcement clear. The draft distinguishes two uses of FRT: non-security uses and
security uses. The use of FRT in DigiYatra may fall under non-security purposes if it
specifically aims at providing greater ease of access to airport facilities with certain
pre-conditions. In such a scenario, the pressing need to adopt

7
Justice K. S. Puttaswamy (Retd.) and Anr. vs Union of India and Ors., AIR 2017 SC 4161.
8
Justice B.N. Srikrishna Committee Report (2018). A Free and Fair Digital Economy: Protecting Privacy,
Empowering Indians.
9
Jawahitha Sarabdeen (2022). Protection of the rights of the individual when using facial recognition
technology. Heliyon 8(2): e09086.

FRT should be weighed against the necessary or intended outcome. Risks of probabilistic
mismatching and the human rights impact associated with both security and non-security
uses need to be clearly included in the draft. Explainability as it exists for FRT today is
operator-friendly rather than user-friendly, which the paper does not acknowledge. Similarly,
the idea of brittleness is not acknowledged or engaged with by the discussion paper.
Brittleness is the way in which FRT algorithms can fail when faced with new data that is
significantly different from the data the FRT algorithm was trained on.10 Finally, the
stochastic nature of FRT algorithms is not sufficiently engaged with. Without an
understanding of the inherent randomness of pattern-matching algorithms, it is not clear how
a probabilistic error can be effectively explained to affected users. Similarly, it remains
unclear whether explanation is a sufficient remedy for the serious harms that accrue from
erroneous identification.11
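The compounding of probabilistic error in 1:n identification can be made concrete with a short, purely illustrative sketch. The false match rate and gallery sizes below are hypothetical assumptions for the sake of the arithmetic, not figures from any deployed system:

```python
# Illustrative only: why 1:n identification amplifies per-comparison error.
# If each gallery comparison has an independent false match rate (FMR),
# the chance that a probe face falsely matches AT LEAST ONE gallery entry
# grows rapidly with gallery size n: P = 1 - (1 - FMR)^n.

def prob_false_match(fmr: float, gallery_size: int) -> float:
    """Probability of at least one false match against a gallery of
    non-mated entries, assuming independent comparisons (a simplifying
    assumption made here for illustration)."""
    return 1.0 - (1.0 - fmr) ** gallery_size

# A per-comparison FMR of 1-in-100,000 sounds small, but against large
# law-enforcement galleries it is anything but:
for n in (1_000, 100_000, 1_000_000):
    # prints roughly 0.01, 0.63, and ~1.0 respectively
    print(f"gallery size {n:>9,}: P(false match) = {prob_false_match(1e-5, n):.3f}")
```

Against a gallery of a million faces, a false match becomes a near-certainty for every probe, which is why the per-comparison accuracy figures often quoted for FRT systems say little about mass-scale deployments.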

A clear demarcation can be made on the basis of the purpose or context of the usage of FRT,
so as to either obtain the consent of individuals specifically or mandate the usage generally
with user control. If the objective is only to make travel easier for individuals, i.e. a
commercial application for consumer ease of access, then an appropriate mechanism can be
provided to opt in to or out of the FRT system, without any penalty and with functional and
meaningful alternatives to FRT available.12 On the other hand, if the intention of the relevant
department is to completely substitute traditional identification through documents, then the
effect on the legal right to privacy would be greater. We recommend a wider analysis of the
DigiYatra use case to ensure the responsible use that the discussion paper sets as its
objective.

Effect on Participants

FRT systems pose well-demonstrated challenges to accuracy and reliability, particularly
given the diversity of facial features in the Indian context. Systems that offer individuals and
representative groups the right to challenge accuracy, question bias, and modify or trigger
re-collection of the source data need to be developed or adopted on a priority basis. This
kind of rights-embedded system is currently uncommon in Indian markets, and the discussion
paper should include policy inputs on how such systems can be introduced.

10
Andrew J. Lohn (2020). Estimating the Brittleness of AI: Safety Integrity Levels and the Need for Testing
Out-of-Distribution Performance. arXiv preprint arXiv:2009.00802.
11
Dang Minh, H. Xiang Wang, Y. Fen Li & Tan N. Nguyen (2022). Explainable artificial intelligence: a
comprehensive review. Artificial Intelligence Review.
12
James A. Lewis & William Crumpler (2021, September). Facial Recognition Technology: Responsible Use
Principles and the Legislative Landscape. Center for Strategic and International Studies.

The National Institute of Standards and Technology (NIST) has conducted various studies to
establish performance benchmarks for FRT.13 These have identified that facial recognition
software carries both gender and racial or skin-type biases. Sophisticated software developed
by private sector technology giants such as IBM, Microsoft and Amazon has faltered in the
past and shown racial bias against darker-skinned women.14 The accuracy limits of FRT
systems and their wide distribution in many mature markets strengthen the case for oversight
of this specific technological market. Emotion detection products are already appearing in
Indian markets, currently unregulated and posing extremely high risks of workplace
surveillance.15 The discussion paper acknowledges some aspects of this risk and its
implications given India’s great diversity of phenotypes. Further research needs to be
developed, and it should not be limited to increasing the participant pool of source data,
which would violate the data minimisation principle. Trying to push for higher accuracy of
FRT algorithms by increasing the size of data pools would be highly counterproductive to
the privacy rights of citizens.
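To illustrate why aggregate accuracy figures can conceal exactly the group-level disparities these audits report, consider a minimal sketch. All groups, counts and error rates below are invented for illustration and do not describe any real system:

```python
# Hypothetical illustration: a system whose headline error rate looks
# acceptable can still carry sharply unequal error rates across
# demographic groups. Disaggregated reporting exposes this.
from collections import defaultdict

def per_group_error_rate(records):
    """records: iterable of (group, predicted_match, true_match).
    Returns ({group: error_rate}, overall_error_rate) so the headline
    figure can be compared against each group's figure."""
    errs, totals = defaultdict(int), defaultdict(int)
    for group, pred, truth in records:
        totals[group] += 1
        if pred != truth:
            errs[group] += 1
    rates = {g: errs[g] / totals[g] for g in totals}
    overall = sum(errs.values()) / sum(totals.values())
    return rates, overall

# Toy data: 90 trials for majority group A with 1 error,
# only 10 trials for minority group B with 3 errors.
data = ([("A", True, True)] * 89 + [("A", True, False)] * 1
        + [("B", True, True)] * 7 + [("B", True, False)] * 3)
rates, overall = per_group_error_rate(data)
print(rates)    # group B's rate (0.30) is 27x group A's (~0.011)
print(overall)  # yet the headline figure is only 0.04
```

The headline 4% error rate says nothing about group B facing a 30% error rate, which is the core argument for mandating disaggregated accuracy audits rather than simply enlarging training data pools.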

Considering the associated risks of misidentification or false positives due to a lack of
accuracy in such cases, it is very important that biometric data be shared only on the basis of
clear guidelines and publicly accessible policies. Non-definitive results relying on
probabilities (a feature of AI technologies), if used in an opaque manner and without the rule
of law, would create great risk for individuals who might face legal consequences if
misidentified during the process.16 A purpose limitation clause in relevant law and policy,
alongside judicial oversight, can be considered for sensitive FRT systems such as the
DigiYatra use case. Focus is needed on updating privacy and other user policies in time to
ensure an informed public consultation. Current literature has documented multiple false
positive cases that pose significant surveillance threats, as mentioned in the draft report.17
The private market for FRTs also needs to be scrutinised through an oversight mechanism.
Otherwise, the prevalence of technological products which many observers term ‘AI snake
oil’ (such as those which claim to execute emotion detection) poses significant risks to
citizens’ data rights and to the bottom lines of productive firms in the economy.

13
Joy Buolamwini and Timnit Gebru (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial
Gender Classification. Proceedings of Machine Learning Research, Conference on Fairness, Accountability and
Transparency.
14
Alex Najibi (2020, October 24). Racial Discrimination in Face Recognition Technology. Science Policy and
Social Justice, Harvard University.
15
Vidushi Marda & Shazeda Ahmed (2021). Emotional Entanglement: China’s Emotion Recognition Market and
Its Implications for Human Rights.
16
European Data Protection Board (2022, May 12). Guidelines 05/2022 on the Use of Facial Recognition
Technology in the Area of Law Enforcement, Version 1.
17
Harwell, D. (2021, April 13). Wrongfully arrested man sues Detroit police over false facial recognition match.
The Washington Post. Retrieved from
https://www.washingtonpost.com/technology/2021/04/13/facial-recognition-false-arrest-lawsuit/

The Dialogue™ is a public-policy think-tank with a vision to drive a progressive narrative in
India’s policy discourse. Founded in 2017, we believe in facilitating well-researched policy
debates at various levels to help develop a more informed citizenry on areas around
technology and development. The Dialogue™ has been ranked among the world’s top ten
think-tanks to watch out for by the Think Tanks and Civil Societies Program (TTCSP),
University of Pennsylvania, in its 2020 and 2021 rankings.

https://thedialogue.co
