Facial Recognition Technology and Data Privacy in India: Challenges and Recommendations
Introduction
The rapid advancements in Facial Recognition Technology (FRT) have sparked critical concerns about privacy in both public
and private spaces. The recent development of the I-XRAY program exemplifies these advancements, showcasing how smart
glasses, like Meta Ray-Ban, can identify strangers on the street while extracting detailed personal information from facial
recognition databases such as PimEyes and FaceCheck.ID. In India, however, the legal framework governing biometric-based
surveillance technology remains inadequate to prevent violations of data privacy—a concern acknowledged by NITI Aayog.
With over 170 identified FRT systems across the country, of which 20 are currently operational, and substantial investments
exceeding ₹1,513 crore made by both central and state governments, the rapid growth of FRT underscores an urgent need for
robust data privacy protections. The convergence of these technological advancements and the lack of comprehensive legal
safeguards highlights the necessity for immediate attention to privacy concerns in the face of an expanding surveillance
landscape.
The right to privacy was formally recognized as a fundamental right under Article 21 of the Indian Constitution in the
landmark case of Justice K.S. Puttaswamy v. Union of India. This right encompasses three key aspects: the right to be let alone,
control over personal information, and conditions necessary to safeguard one's dignity and independence. The Digital
Personal Data Protection Act, 2023 (hereinafter "the DPDP Act") is the primary legislation addressing privacy concerns in
India. While it offers some protections, such as restrictions on the use of sensitive personal data, it has significant limitations,
including exemptions for government agencies and a narrow focus on digital personal data only.
The DPDP Act recognizes the need for lawful and legitimate use of personal data, yet its narrow scope, which applies only to
digital personal data, leaves other forms of data vulnerable. This raises concerns when applied to FRT, which processes
biometric data that may be collected in non-digital forms and later digitized.
Helen Nissenbaum’s theory of contextual integrity provides a useful framework for understanding data privacy. It emphasizes
that privacy is maintained when information is used within appropriate contexts and is violated when it is used
inappropriately. In India, the DPDP Act allows individuals to withdraw consent, but broad exemptions, such as those under Section 17, pose a serious risk that personal information will be used outside the context in which it was collected.
Clearview AI, a U.S.-based company, collects billions of images from social media and other websites without users' consent
to build its facial recognition database. Law enforcement agencies use this database to identify individuals, sparking
widespread concerns about privacy violations and data abuse. Despite facing legal challenges in multiple U.S. states and
abroad, Clearview continues to operate, illustrating the challenges of regulating FRT across different jurisdictions.
Similarly, the Transportation Security Administration (TSA) in the U.S. has introduced a pilot program using FRT at several
airports. While the program aims to streamline identity verification, it raises privacy concerns, particularly regarding how
long biometric data is stored and who has access to it. Although participation in the program is voluntary, privacy advocates
call for stronger safeguards and clearer policies on the use of such data in public spaces.
These examples highlight the global risks of FRT misuse and emphasize the need for strict legal frameworks to protect individual rights. The need is all the more pressing in India, where initiatives like DigiYatra and facial recognition-based attendance systems are being deployed at scale.
The European Union (EU) has established comprehensive data protection principles through various instruments, including
the Charter of Fundamental Rights, the Treaty on the Functioning of the European Union, and the General Data Protection Regulation (GDPR). These principles and
regulations encompass rules on data quality, personal data, sensitive data, independent supervision, purpose limitation, and
data transfer.
In contrast, India's DPDP Act provides more limited protections. While it allows individuals to request the cessation of data
processing under certain circumstances, the Act lacks clear definitions for key terms and does not specify the extent of its protections, leaving significant gaps in enforcement.
The use of FRT raises several ethical and privacy concerns. These include system errors and inaccuracies, function creep
(expansion of use beyond original purposes), lack of comprehensive policies, privacy violations through data aggregation, and the potential for pervasive mass surveillance.
To address these issues, FRT systems must adhere to data protection principles such as purpose limitation, proportionality,
and security. Some American jurisdictions, like San Francisco, Boston and Oakland, have banned the use of FRT by local
agencies due to concerns about bias, inaccuracy, and privacy infringement. The European Parliament likewise voted in favor of a comprehensive ban on live facial recognition technology during the AI Act negotiations in June 2023.
The DPDP Act has several limitations that hinder its effectiveness in protecting data privacy, especially in the context of Facial
Recognition Technology (FRT). Its limited applicability to government agencies (Section 17 exemptions) and narrow
definition of "personal data," confined to digital data only, are significant drawbacks. The act also lacks a clear definition for
"consent," by restricting itself to the primary means for lawful processing of personal data under the Act, and fails to provide
guidelines for data breach procedures. Moreover, the safety procedures for data processing outlined in the act are insufficient
In its present form, the Act does not adequately address privacy concerns related to Facial Recognition Technology (FRT). The
limitations of the DPDP Act, combined with the intrusive nature of FRT, create a significant risk of data abuse and privacy
violations. FRT's ability to collect data without explicit consent or awareness further exacerbates these concerns.
In light of rising privacy concerns, it is essential to impose strict regulations on biometric surveillance systems, including
facial recognition technology, used by government entities in India. No agency should acquire or use such systems without
explicit authorization from Parliament. There is a need for specific amendments detailing authorized entities and auditing
requirements. Information obtained unlawfully must be inadmissible in court, except in cases alleging violations of this law.
Individuals should have the right to seek legal recourse for violations, with state officers authorized to sue on behalf of
residents. Furthermore, state and local units should be ineligible for central law enforcement grants unless compliant with
these regulations.
Recommendations
To improve personal data privacy protection in the context of Facial Recognition Technology (FRT) in India, several key measures should be adopted.
Firstly, it is essential to establish clear and precise definitions within the Digital Personal Data Protection (DPDP) Act. This
should include expanding the definition of "personal data" to cover non-digital data and providing a clear definition of "consent."
Secondly, the exemption clause in Section 17 requires re-examination to ensure that proper safeguards apply even when
the government is involved in the use of personal data. In addition, implementing data protection by design, such as through
pseudonymization and encryption techniques, would significantly enhance the security of sensitive information.
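To illustrate, the following minimal sketch shows one way these techniques might be applied to a biometric record before storage. It is written in Python and assumes the open-source cryptography library for encryption; the function names and sample identifiers are hypothetical, not drawn from any existing FRT deployment.

    import hashlib
    import os

    from cryptography.fernet import Fernet

    def pseudonymize(subject_id: str, salt: bytes) -> str:
        # Replace the direct identifier with a salted one-way hash, so records
        # can still be linked for processing without revealing who they describe.
        return hashlib.sha256(salt + subject_id.encode("utf-8")).hexdigest()

    def encrypt_template(face_template: bytes, key: bytes) -> bytes:
        # Encrypt the biometric template at rest with symmetric encryption.
        return Fernet(key).encrypt(face_template)

    salt = os.urandom(16)        # stored separately from the records themselves
    key = Fernet.generate_key()  # held in a key-management system, not the database

    record = {
        "pseudonym": pseudonymize("illustrative-subject-id", salt),
        "template": encrypt_template(b"raw-face-embedding-bytes", key),
    }

The key design point is separation: because the salt and the encryption key are held apart from the stored records, a breach of the database alone neither re-identifies individuals nor exposes their biometric templates.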
Thirdly, the scope of liability and jurisdiction under the DPDP Act should be expanded. This means applying the law more
broadly to include more government agencies and extending its reach to data processing activities that take place outside
India.
Fourthly, creating comprehensive data breach procedures is essential. Clear guidelines or rules should be established to
ensure affected parties are notified within 72 hours of a breach, along with specific procedures for managing and mitigating the resulting harm.
Finally, enacting specific regulations for FRT is critical to addressing the unique challenges this technology poses.
Developing comprehensive guidelines for FRT usage in both the public and private sectors—covering aspects like purpose
limitations, usage methods, and security measures—will ensure its ethical and responsible implementation.
By implementing these recommendations, India can strike a better balance between leveraging the benefits of FRT and protecting individuals' fundamental right to privacy.
Conclusion
The current legal framework in India, particularly the DPDP Act, is inadequate in addressing the privacy concerns raised by
the widespread use of Facial Recognition Technology (FRT). Significant improvements are needed in both policy and
legislation to ensure the ethical and secure implementation of FRT. Key areas for improvement include strengthening the
DPDP Act by expanding its scope and clarifying definitions, implementing robust safety procedures and impact assessments,
establishing comprehensive data breach protocols, and enacting specific regulations for FRT use. By addressing these issues,
India can work towards a balance between leveraging the benefits of FRT and protecting individuals' fundamental right to
privacy. This approach will help create a safer and more ethical environment for the use of advanced surveillance technologies.