Delhi HC Questions Legality of Mandatory Facial Recognition in Public Exams
- By Admin
- 25 Apr 2025
The Delhi High Court has recently raised concerns regarding the mandatory use of facial recognition technology (FRT) in public examinations conducted by various government agencies and recruiting bodies. The court’s observations came while hearing petitions filed by aspirants who challenged the practice on grounds of privacy infringement, data misuse, and lack of legal backing.
This development adds to the growing debate on the use of biometric surveillance in the absence of a dedicated data protection law in India.
Background of the Case
- Petitioners approached the court after government-run examination bodies like SSC (Staff Selection Commission) and NTA (National Testing Agency) introduced mandatory facial recognition for authentication during registration and examination.
- Candidates argued that the use of FRT without explicit consent, proper safeguards, or clear policy frameworks violates their fundamental right to privacy.
- The Delhi High Court issued notices to the Union Government and the relevant exam-conducting bodies seeking justification for deploying such technology.
Court’s Key Observations
The Delhi High Court emphasized several critical issues:
- There is currently no legislation governing the deployment of FRT in India.
- Compelling candidates to undergo facial scanning without a statutory framework or opt-out mechanism may amount to coercion.
- The use of FRT must be tested against the standards of legality, necessity, and proportionality laid down by the Supreme Court in Justice K.S. Puttaswamy v. Union of India (2017), which recognized privacy as a fundamental right under Article 21 of the Constitution.
Legal Issues Raised
- Violation of Article 21 – Right to Privacy
The mandatory collection and processing of biometric data without informed consent or data protection safeguards infringes on the right to privacy.
- Lack of Statutory Backing
There is no comprehensive data protection law currently in force. The Digital Personal Data Protection Act, 2023, though enacted, is not yet fully operational or implemented in administrative processes.
- Potential Discrimination
Petitioners argued that FRT is often unreliable for marginalized groups, particularly darker-skinned individuals and women, leading to exclusion and technical rejection during exam check-ins.
- Due Process Concerns
Denying candidates access to public examinations for not consenting to facial recognition can be challenged as a violation of Article 14 (equality before law) and Article 19(1)(g) (right to practise any profession).
Government’s Justification
The exam authorities and central government responded by:
- Asserting that FRT is used to prevent impersonation and fraud, ensuring a fair examination process.
- Claiming that the use of such technology is voluntary and consent-based during the application process.
- Highlighting the broader objective of digital transformation in public services and examination reforms.
However, the court questioned whether mere mention in application terms could substitute statutory authorization for biometric processing.
Emerging Concerns on Surveillance
This case adds to growing apprehension about state surveillance and the use of AI-based technologies in India, including:
- Lack of transparency in FRT deployment across sectors such as policing, transportation, and public health.
- Fear of mass data collection without oversight, particularly in the absence of a functioning Data Protection Board, as envisioned in the Digital Personal Data Protection Act, 2023.
- Risk of function creep, where data collected for one purpose may be repurposed for surveillance or profiling.
Possible Outcomes and Next Steps
- The Delhi HC has not issued an interim order but has called for detailed affidavits from the concerned authorities.
- A final ruling could set a precedent on how biometric surveillance is regulated in public processes.
- The court may ask for clear policies, impact assessments, or even seek a temporary halt on FRT usage until legal safeguards are instituted.
Conclusion
The Delhi High Court’s intervention signals a critical moment in India’s digital governance framework. As technology outpaces lawmaking, courts are increasingly being called upon to balance efficiency with constitutional rights. The decision in this case could significantly influence how emerging technologies like facial recognition are integrated into public systems while preserving citizen autonomy and privacy.