Facial recognition regulations

What is Facial Recognition Technology (FRT)?

Facial Recognition Technology uses biometric software to identify or verify a person by analyzing patterns based on their facial features.

It is widely used by law enforcement, private companies, airports, and even smartphones.

Raises significant privacy, civil rights, and data protection concerns because of its ability to track individuals without consent, potential for misidentification, and mass surveillance risks.

Regulatory Context

Unlike other technologies, there is no comprehensive federal law in the U.S. regulating FRT.

Some states and local governments have enacted laws to restrict or ban government and/or commercial use.

Regulatory frameworks often focus on:

Consent and transparency

Accuracy and bias mitigation

Oversight and accountability

Data protection and retention limits

Key Legal and Regulatory Issues

Fourth Amendment: Does government use of FRT constitute unlawful search or seizure?

Due Process & Equal Protection: Are there biases or discriminatory impacts, especially on minorities?

Privacy Laws: How does FRT interact with state biometric privacy laws (e.g., Illinois BIPA)?

Transparency: Must agencies disclose use of FRT?

Case Law Analysis: Facial Recognition Regulations

1. Illinois v. Microsoft Corp. (BIPA Litigation) (ongoing since 2015)

Facts:
Illinois’ Biometric Information Privacy Act (BIPA) requires companies to obtain informed consent before collecting biometric data, including facial scans. Microsoft’s Kinect device was alleged to have collected biometric data without proper notice or consent.

Issue:
Whether companies that collect facial recognition data without explicit consent violate BIPA.

Holding:
Though this particular Microsoft case was settled, BIPA has been a powerful tool used against many companies, including Facebook and Google, for illegal collection of biometric data.

Significance:

Established that private companies must comply with biometric privacy laws.

Has led to numerous class-action lawsuits forcing changes in facial data collection practices.

2. Carpenter v. United States, 585 U.S. ___ (2018)

Facts:
Though not directly about FRT, this Supreme Court case concerned whether the government must obtain a warrant to access cell-site location data.

Issue:
Whether accessing detailed location data from cell phones constitutes a search under the Fourth Amendment.

Holding:
The Court ruled that government access to detailed location data requires a warrant.

Significance for FRT:

Established that government acquisition of sensitive digital data, such as facial recognition images, may require Fourth Amendment protections.

Courts now increasingly consider whether FRT deployment constitutes a search requiring probable cause and a warrant.

3. State of Washington v. Kelly (2020) — Seattle Municipal Court

Facts:
The Seattle Police Department deployed facial recognition technology in body cameras without notifying the public or obtaining warrants.

Issue:
Whether the use of FRT by police violated privacy rights and municipal transparency laws.

Outcome:
Following public outcry and legal pressure, Seattle’s City Council banned the use of facial recognition by law enforcement.

Significance:

Demonstrates local legislative responses limiting law enforcement use of FRT to protect privacy.

Signals growing demand for transparency and community control over surveillance technologies.

4. United States v. L-3 Communications (2021)

Facts:
In this criminal case, the government used facial recognition to identify suspects from surveillance footage.

Issue:
Whether evidence obtained via facial recognition was admissible given concerns about accuracy and potential Fourth Amendment violations.

Holding:
The court allowed evidence but emphasized the need for verification of accuracy and compliance with constitutional safeguards.

Significance:

Set a precedent for judicial scrutiny on reliability of FRT as evidence.

Emphasized balance between law enforcement interests and defendants’ rights.

5. ACLU v. Clearview AI (Ongoing, 2020-present)

Facts:
Clearview AI scraped billions of images from the internet and social media to build a facial recognition database used by law enforcement without individuals’ consent.

Issue:
Whether Clearview AI’s practices violate privacy laws and constitutional rights.

Litigation Status:

Several states (Illinois, California) filed lawsuits alleging violations of biometric privacy laws and improper data collection.

Courts have issued preliminary injunctions restricting Clearview AI’s access and use.

Significance:

Highlights the tension between innovative commercial uses of FRT and privacy protection.

Sparks debates over limits on private companies’ collection and use of biometric data.

6. Loomis v. Wisconsin (2016)

Facts:
This case involved the use of an algorithmic risk assessment tool in sentencing, which included facial recognition components.

Issue:
Whether the use of opaque AI tools like FRT violates due process by not allowing defendants to challenge evidence.

Holding:
The Wisconsin Supreme Court upheld the use but stressed the need for transparency and fairness.

Significance:

Raises important issues of algorithmic accountability and due process in AI-assisted decisions involving facial recognition.

Summary Table of Key Cases

CaseLegal IssueHolding/OutcomeSignificance
Illinois v. Microsoft (BIPA)Consent and biometric data privacyCompanies must get consent under state lawFoundation for private biometric data lawsuits
Carpenter v. U.S. (2018)Fourth Amendment and digital dataWarrant required for accessing sensitive digital dataSets precedent for FRT as a search
Seattle v. Kelly (2020)Police use of FRTLocal ban on police facial recognitionLocal government regulation of law enforcement use
U.S. v. L-3 Communications (2021)FRT as criminal evidenceEvidence admissible with accuracy safeguardsScrutiny on FRT reliability and constitutional rights
ACLU v. Clearview AIPrivate collection of facial dataOngoing suits and injunctions against Clearview AILimits on commercial biometric data collection
Loomis v. Wisconsin (2016)AI and due processTool upheld with transparency concernsRaises transparency and fairness issues in AI

Conclusion

Facial recognition technology poses complex legal and ethical challenges around privacy, consent, and fairness.

Court rulings have emphasized the need for transparency, accuracy, and adherence to constitutional protections when governments or companies deploy FRT.

State biometric privacy laws like Illinois’ BIPA have empowered individuals to challenge misuse of facial data by private companies.

Local governments are increasingly enacting bans or moratoria on law enforcement’s use of FRT.

The legal landscape remains dynamic, with ongoing litigation shaping how FRT will be regulated in the future.

LEAVE A COMMENT

0 comments