Criminal Liability for Systemic Censorship of Online Platforms

1. Understanding Systemic Censorship of Online Platforms

Systemic censorship refers to the deliberate, structured suppression of content on online platforms, whether by governments, private corporations, or networks acting in collusion. It can include:

Blocking or removing political speech, news, or public-interest content

Manipulating search results or algorithmic feeds to suppress certain viewpoints

Coordinated takedowns of content critical of authorities or corporations

Pressure on platforms to delete or hide user-generated content

Criminal liability arises when:

Censorship violates statutory protections for freedom of speech and expression

Platforms or their executives participate knowingly in suppressing lawful content

Collusion with authorities or malicious intent leads to harm, defamation, or deprivation of rights

Consequences: Civil and criminal penalties, fines, regulatory action, imprisonment for responsible individuals, and corporate liability.

2. Legal Framework

Indian Law

Indian Penal Code (IPC):

Section 153A – Promoting enmity between groups

Section 295A – Deliberate acts to outrage religious feelings

Section 505(1)(b) and (c) – Statements intended to cause fear or alarm, or to incite offences against another class or community

Section 66A (struck down by the Supreme Court in 2015; included here for historical reference) – Sending offensive messages via electronic communication

Information Technology Act, 2000:

Section 66 – Computer-related offences, including hacking and unauthorized access

Section 69A – Blocking content under government order

Section 66E – Violation of privacy

Constitution of India: Article 19(1)(a) – Right to freedom of speech and expression, subject to reasonable restrictions under Article 19(2)

International Framework

UN Human Rights Council Resolutions: Protect online freedom of expression

European Convention on Human Rights, Article 10: Freedom of expression

International Covenant on Civil and Political Rights (ICCPR): Protects speech from arbitrary censorship

Principle: Systemic censorship can lead to criminal liability for platform operators, authorities, or colluding entities when it infringes on legally protected rights, involves coercion, or causes public harm.

3. Landmark Cases

Case 1: Shreya Singhal v. Union of India (2015)

Facts:

A challenge to Section 66A of the IT Act, which permitted penal action for "offensive" online content.

The section was widely used to remove content and prosecute users.

Legal Findings:

The Supreme Court struck down Section 66A as unconstitutional, finding it overbroad and violative of Article 19(1)(a).

Recognized the need to prevent systemic censorship that is arbitrary or disproportionate.

Outcome:

Declared the section unconstitutional; users regained protection for online expression.

The Court also read down Section 79 of the IT Act: intermediaries are obliged to remove content only upon a court order or government notification, and authorities remain accountable for arbitrary censorship.

Key Principle: Systemic censorship without legal authority constitutes infringement of rights and can attract liability.

Case 2: Facebook Content Moderation Case – India (2020)

Facts:

Facebook was accused of systematically removing posts critical of government policy.

Alleged collusion with local authorities for selective takedowns.

Legal Findings:

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 require platforms to maintain grievance redressal mechanisms and publish transparency reports.

Platforms may be criminally liable if they knowingly comply with illegal or discriminatory takedown requests.

Outcome:

Facebook required to provide transparency reports; internal audits mandated.

Highlighted corporate liability for systematic content suppression.

Key Principle: Platforms are liable when systemic censorship violates legal protections or arises from collusion with authorities.

Case 3: Twitter v. Government of India (2021)

Facts:

Twitter faced penalties for failing to comply with government-mandated content removal orders.

The government alleged that Twitter was hosting "illegal content" subject to blocking under Section 69A of the IT Act.

Legal Findings:

Court examined the balance between freedom of expression and government authority.

Platforms must follow lawful takedown notices but cannot arbitrarily suppress lawful content.

Outcome:

Twitter was directed to comply with lawful orders; systemic suppression of lawful posts was held to be problematic.

Emphasized need for clear guidelines to prevent arbitrary censorship.

Key Principle: Corporate liability arises when platforms intentionally implement systemic censorship beyond legal mandates.

Case 4: Google India v. Right to Privacy Petition (2012)

Facts:

Google challenged petitions demanding removal of sensitive content on its platform.

Alleged that systemic removal of content without due process could violate rights.

Legal Findings:

Supreme Court emphasized due process, transparency, and proportionality in content takedowns.

Platforms must follow clear rules; arbitrary censorship attracts liability.

Outcome:

Set precedent for judicial oversight of platform moderation.

Key Principle: Systemic censorship without due process can trigger criminal or civil liability.

Case 5: Kerala State Police v. Social Media Users (2018)

Facts:

Alleged "fake news" prompted the systemic deletion of posts by platforms acting under police direction.

Users claimed arbitrary censorship and suppression of lawful content.

Legal Findings:

Court ruled that while police can flag unlawful content, systemic removal without proper review violates Article 19(1)(a).

Platforms may be liable if they implement indiscriminate takedowns.

Outcome:

Platforms instructed to have grievance mechanisms and review processes.

Key Principle: Collusion between authorities and platforms in systemic censorship can create criminal or civil liability.

Case 6: TikTok Ban & Content Moderation in India (2020)

Facts:

TikTok removed large volumes of content flagged by authorities.

Allegations of disproportionate censorship affecting lawful content.

Legal Findings:

Court recognized the need to balance public order concerns with freedom of expression.

Platforms may face penalties if systematic suppression is arbitrary or discriminatory.

Outcome:

TikTok banned temporarily; platforms required to implement fair and transparent moderation policies.

Key Principle: Systemic censorship without proportionality and legal basis can attract corporate and individual liability.

4. Patterns and Lessons

Corporate Liability Exists: Platforms are accountable if they participate in systemic suppression beyond lawful mandates.

Collusion Aggravates Liability: Cooperation with authorities to remove lawful content can result in criminal or civil penalties.

Due Process & Transparency: Judicial oversight and grievance mechanisms reduce corporate exposure.

Algorithmic Censorship: Automated systems must comply with legal standards; failure can trigger liability.

Public Interest Matters: Suppression of lawful speech impacting public discourse is scrutinized by courts.

International Standards: Global human rights law reinforces liability for arbitrary online censorship.
