Bombay High Court struck down Centre’s Fact Check Unit

25-10-2024
  1. In September 2024, the Bombay High Court declared a key provision of the amended Information Technology (IT) Rules, 2021 unconstitutional.
  2. This provision allowed the government to identify "fake news" on social media through a Fact Check Unit (FCU).
  3. The government may appeal this decision before the Supreme Court, as similar cases are pending in the Delhi and Madras High Courts.

Court’s Observations on the Fact Check Unit (FCU)

  1. The IT Amendment Rules, 2023 were found to violate the Constitution, particularly:
     1. Article 14 (right to equality)
     2. Article 19(1)(a) (freedom of speech and expression)
     3. Article 19(1)(g) (freedom of profession)
  2. The definition of fake or misleading news remains vague, lacking precision and clarity.
  3. The court stated that there is no legally recognized "right to the truth," meaning the State is not required to ensure that only fact-checked information is available to the public.
  4. The measures introduced were deemed not to meet the required proportionality standard.

Key Facts on Fake News

  1. Fake news cases rose significantly, with 1,527 cases in 2020, marking a 214% increase from 486 cases in 2019 and 280 cases in 2018 (National Crime Records Bureau (NCRB) data).
  2. The Fact Check Unit of PIB has exposed 1,160 instances of false information since its launch in November 2019.

What is the Fact Check Unit (FCU)?

  1. Establishment: In April 2023, the Ministry of Electronics and Information Technology (MeitY) introduced amendments to the Information Technology (IT) Rules, 2021 through the IT (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2023.
  2. Purpose: The Fact Check Unit (FCU) was created to counter misinformation related to the Indian government. Its main role is to verify facts and promote accurate information in public discourse.
     1. The unit would flag posts containing fake, false, or misleading information.
  3. Expanded Definition of Fake News: The amendment to Rule 3(1)(b)(v) of the IT Rules, 2021 broadens the scope of "fake news" to include 'government business.'
  4. Legal Issue: In March 2024, the Supreme Court stayed the notification establishing the FCU under the Press Information Bureau.
     1. The government defended the unit, stating it was the least restrictive way to combat misinformation and prevent the spread of false information.
  5. Compliance and Consequences: If the FCU identifies content that violates the amended rules, it will notify social media intermediaries.
     1. These intermediaries must take down flagged content to maintain their "safe harbour" protection under the IT Act, 2000, which shields them from legal liability for third-party content.
     2. Failure to comply could result in action under Section 79 of the IT Act, 2000, leading to the loss of safe harbour protection.

Powers of FCU

  1. Content Labelling: The FCU has the authority to label content about the government on platforms like Facebook and Twitter as "fake" or "misleading."
  2. Content Removal: Social media platforms must remove flagged content if they wish to maintain their 'safe harbour' status, which grants them legal immunity for third-party content.
  3. Blocking URLs: Internet service providers will be required to block URLs containing such flagged content when directed by the FCU.

Who are the Intermediaries?

  1. Intermediaries are entities that help in the transmission or hosting of content and services on the internet. They connect users to the online world, facilitating the exchange of information.
     1. Examples: social media platforms (e.g., Facebook, Twitter), e-commerce platforms (e.g., Amazon, Flipkart), search engines (e.g., Google), internet service providers (ISPs), and cloud service providers.
  2. Significant Intermediaries: Significant intermediaries are a specific group of intermediaries with a large user base and substantial influence on public discourse.
     1. Criteria: According to the IT Rules, 2021, social media intermediaries with over 5 million registered users in India are classified as significant social media intermediaries.

What are the Key Issues with the Amended IT Rules, 2023?

  1. Violation of the IT Act, 2000: The Fact Check Unit (FCU), established by the Executive, can issue takedown orders to social media platforms and intermediaries.
     1. This bypasses the procedures under Section 69A of the IT Act, 2000 and the parliamentary processes required to expand the scope of the parent legislation.
  2. Violation of Fundamental Rights: Rule 3(1)(b)(v) violates Articles 14 (equality before law), 19(1)(a) (freedom of speech and expression), and 19(1)(g) (right to practise a profession or trade) of the Constitution.
     1. It imposes restrictions that go beyond the reasonable limitations allowed under Article 19(2) and cannot be justified through delegated legislation.
  3. Vague Definitions: The terms "fake," "false," and "misleading" in the Rules are unclear and over-broad.
     1. The FCU flags posts containing such content to social media intermediaries, leaving the scope of these terms undefined and open to misuse.
     2. This creates fears of arbitrary interpretation, which could suppress legitimate discussion and dissent.
  4. Against Natural Justice Principles: The FCU acts as the sole decision-maker on what is or is not the truth, giving the government arbitrary power over online content.
     1. This bypasses the principles of natural justice, making the process unconstitutional.
     2. The government's defence that decisions can be challenged in court is insufficient to safeguard rights, and the rule cannot be cured merely by limiting its operation.
  5. Excessive Government Control: The creation of the FCU under the PIB raises concerns about government dominance in information regulation, potentially weakening the role of independent media and civil society.
  6. Erosion of Accountability: The rules may weaken government accountability, as the FCU could be misused to silence criticism rather than promote transparent fact-checking.
  7. Lack of Judicial Oversight: There is no clear provision for judicial review of decisions made by the FCU, raising concerns about unchecked power and potential abuse.
  8. Fails the Proportionality Test: The Rule creates a "chilling effect" on social media intermediaries, who face the threat of losing "safe harbour" protection, as well as on freedom of speech.
     1. The 2023 amendments violate the Supreme Court's ruling in Shreya Singhal vs. Union of India (2015), which set strict procedures for blocking content.

About the Shreya Singhal vs. Union of India (2015)

  1. The Shreya Singhal vs. Union of India case is a landmark Supreme Court judgment concerning the constitutionality of provisions in the IT Act 2000.
  2. Key Points:
  1. Section 66A Unconstitutional: The court struck down Section 66A, which criminalized "offensive" or "menacing" content, holding that it violated the right to freedom of speech.
  2. Overbreadth and Vagueness: Section 66A lacked clear guidelines for defining "offensive" or "menacing" content, leading to arbitrary enforcement.
  3. Chilling Effect: The provision had a chilling effect on free expression, as people feared prosecution for expressing opinions online.

Information Technology Amendment Rules 2023

  1. The Information Technology Amendment Rules, 2023 were framed under the powers granted by the IT Act, 2000. They amend the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which had replaced the earlier Information Technology (Intermediaries Guidelines) Rules, 2011.

Key Provisions

  1. Due Diligence by Intermediaries: Intermediaries must prominently display their rules, regulations, privacy policies, and user agreements on their platforms.
     1. They are required to prevent the publication of unlawful content, such as obscene, defamatory, or misleading information.
     2. A Grievance Redressal Mechanism must be in place to handle user complaints.
  2. Additional Due Diligence for Significant Social Media Intermediaries: These intermediaries must appoint key personnel, including a Chief Compliance Officer and a Grievance Officer.
     1. Monthly reports must be submitted on compliance, including complaints received and actions taken.
  3. Grievance Redressal Mechanism: Intermediaries must acknowledge user complaints within 24 hours and resolve them within 15 days.
     1. Urgent Complaints: Issues related to privacy violations or harmful content must be addressed within 72 hours.
  4. Code of Ethics for Publishers: Publishers of news and online content must follow a Code of Ethics to ensure their content respects India's sovereignty and complies with existing laws.
  5. Regulation of Online Games: Intermediaries must provide clear policies on withdrawals, winnings, and user identity verification. Real-money games must be verified by a self-regulatory body (SRB).

Self-Regulatory Body (SRB): An SRB is an organization that ensures compliance with ethical standards, guidelines, and best practices for digital media and intermediaries.
