Syllabus: GS2/Polity and Governance
Context
- The government has issued a fresh advisory to social media platforms to be stricter in taking down obscene, vulgar, pornographic and other illegal content from their platforms.
About
- Reason: The Ministry of Electronics and Information Technology (MeitY) has repeatedly received complaints that certain content circulating online does not comply with laws on decency and obscenity.
- Under the IT Rules, 2021, platforms are required to make “reasonable efforts” to ensure that users do not upload or share content that is obscene, pornographic or illegal.
- Advisory:
- MeitY asked platforms to ensure that users have easy access to reporting and grievance redressal systems.
- Large social media platforms, in particular, have been told to use automated and technology-based tools to proactively prevent such content from spreading and to ensure faster takedowns.
- 24-hour Takedown Rule: Platforms must remove or disable access to material that is “prima facie” sexual in nature, or that involves impersonation (including artificially morphed images), within 24 hours of receiving a complaint.
- Non-compliance with the provisions of the IT Act and/or the IT Rules, 2021 may invite consequences, including prosecution under the IT Act, the Bharatiya Nyaya Sanhita (BNS), and other applicable criminal laws, against intermediaries, platforms, and their users.
Digital content censorship
- Digital content censorship refers to the control of online content by governments, organizations, or other entities. This includes:
- blocking websites and apps;
- removal of social media content;
- regulation of OTT (Over-The-Top) streaming platforms;
- restrictions on digital news and journalism.
Need for Censorship
- Curbing Misinformation and Fake News: Prevents rapid spread of rumours that can trigger mob violence, panic, and public disorder.
- Controlling Hate Speech and Communal Content: Essential to stop content that fuels communal tensions, incites violence, or threatens social harmony.
- Safeguarding Children and Vulnerable Groups: Restricts access to harmful, explicit, violent, or manipulative content that can exploit minors.
- Loopholes in Platform Accountability: Social media platforms often delay content moderation, lack transparency, and evade responsibility due to weak enforcement mechanisms.
- Preventing Cybercrimes: Blocks websites and content related to child pornography, trafficking, drug markets, or illegal financial activities.
- Addressing AI Threats and Deepfakes: Necessary to regulate AI-generated fake videos/photos that can damage reputations, distort democratic processes, and mislead citizens.
Legal Framework Governing Digital Censorship in India
- Right to Freedom of Speech (Article 19(1)(a)): Subject to reasonable restrictions under Article 19(2) concerning decency, morality, and public order.
- Information Technology (IT) Act, 2000: Section 69A grants the government power to block online content for security or public order concerns.
- Intermediary Guidelines & Digital Media Ethics Code, 2021: Regulates social media, OTT platforms, and digital news media.
- Self-Regulation by OTT Platforms: Platforms like Netflix and Amazon Prime follow self-regulatory frameworks such as the Digital Publishers Content Grievances Council (DPCGC).
- Central Board of Film Certification (CBFC): Established under the Cinematograph Act, 1952, the CBFC is responsible for the certification and censorship of films in India.
Challenges in Digital Censorship in India
- Balancing Freedom of Speech & Regulation: Over-regulation can suppress creativity, while under-regulation can spread harmful content.
- Transparency & Accountability: Content moderation and censorship decisions often lack clear guidelines, raising concerns about misuse.
- Jurisdictional Issues: Many digital platforms operate from outside India, making enforcement difficult.
- Technological Advancements: The rapid evolution of digital media complicates consistent and fair regulation.
- Ethical Concerns: The subjective nature of obscenity laws can lead to arbitrary censorship.
Way Forward
- Enhancing Transparency in Content Moderation: Digital platforms should publish periodic transparency reports on content takedowns.
- Encouraging Digital Literacy: Educating citizens to identify fake news rather than enforcing restrictive censorship.
- Public Consultation in Policymaking: Involving journalists, legal experts, and civil society in framing digital content regulations.
Source: TH