Syllabus: GS2/Polity and Governance
Context
- The Supreme Court suggested a tougher line on user-generated content, calling for a neutral, autonomous regulator for social media platforms.
SC Observations/Suggestions
- The SC observed that there must be an impartial, autonomous system to regulate social media platforms, one that is free from external influence.
- Freedom of speech is a valuable fundamental right, but it cannot be allowed to degenerate into perversity or obscenity.
- Any such regulatory mechanism must also safeguard the fundamental right to free speech.
- It also suggested using the Aadhaar number or income tax PAN to verify users' age.
- SC has asked the Centre to come back with draft rules for public consultation within four weeks.
Ministry’s Response
- The Ministry of Information and Broadcasting said it is planning to amend the Code of Ethics appended to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 to incorporate guidelines on obscenity for all digital content.
- The proposals include rating of online content for different age groups and a bar on anti-national digital content.
- The amendments are proposed in accordance with Article 19(1)(a) and the reasonable restrictions permitted under Article 19(2).
Digital Content Censorship
- Digital content censorship refers to the control of online content by governments, organizations, or other entities. This includes:
- blocking websites and apps;
- removal of social media content;
- regulation of OTT (Over-The-Top) streaming platforms;
- restrictions on digital news and journalism.
Need for Censorship
- Curbing Misinformation and Fake News: Prevents rapid spread of rumours that can trigger mob violence, panic, and public disorder.
- Controlling Hate Speech and Communal Content: Essential to stop content that fuels communal tensions, incites violence, or threatens social harmony.
- Safeguarding Children and Vulnerable Groups: Restricts access to harmful, explicit, violent, or manipulative content that can exploit minors.
- Closing Loopholes in Platform Accountability: Social media platforms often delay content moderation, lack transparency, and evade responsibility because enforcement mechanisms are weak.
- Preventing Cybercrimes: There is a need to block websites and online content linked to child sexual abuse material (CSAM), trafficking, drug markets, and illegal financial activities to prevent cybercrimes and safeguard vulnerable users.
- Addressing AI Threats and Deepfakes: Necessary to regulate AI-generated fake videos/photos that can damage reputations, distort democratic processes, and mislead citizens.
Legal Framework Governing Digital Censorship in India
- Right to Freedom of Speech (Article 19(1)(a)): Subject to reasonable restrictions under Article 19(2) concerning decency, morality, and public order.
- Information Technology (IT) Act, 2000: Section 69A grants the government power to block online content for security or public order concerns.
- Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021: Regulate social media, OTT platforms, and digital news media.
- Self-Regulation by OTT Platforms: Platforms like Netflix and Amazon Prime follow self-regulatory frameworks such as the Digital Publishers Content Grievances Council (DPCGC).
- Central Board of Film Certification (CBFC): Established under the Cinematograph Act, 1952, it is responsible for certifying and censoring films in India.
Challenges in Digital Censorship in India
- Balancing Freedom of Speech & Regulation: Over-regulation can suppress creativity, while under-regulation can spread harmful content.
- Transparency & Accountability: Content moderation and censorship decisions often lack clear guidelines, raising concerns about misuse.
- Jurisdictional Issues: Many digital platforms operate from outside India, making enforcement difficult.
- Technological Advancements: The rapid evolution of digital media complicates consistent and fair regulation.
Way Forward
- Strengthening Independent Regulatory Bodies: Ensuring that courts and neutral institutions review censorship decisions.
- Enhancing Transparency in Content Moderation: Digital platforms should publish periodic transparency reports on content takedowns.
- Encouraging Digital Literacy: Educating citizens to identify fake news rather than enforcing restrictive censorship.
- Public Consultation in Policymaking: Involving journalists, legal experts, and civil society in framing digital content regulations.
Source: IE