Syllabus: GS2/ Governance
Context
- The Ministry of Electronics and Information Technology (MeitY) has proposed amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 under powers granted by the Information Technology Act, 2000.
About
- The proposed amendments to the IT Rules, 2021 are aimed at:
- Strengthening online content regulation and transparency in takedown procedures;
- Tackling AI-generated and synthetic media (deepfakes); and
- Ensuring that government orders for content moderation are lawful, proportionate, and accountable.
Key Proposed Amendments
- Authorized Officers for Content Regulation: Only officers of Joint Secretary rank (or equivalent) in the Central Government, and officers of Director General of Police rank in the states, can issue takedown requests.
- This is intended to prevent misuse of takedown powers by lower-level officials and to fix accountability.
- Legal Justification: Every takedown order must clearly cite:
- Statutory basis (specific provision under IT Act or related laws);
- Precise URLs, posts, or content identifiers; and
- Reasons for removal in writing.
- Ensures decisions are traceable and subject to judicial review.
- Review Mechanism: All takedown actions will undergo a monthly review by a Secretary-level officer to ensure legality, necessity, and proportionality.
- This will enhance transparency and reduce arbitrary censorship.
- Regulation of AI-Generated Content: Introduces a definition of “synthetically generated information”: any content (image, video, or audio) created or modified algorithmically so as to appear authentic.
- Establishes legal clarity on what constitutes AI-manipulated media.
- Labelling of Deepfakes: Platforms must embed visible labels or metadata on AI-generated visuals and audio, covering at least 10% of the visual frame area or of the audio duration.
- Platform Accountability: Significant Social Media Intermediaries (SSMIs) — such as Facebook, Instagram, X, and YouTube — must:
- Obtain a user declaration when potentially AI-altered content is uploaded; and
- Use automated detection tools to identify and tag deepfakes or synthetic media.
- Due Diligence Obligations: Non-compliance with these norms will lead to loss of “safe harbour” protection under Section 79 of the IT Act, 2000.
- Platforms can then be held legally liable for unlawful or misleading content they host.
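The 10% label-coverage floor above can be made concrete with a minimal sketch. The function name and frame dimensions below are illustrative assumptions, not terms from the draft rules; the check simply compares a label's pixel area against one-tenth of the frame's pixel area.

```python
def label_meets_threshold(frame_w, frame_h, label_w, label_h, min_fraction=0.10):
    """Return True if a visible label covers at least min_fraction
    (default 10%, the floor proposed in the draft rules) of the frame area."""
    frame_area = frame_w * frame_h
    label_area = label_w * label_h
    return label_area >= min_fraction * frame_area

# A 1920x1080 frame (2,073,600 px^2) needs a label of at least 207,360 px^2:
print(label_meets_threshold(1920, 1080, 640, 360))  # True  (230,400 px^2)
print(label_meets_threshold(1920, 1080, 320, 180))  # False (57,600 px^2)
```

The same proportional test could apply to audio by substituting clip duration and label duration for the area terms.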
Significance
- Strengthens India’s digital governance framework and online safety ecosystem.
- Aligns with the government’s focus on tackling AI misuse, fake news, and deepfake threats.
- Reinforces trust and transparency in the digital space.
Concerns
- Risk of censorship and overreach by government-appointed fact-check bodies.
- Ambiguity in defining “fake” or “misleading” content.
- Challenges for smaller platforms in complying with technological requirements.
Way Forward
- Need to ensure independent oversight in content moderation.
- Maintain a balance between freedom of speech and user protection.
- Encourage digital literacy and ethical AI use to complement regulatory measures.
Source: TH