
Regulating User-Generated Internet Content

28.11.2025


Context

In November 2025, the Supreme Court asked the Ministry of Information & Broadcasting to set up an independent regulator for user-generated content, citing rising obscene, harmful, misleading, and AI-manipulated material on social media.

 

Key Concerns

  • Proliferation of Harmful Content

UGC on platforms like YouTube often includes vulgar, obscene, violent, defamatory, or extremist material that remains online for long periods and garners massive views before action is taken.

  • Impact on Vulnerable Groups

Adolescents face distorted perceptions due to explicit content; studies link aggressive pornography with violence. Women, children, and rural users remain more exposed and less protected.

  • Borderline Content & Dissent

Political dissent is protected, but inciting or inflammatory content raises safety concerns and may not contribute to democratic dialogue.

  • Technology-Driven Risks

AI-generated deepfakes and synthetic media accelerate misinformation and harmful behavioural patterns.

 

Legal Framework

Constitutional Provisions

  • Article 19(1)(a): Freedom of speech and expression.
     
  • Article 19(2): Reasonable restrictions on the grounds of decency, morality, public order, defamation, and security.
     

Statutory Framework

  • IT Act, 2000: Sections 67 and 67A penalise publishing or transmitting obscene or sexually explicit content in electronic form.
     
  • IT Rules, 2021: Require intermediaries to exercise due diligence, provide grievance redressal, and follow takedown protocols.
     
  • Safe Harbour (Sec. 79): Protects intermediaries from liability for third-party content, provided due diligence obligations are met.
     

Proposed Amendments

  • Age-based content ratings (U, U/A, A).
     
  • Stricter rules on obscenity, explicit material, and harmful “anti-national” content.
     
  • Ethical norms for AI, deepfakes, and manipulated media.
     

 

Challenges & Court Directions

  • Ineffective Self-Regulation

Platforms often delay takedowns and avoid accountability for harmful UGC.

  • Need for an Independent Authority

The Court proposes a regulator comprising judicial, technical, and domain experts, free from state or corporate influence.

  • Stakeholder Consultations

The Court recommended consultations with the NBDA, civil society, digital rights groups, and platforms.

  • Balance Between Rights & Safety

Regulation must curb harmful content without suppressing free speech.

  • Content Warnings & Safety Labels

Courts stress clear, age-appropriate warnings for sensitive material.

 

Way Forward

  • Establish a statutory, independent regulator for UGC oversight and appeals.
     
  • Strengthen platform duties on proactive monitoring, age-gating, and takedown timelines.
     
  • Use advanced AI for detecting deepfakes, child abuse content, and violent material.
     
  • Boost digital literacy for identifying harmful or manipulated content.
     
  • Ensure proportionate regulation aligned with Article 19(2) while protecting legitimate dissent.
     

 

Conclusion

A dedicated UGC regulator is essential to curb harmful online content without undermining free expression. Transparent, independent, and tech-enabled oversight can create safer digital spaces and balance constitutional rights with user protection.

 
