28.11.2025
In November 2025, the Supreme Court asked the Ministry of Information & Broadcasting to set up an independent regulator for user-generated content, citing rising obscene, harmful, misleading, and AI-manipulated material on social media.
UGC on platforms like YouTube often includes vulgar, obscene, violent, defamatory, or extremist material that remains online for long periods and garners massive views before action is taken.
Exposure to explicit content distorts adolescents' perceptions; studies link aggressive pornography with violent behaviour. Women, children, and rural users remain more exposed and less protected.
Political dissent is protected, but content that incites violence at scale or inflames public sentiment raises safety concerns and contributes little to democratic dialogue.
AI-generated deepfakes and synthetic media accelerate misinformation and harmful behavioural patterns.
Platforms often delay takedowns and avoid accountability for harmful UGC.
The court proposes a regulator with judicial, technical, and domain experts, free from state or corporate influence.
The court recommended seeking inputs from the NBDA, civil society, digital rights groups, and the platforms themselves.
Regulation must curb harmful content without suppressing free speech.
The court stressed the need for clear, age-appropriate warnings on sensitive material.
A dedicated UGC regulator is essential to curb harmful online content without undermining free expression. Transparent, independent, and tech-enabled oversight can create safer digital spaces and balance constitutional rights with user protection.