India’s Three-Hour AI Content Takedown Rule: Legal Impact on Digital Platforms and Businesses
The rapid evolution of Artificial Intelligence has transformed digital communication, enabling automated content creation, synthetic media production, and AI-assisted public engagement. However, the same technology has also facilitated the spread of deepfakes, impersonation, misinformation, and digitally manipulated narratives. Recognising these risks, the Government of India has introduced a stricter compliance mandate under the Information Technology regulatory framework. Significant social media intermediaries are now required to remove flagged unlawful AI-generated content within three hours of receiving a valid notice from the competent authority. This development marks a significant tightening of intermediary liability standards and signals a stronger approach toward digital accountability.

Legal Framework Governing Intermediaries in India

Digital platforms operating in India are governed by the Information Technology Act, 2000, along with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.…

