IT Ministry mandates label for AI-generated content, reduces takedown timeline to 2–3 hours

11 Feb 2026

Amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021

The Union Government has notified significant amendments to the IT Rules, 2021 (framed under the Information Technology Act, 2000), aimed at regulating AI-generated content and expediting the removal of unlawful material.

Key Changes Notified

  • Reduced Timeline for Takedown:
    • Platforms must act on government or court orders within three hours, down from the previous 36 hours.
    • Content deemed illegal must be taken down within three hours, while sensitive content such as non-consensual nudity and deepfakes must be removed within two hours.
  • Labelling of AI-generated Content:
    • Photorealistic AI-generated content must be prominently labelled.
    • Social media firms are required to seek a disclosure from users on whether their content is AI-generated.
    • If no disclosure is received, firms must either proactively label the content or, in the case of non-consensual deepfakes, remove it.

Definitions and Compliance

  • Synthetically generated content is defined as information created or altered using a computer resource to appear real.
  • AI-generated imagery must be labelled prominently, though platforms have been given some flexibility on the originally proposed requirement that the label cover at least 10% of the content's display area.
  • Failure to comply could result in losing the safe harbour protection, meaning platforms could be held liable like traditional publishers.

Procedural Adjustments

  • States may now notify more than one officer to issue takedown orders, a change intended to accommodate states with larger populations.

Related Terms

Takedown Orders

Official directives issued by government authorities or courts compelling online platforms to remove specific content deemed illegal or harmful from their services. The latest amendments to the IT Rules have significantly reduced the timeline for compliance with these orders.

Safe Harbour Protection

A legal shield for intermediaries that protects them from liability for the content posted by their users, provided they comply with certain regulations. Non-compliance can lead to the loss of this protection.

Deepfakes

Synthetically generated or manipulated media, often videos or audio, that depict someone saying or doing something they did not actually say or do, created using AI techniques.
