Amendments to IT Rules 2021
Three Hour Digital Takedown Mandate and AI Labelling Norms: The Union Government has notified amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The revised framework will come into force on February 20, 2026.
The amendment tightens compliance obligations for social media intermediaries and digital platforms. It specifically targets deepfakes, unlawful AI-generated material, and non-consensual content.
Static GK fact: The parent legislation is the Information Technology Act, 2000, India’s primary law governing cybercrime and electronic commerce.
Mandatory AI Labelling
Under the new rules, platforms must ensure that AI-generated or synthetically generated information (SGI) carries a clearly visible and prominent label. The earlier proposal of fixing a minimum percentage display area has been removed.
However, once applied, the AI label cannot be removed or suppressed. This measure enhances transparency and enables users to distinguish real content from artificially generated material.
Users must also declare when content is AI-generated. Platforms are required to verify such declarations and ensure proper display of the AI label.
Three Hour Takedown Rule
A major shift is the reduction of the content removal timeline from 36 hours to three hours. This significantly increases compliance pressure on intermediaries.
In cases involving non-consensual intimate imagery, the deadline has been further reduced to two hours. The objective is to prevent rapid virality of harmful content in the digital ecosystem.
Failure to comply within the prescribed time may lead to the loss of safe harbour protection.
Understanding Safe Harbour
Safe harbour refers to legal immunity granted to intermediaries for user-generated content, provided they exercise due diligence.
If platforms fail to remove unlawful material within the new three-hour window, they risk losing this protection under the IT Act, 2000. This exposes them to direct legal liability.
Static GK Tip: Section 79 of the IT Act deals with intermediary liability and safe harbour provisions.
Definition of Synthetically Generated Information
The amendment clarifies the meaning of Synthetically Generated Information (SGI). Routine editing done in good faith for assistive or quality-enhancing purposes does not qualify as SGI.
However, if a platform becomes aware that its services are being used to create unlawful SGI, it must take expeditious action. This may include removal, disabling access, or suspension of accounts.
Technical Safeguards and Global Context
Intermediaries must deploy reasonable technical measures to prevent misuse of AI tools. They are also required to block SGI that misrepresents real-world events or impersonates individuals.
The amendments come amid growing global concern over deepfake technology and AI misuse. Several countries are exploring regulatory frameworks to balance innovation with digital safety.
India’s amended IT Rules, 2021 signal a stronger regulatory stance aimed at enhancing accountability in the digital ecosystem.
Static GK fact: India has over 800 million internet users, making it one of the largest digital markets globally, increasing the importance of effective content regulation.
Static Usthadian Current Affairs Table
Three Hour Digital Takedown Mandate and AI Labelling Norms:
| Topic | Detail |
|---|---|
| Parent Law | Information Technology Act, 2000 |
| Key Amendment | IT Rules 2021 Amendment notified |
| Effective Date | February 20, 2026 |
| Major Reform | Three-hour content takedown rule |
| Special Provision | Two-hour removal for non-consensual imagery |
| AI Transparency | Mandatory prominent AI labels |
| Legal Provision | Section 79 – Safe Harbour |
| Regulatory Focus | Deepfakes and Synthetically Generated Information |