MeitY Enforces 24-Hour Removal Rule for NCII

India has rolled out one of its strongest digital safety frameworks to date, mandating the removal of non-consensual intimate imagery (NCII) within 24 hours across all online platforms. Issued by the Ministry of Electronics and IT (MeitY), the new Standard Operating Procedure (SOP) marks the country’s first unified, victim-centric protocol to tackle revenge porn, deepfakes, morphed images, and hidden-camera content — crimes that have surged sharply over the past two years.

The SOP is a direct response to a July 2025 Madras High Court order in the case X vs. Union of India, which demanded a clear, fast, and trauma-sensitive mechanism for victims whose private images are circulated online. MeitY’s framework, now formalised, strengthens obligations under the IT Rules 2021 and gives victims multiple entry points to report abuse.

Victim-First Reporting, 24-Hour Removal

Platforms must now remove or disable access to NCII within 24 hours of receiving a complaint through any of these channels:

  • Direct report to the platform (Instagram, WhatsApp, YouTube, etc.)

  • National Cybercrime Reporting Portal (NCRP) & helpline 1930

  • One-Stop Centres (OSCs) for women

  • Local police stations

The scope of NCII includes real intimate photos, videos, deepfakes, morphed images, hidden-camera footage, cloud leaks, and re-uploads. Failure to comply triggers penalties under the IT Rules, making rapid action non-negotiable.

Hash Matching to Block Re-Uploads

A major breakthrough is the creation of a national “hash bank” managed by the Indian Cybercrime Coordination Centre (I4C). Once flagged, an image’s digital fingerprint is stored securely and used across platforms to prevent reposts — even under new URLs or modified formats.

CDNs must purge cached copies globally, ISPs must block access to offending links, and search engines must de-index NCII, even if the original source has been taken down.
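The SOP does not publish implementation details for the hash bank. The sketch below is an illustrative assumption of how such a registry could work, using SHA-256 as the fingerprint; in practice, catching "modified formats" would require perceptual hashing (PhotoDNA-style similarity matching) rather than a cryptographic hash, which only matches byte-identical re-uploads.

```python
import hashlib

class HashBank:
    """Illustrative hash bank: stores fingerprints of flagged content
    so platforms can detect exact re-uploads, regardless of the URL
    or filename they reappear under. (Algorithm choice is assumed;
    the SOP does not specify one.)"""

    def __init__(self):
        self._fingerprints = set()

    def flag(self, content: bytes) -> str:
        # Compute and store the digital fingerprint of flagged content.
        digest = hashlib.sha256(content).hexdigest()
        self._fingerprints.add(digest)
        return digest

    def is_flagged(self, content: bytes) -> bool:
        # A re-upload of the same bytes matches even under a new URL.
        return hashlib.sha256(content).hexdigest() in self._fingerprints

bank = HashBank()
bank.flag(b"<flagged image bytes>")
print(bank.is_flagged(b"<flagged image bytes>"))  # True
print(bank.is_flagged(b"<other image bytes>"))    # False
```

A shared registry of this kind lets every participating platform check uploads against the same fingerprint set without ever exchanging the underlying imagery itself.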

Seamless Coordination Across Agencies

The SOP establishes a tightly aligned ecosystem:

  • MeitY – oversight and enforcement

  • I4C – hash-sharing and forensic support

  • DoT/ISPs – blocking and link suppression

  • Platforms – 24-hour takedown and victim communication

  • Police/LEAs – criminal action under Section 354C IPC and Sections 66E and 67A of the IT Act

Victims must be notified once takedown is complete — a major shift from the opaque, slow processes that were typical until now.

Nine Abuse Scenarios the SOP Explicitly Covers

The SOP outlines detailed victim pathways for:

  1. Revenge porn by ex-partners

  2. Morphed images circulating on WhatsApp

  3. Hidden-camera recordings leaked online

  4. Intimate videos posted on social platforms

  5. Deepfake sexual content

  6. Cloud storage hacks leading to leaks

  7. Search engine results showing cached NCII

  8. Persistent re-uploads by trolls

  9. Cases where victims cannot file complaints themselves

These examples establish clear, uniform expectations for platforms, law enforcement, and grievance officers.

A Turning Point in Digital Safety Enforcement

Digital rights groups call the SOP a long overdue “structural safeguard” in a country witnessing a 40% year-on-year rise in NCII complaints. In October 2025 alone, NCRP logged more than 4,200 NCII cases, many involving minors or deepfake technology.

MeitY will conduct nationwide awareness campaigns under #SafeDigitalBharat, aiming to reach 100 million citizens by March 2026.

With deepfake abuse rising — I4C detected 1,800 such cases in Q3 2025 — India’s new protocol represents one of the world’s fastest takedown standards. For victims, the message is unambiguous: digital dignity must be protected, and platforms now have 24 hours to act — no exceptions.
