Balancing Safe Harbour: Insights into India’s Draft IT Rules, April 2026
India’s Draft IT Rules 2026 propose a structural shift in how digital speech is governed. While framed as clarificatory and procedural, the fine print suggests otherwise, raising legitimate questions about regulatory architecture, constitutional safeguards, and the future of online expression.
OPINION


India today has over 750 million internet users. Every day they post, share, comment, and critique with relative freedom. They do so because platforms have a legal shield. That shield is now being redesigned. How it is redesigned matters more than most headlines have conveyed.
In March this year, the Ministry of Electronics and Information Technology released the Draft IT (Intermediary Guidelines and Digital Media Ethics Code) Second Amendment Rules, 2026, describing them as “clarificatory and procedural in nature.” That makes it worth decoding what those clarifications actually do.
What the amendment proposes
The centrepiece of the amendment, and the provision carrying the most regulatory weight, is Rule 3(4). It would make platform compliance with MeitY-issued advisories, directions, and standard operating procedures a condition for retaining safe harbour protection under Section 79 of the IT Act. Safe harbour is the legal protection that allows platforms to host user-generated content without liability for every post. Without it, platforms face existential legal exposure in India. The compliance logic Rule 3(4) creates is therefore not optional in any practical sense.
What the Supreme Court established
In Shreya Singhal vs Union of India (2015), the Supreme Court held that platforms must act on unlawful content only upon a court order or a government notification grounded in law, and Rule 3(4) creates a compliance obligation requiring neither. Advisories are not court orders. Standard operating procedures are not statutory notifications.
The Bombay High Court has stayed provisions of the IT Rules 2021 on similar grounds. The Madras High Court has flagged concerns about editorial oversight undermining media independence. These proceedings remain pending. The new draft does not wait for their resolution.
The practical consequence is straightforward. A platform receives a ministry advisory, possibly invoking the three-hour takedown window introduced under the Synthetically Generated Information Rules notified in February 2026. Contesting the advisory risks losing safe harbour entirely. Immediate removal becomes the only rational choice. Legal scholars call this over-compliance. Experts have documented the three-hour window as among the most compressed takedown timelines anywhere in the world.
The scope problem
The amendment extends the Code of Ethics framework, previously applicable only to registered news publishers, to any individual sharing “news and current affairs” content online. The term is not defined. Opinion posts, political satire, and citizen journalism all fall into an undefined grey zone.
The response from media bodies has been swift. In April, six press bodies including the Editors’ Guild of India, DIGIPUB, and the Press Club of India formally demanded the amendment’s complete withdrawal. The compliance burden, they noted, is financially unsustainable for independent creators and freelance journalists.
The government’s case
To be clear, the regulatory intent is grounded in documented problems. Deepfakes targeting public figures, non-consensual intimate imagery, and viral disinformation are real harms that the existing framework handles poorly. The February 2026 SGI Rules, which mandated watermarking and provenance disclosure for AI-generated content, represent a targeted, proportionate policy effort. That approach offers a useful design reference. The challenge with the Second Amendment Rules is not the policy direction. It is the breadth of the instrument relative to the precision of its objectives.
What a more durable framework could look like
Four changes would improve both regulatory effectiveness and constitutional durability. Delegated legislation governing 750 million users warrants structured oversight.
First, the term “news and current affairs” must be defined through a statutory instrument. Precision in scope prevents arbitrary application and reduces litigation risk.
Second, Rule 3(4) should be anchored to formal directions under Section 69A of the IT Act, which already incorporates procedural safeguards and judicial oversight. That would preserve enforceability without bypassing the Shreya Singhal threshold.
Third, data retention requirements under Rules 3(1)(g) and (h) should include proportionality limits, rather than imposing an open-ended 180-day minimum.
Fourth, establish a Parliamentary Standing Committee review mechanism for rules under Section 87 that carry substantive speech implications.
What comes next
As of today, the rules have not been notified. Pending challenges before the Bombay, Delhi, and Karnataka High Courts will eventually produce judicial clarity. The more efficient path is to produce regulatory clarity first.
India’s digital public sphere has grown precisely because it has made room for a diversity of voices. Some regulation of that space is both necessary and appropriate. Good internet governance and constitutional safeguards are not opposing objectives. The instruments chosen to govern online speech must be precise enough to achieve their goals, grounded enough in law to survive scrutiny, and accountable enough in process to earn public confidence.


Debodipta Nandan is an independent public policy professional working at the intersection of digital governance and strategic communications. She writes on regulation, media, and emerging technology. Views expressed are personal.
