As per Section 2(1)(w) of the IT Act, an "intermediary" is any person who, on someone else's behalf, receives, stores, or transmits electronic records, or provides any service with respect to those records. For example, Internet service providers, social media platforms, and online marketplaces are intermediaries because they help users send or share information, but they are not the original creators of that information. The IT Act envisions distinct standards for such entities, since they do not exercise autonomy over the electronic records they handle and instead operate on the basis of a defined relationship with the party on whose behalf they act. That is, while they are central to all forms of digital communication, they do not create or control the information that they transmit (Jitendra Singh Yadav v. Union of India (2017)).
The Safe Harbour Principle (Section 79)
The question of how far an intermediary should be held liable for the content it transmits was first considered, in the context of an Internet Service Provider ("ISP"), by the U.S. District Court for the Northern District of California in the landmark Religious Technology Center v. Netcom On-Line Communication Services, Inc. (1995) judgment, which held that an ISP is a passive service and, like a telephone company, cannot be held liable (in that case, for copyright infringement) for content transmitted through its servers. Statutory protection of intermediaries in this manner is known as "safe harbour."
The IT Act as enacted in 2000 likewise envisioned protection only for ISPs (and other network service providers), a position changed by the 2008 amendment, which broadened the effect of Section 79 to extend safe harbour beyond network service providers.
Section 79(1) provides that an intermediary shall not be liable for any third-party information, data, or communication link hosted or made available by it, subject to the conditions laid out under Sections 79(2) and (3).
Section 79(2) states that an intermediary is protected under safe harbour if its function is limited to providing access to a communication system over which third-party information is transmitted or stored, and it does not initiate the transmission, select the receiver, or modify the information. The intermediary must also observe due diligence and comply with any guidelines prescribed by the Central Government.
Section 79(3) goes on to state that safe-harbour protection will not apply if the intermediary has conspired, abetted, aided, or induced the commission of the unlawful act connected to the third-party information, or if, upon receiving actual knowledge or being notified that information residing in or connected to a computer resource controlled by it is being used to commit an unlawful act, the intermediary fails to expeditiously remove or disable access to that material.
The Supreme Court in Shreya Singhal v. UoI (supra) also clarified that intermediaries are not required to monitor the content they transmit, and will not be held liable merely because any of the content they transmit or process violates the law. They are required to take down content only upon receiving a court order or a notification from a government authority (or other competent authority), and only this constitutes "actual knowledge."
However, challenges emerge in determining whether an entity falls under the definition of "intermediary" for the purposes of the Act. The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 attempt to provide additional clarity on the matter.
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 ("Intermediary Rules")
Legislative Background
The Intermediary Rules were framed in exercise of the powers conferred by Sections 87(2)(z) and 87(2)(zg) of the Information Technology Act, 2000. They were notified by the Central Government via G.S.R. 139(E) on 25 February 2021, and have since undergone amendments, most recently updated by G.S.R. 264(E) dated 6 April 2023.
These Rules supersede the earlier Information Technology (Intermediaries Guidelines) Rules, 2011, and represent a significant overhaul of the regulatory architecture applicable to intermediaries and digital media entities in India. The stated objectives of the Rules are threefold:
- To prescribe due diligence obligations that intermediaries must observe in order to qualify for the legal immunity available under Section 79 of the IT Act;
- To promote greater accountability of social media and digital platforms, particularly those with wide reach and influence;
- To establish a co-regulatory enforcement mechanism for publishers of news and curated content on online platforms, including digital news media and OTT streaming services, under a newly introduced Code of Ethics.
The Rules attempt to strike a balance between enabling innovation and safeguarding constitutional rights, particularly freedom of speech, as well as user safety in an increasingly complex online ecosystem.
The Intermediary Rules introduce a layered regulatory approach by classifying intermediaries into distinct categories based on the nature and scale of their operations. Rule 2(l) defines intermediary in alignment with Section 2(1)(w) of the IT Act. It includes entities that store or transmit information on behalf of others or provide services related to such transmission.
Rule 2(v) introduces a special class of intermediaries called Significant Social Media Intermediaries ("SSMIs"). As per Rule 2(v) read with Rule 6, an SSMI is any social media intermediary whose number of registered users in India exceeds the threshold notified by the Central Government, currently set at 50 lakh (5 million). These intermediaries are subject to additional compliance requirements under Rule 4.
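By way of illustration, the classification turns on a single notified threshold. The short sketch below is a hypothetical model only (the names SSMI_THRESHOLD and classify_intermediary are assumptions for illustration, not anything found in the Rules); it simply encodes the 50 lakh registered-user figure and the tier of obligations that follows from it.

```python
# Hypothetical sketch of SSMI classification under the Intermediary Rules.
# Names and structure are illustrative assumptions; the Rules prescribe no code.

SSMI_THRESHOLD = 5_000_000  # 50 lakh registered users in India, as currently notified


def classify_intermediary(registered_users_in_india: int, is_social_media: bool) -> str:
    """Return the tier of due diligence obligations that would apply."""
    if is_social_media and registered_users_in_india >= SSMI_THRESHOLD:
        # Significant Social Media Intermediary: baseline Rule 3 duties
        # plus the additional obligations under Rule 4.
        return "SSMI (Rule 3 + Rule 4)"
    # Every other intermediary: baseline due diligence under Rule 3 only.
    return "Intermediary (Rule 3)"


print(classify_intermediary(60_000_000, is_social_media=True))  # SSMI (Rule 3 + Rule 4)
print(classify_intermediary(1_000_000, is_social_media=True))   # Intermediary (Rule 3)
```

Whether the boundary case of exactly 50 lakh users falls within the definition depends on the wording of the notification; the sketch treats the threshold as inclusive for simplicity.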
Due Diligence Obligations
The safe harbour protection granted to intermediaries under Section 79 of the IT Act is not absolute; it is conditional upon the intermediary observing due diligence as prescribed by the Central Government. These requirements were significantly expanded by the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, notified under Sections 87(2)(z) and 87(2)(zg) of the Act.
The Intermediary Rules apply to all intermediaries, but impose an additional layer of obligations on a class of entities termed Significant Social Media Intermediaries ("SSMIs"), namely those providing messaging or social media services with a registered user base in India above the notified threshold of 50 lakh (5 million). The due diligence obligations under the Intermediary Rules form the basis on which intermediaries retain or lose their immunity under Section 79.
Duties of All Intermediaries
Rule 3 of the Intermediary Rules outlines the baseline duties applicable to all intermediaries, including:
- Every intermediary must appoint a Grievance Officer and publish their contact details and complaint-handling mechanism on their website or app. Complaints must be acknowledged within 24 hours and resolved within 15 days.
- Intermediaries must publish clear terms of service that prohibit users from uploading or sharing content that is unlawful, defamatory, obscene, invasive of privacy, or violates intellectual property or other legal rights.
- Upon receiving a court order or a notification from the government or its agency under Section 79(3)(b), intermediaries must take down unlawful content within 36 hours.
- Intermediaries are required to retain user data and records of removed content for 180 days after account termination or content takedown.
Failure to comply with any of these duties may result in loss of the legal immunity provided under Section 79.
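The timelines in Rule 3 can be made concrete with a short sketch. The helper below is a hypothetical illustration (the constant and function names are assumptions, not part of the Rules); it computes the acknowledgment, resolution, takedown, and retention deadlines from the relevant trigger events.

```python
# Hypothetical helper illustrating the Rule 3 timelines summarised above.
# The durations mirror the Rules; names and structure are illustrative only.

from datetime import datetime, timedelta

ACKNOWLEDGE_COMPLAINT_WITHIN = timedelta(hours=24)  # acknowledge grievances within 24 hours
RESOLVE_COMPLAINT_WITHIN = timedelta(days=15)       # resolve grievances within 15 days
TAKE_DOWN_WITHIN = timedelta(hours=36)              # remove unlawful content within 36 hours of an order
RETAIN_RECORDS_FOR = timedelta(days=180)            # retain records for 180 days after takedown/termination


def rule3_deadlines(complaint_received: datetime, takedown_order_received: datetime) -> dict:
    """Compute the key Rule 3 deadlines for a grievance and a takedown order."""
    return {
        "acknowledge_complaint_by": complaint_received + ACKNOWLEDGE_COMPLAINT_WITHIN,
        "resolve_complaint_by": complaint_received + RESOLVE_COMPLAINT_WITHIN,
        "remove_content_by": takedown_order_received + TAKE_DOWN_WITHIN,
        # Retention is measured here from the takedown event for simplicity;
        # the Rules tie it to content takedown or account termination.
        "retain_records_until": takedown_order_received + RETAIN_RECORDS_FOR,
    }


received = datetime(2024, 1, 1, 9, 0)
for name, deadline in rule3_deadlines(received, received).items():
    print(f"{name}: {deadline:%d %b %Y %H:%M}")
```

In practice each deadline would be tracked against its own trigger event (complaint receipt, order receipt, account termination); the sketch collapses them for brevity.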
Additional Obligations for SSMIs
For intermediaries classified as SSMIs, the Rules impose enhanced due diligence obligations under Rule 4. These include:
- Compliance Personnel Based in India: SSMIs must appoint three officers based in India:
- A Chief Compliance Officer, responsible for ensuring compliance with the IT Act and Rules.
- A Nodal Contact Person, for coordination with law enforcement.
- A Resident Grievance Officer, who handles user complaints and publishes periodic compliance reports.
- Monthly Compliance Reports: SSMIs are required to publish reports disclosing the number of complaints received, actions taken, and content removed proactively.
- Voluntary User Verification: Platforms must enable users in India to voluntarily verify their identity using appropriate mechanisms (such as mobile numbers). Verified users must be provided with visible identification marks.
- Automated Content Moderation: SSMIs must endeavour to deploy automated tools to proactively identify content depicting rape or child sexual abuse, as well as content identical to material previously removed as unlawful or harmful.
- Traceability Requirement for Messaging Services: Messaging-based SSMIs (like WhatsApp, Signal, etc.) must enable identification of the first originator of information upon an order by a court or competent authority under Section 69 of the IT Act. This requirement has been subject to significant legal and technical criticism, especially due to its implications for end-to-end encryption (a simplified, hypothetical sketch follows this list).
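To illustrate why the traceability mandate is contested on technical grounds, the sketch below models one simplified approach that has been discussed publicly: keeping a registry that maps a hash of message content to the user who first sent it, so that a later order can be answered by a hash lookup. This is a hypothetical illustration only; the registry, function names, and hashing scheme are assumptions, not anything prescribed by the Rules.

```python
# Hypothetical, simplified illustration of a "first originator" lookup.
# Nothing here is prescribed by the Rules; it only shows that such a lookup
# requires retaining a per-message identifier tied to the original sender.

import hashlib

# Maps a content hash to the user who first transmitted that content.
first_originator_registry: dict[str, str] = {}


def record_message(sender_id: str, plaintext: str) -> None:
    """Record the first sender of a given piece of content, keyed by its hash."""
    digest = hashlib.sha256(plaintext.encode("utf-8")).hexdigest()
    # Only the first occurrence is kept, matching the "first originator" idea.
    first_originator_registry.setdefault(digest, sender_id)


def identify_first_originator(plaintext: str) -> str | None:
    """Answer a (hypothetical) traceability order by looking up the content's hash."""
    digest = hashlib.sha256(plaintext.encode("utf-8")).hexdigest()
    return first_originator_registry.get(digest)


record_message("user_a", "forwarded message")
record_message("user_b", "forwarded message")  # the same content forwarded again
print(identify_first_originator("forwarded message"))  # -> user_a
```

The difficulty highlighted by critics is that an end-to-end encrypted service never sees the plaintext, so any such registry would require either access to message content or client-side reporting of message fingerprints, which is why the requirement is seen as sitting uneasily with end-to-end encryption.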
Legal Controversies and Practical Concerns
The Intermediary Rules have been challenged before various High Courts, with critics arguing that they exceed the delegated powers under the IT Act and impose disproportionate compliance burdens, especially on private messaging platforms. Concerns have also been raised regarding freedom of speech, surveillance risks, and the feasibility of traceability mandates.
Despite these concerns, the Rules remain in effect, and intermediaries must comply with their obligations to retain the safe harbour protections under Section 79. For social media platforms operating at scale in India, this has meant significant restructuring of internal compliance functions and user-facing policies.
This section continues with detailed coverage of the Digital Media Ethics Code and key case law...