Critically Analysing the MeitY’s Proposed Amendments to the IT Rules 2021

On 22nd October 2025, the Ministry of Electronics and Information Technology (“MeitY”) published the draft 2025 amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules 2021”), and invited stakeholder inputs / public comments thereon. These amendments (the import of which is dealt with below) evidently flow from, and are in furtherance of, a larger objective of counteracting the rampant rise of deepfakes, misinformation and other unlawful content, specifically through synthetically generated information (“SGI”) / AI-generated content (see the Explanatory Note published by MeitY accompanying these proposed amendments).

In this vein, the objective itself of these proposed amendments can hardly be faulted. While generative AI tools undoubtedly present vast potential and are often put to positive and productive use, there are also innumerable instances where such tools have been misused to generate and disseminate fake, harmful or misleading information and/or content, including non-consensual intimate or obscene imagery, fake political news, content in the nature of impersonation, etc. To this extent, the overarching intent behind these proposed amendments, i.e., providing a legal framework for labelling and enhancing traceability and accountability related to SGI, is understandable and perhaps even laudable. However, the manner in which these counteractive measures are sought to be effectuated has raised valid substantive and procedural concerns, as briefly outlined below.

Nature of the Proposed Amendments

Before delving into the aforesaid concerns, it is imperative to understand the import and exact scope of the material proposed amendments, viz.:

a) Introduction of definition of ‘synthetically generated information’, as any ‘information which is artificially or algorithmically created, generated, modified or altered using a computer resource, in a manner that such information reasonably appears to be authentic or true’ (proposed Rule 2(1)(wa));

b) Introduction of additional proviso to Rule 3(1)(b) of the IT Rules 2021, to stipulate that voluntary or suo motu removal of SGI by an intermediary, as part of its reasonable efforts, shall not violate the pre-conditions to an intermediary’s safe harbour as specified under Section 79(2) of the Information Technology Act, 2000 (“IT Act”);

c) Placing a positive due diligence obligation on intermediaries to ensure that all SGI offered on their computer resource is prominently labelled or embedded with permanent unique metadata or an identifier – covering at least 10% of the surface area of the visual display, or the initial 10% of the duration of an audio transmission, of the SGI. The intermediary must also ensure that such label, permanent unique metadata or identifier cannot be modified, suppressed or removed;

d) Requiring significant social media intermediaries / SSMIs (i.e., intermediaries with more than 50 lakh registered users) to, inter alia:

  • Seek mandatory declaration from users about whether information uploaded by them is SGI;
  • Deploy reasonable, appropriate and proportionate technical measures, including automated tools or other suitable mechanisms, to verify the accuracy of such declaration; and
  • Where the information is declared or technically verified to be SGI, take reasonable and proportionate measures to ensure that the same is clearly and prominently displayed with an appropriate label or notice, indicating that the content is synthetically generated.

e) Notably, if an SSMI becomes aware of such SGI and knowingly permits, promotes or fails to act upon it in the manner mandated by the proposed amendments, the SSMI shall be deemed to have failed to exercise due diligence and would thereby risk losing its safe harbour protection.

Fundamental Concerns behind the Proposed Amendments

While the proposed amendments suffer from various practical and implementational gaps, they are arguably flawed at a far more fundamental level. Specifically, under the garb of delegated rule-making flowing from Section 87(2)(zg) of the IT Act, the Central Government has sought to drastically overhaul the legal framework governing intermediaries and the safe harbour protection concomitantly and consciously afforded to them by the legislature.

The current legal framework governing intermediaries broadly envisages a knowledge-based post facto redressal mechanism, i.e., an intermediary shall act upon information or content hosted on its computer resource only after receiving ‘actual knowledge’ (jurisprudentially and/or statutorily construed to refer to court orders, executive orders or notices issued by law enforcement agencies). The language of Section 79 of the IT Act echoes this: sub-section (1) grants safe harbour to an intermediary provided, inter alia, that it expeditiously removes or disables access to unlawful material ‘upon receiving actual knowledge’ of the illegal / unlawful nature thereof (see sub-section (3)). Pre-screening or policing of an intermediary’s computer resource has, as such, largely been considered antithetical to this framework.

The proposed amendments, however, depart significantly from this position and place direct, proactive obligations on intermediaries: they must first pre-screen SGI, then ensure that such SGI carries the necessary labels, metadata or unique identifiers, and, if it does not, disable access to it. Likewise, SSMIs are tasked with implementing technical measures to achieve similar results vis-à-vis SGI, failing which they are disentitled from invoking safe harbour.

In doing so, the proposed amendments ex facie alter the statutorily-recognised limited role of intermediaries vis-à-vis content hosted on or transmitted through their platforms, and task them with obligations which patently contradict their core functionality and the statutory mandate placed on them under Section 79(2) of the IT Act (to, inter alia, not modify content transmitted through their platform / computer resource). What is more fascinating is that MeitY is seemingly mindful of this contradiction and seeks to resolve it by clarifying that an intermediary will not lose safe harbour for voluntarily removing or disabling access to SGI which does not conform to the proposed amendments’ requirements. This turns the logic of safe harbour on its head: protection is now granted under the rules because of active content moderation, rather than because of a platform’s presumed lack of control over, or involvement with, the content it hosts.

Thus, at a threshold level, the proposed amendments are rife with rudimentary concerns and paradoxes, raising the question of whether this form of delegated rule-making travels well beyond the power conferred under the parent legislation.

Operational Uncertainties and Potential Chilling Effect

Keeping aside the fundamental objection that the proposed amendments are arguably beyond the rule-making powers delegated to the Central Government, various practical concerns emerge vis-à-vis the operationalisation of these amendments. These include:

a) The implicit, and potentially erroneous, assumption that developers of genAI tools or models qualify as ‘intermediaries’ under the IT Act. As fleshed out in detail by one of our regular contributors, Akshat Agrawal, here, cogent arguments may be made both for and against the proposition that developers of such tools / models actually operate as ‘intermediaries’ in the traditional sense. However, rather than hurriedly resolving this debate through a rule-making process (the validity of which is questionable for the foregoing reasons), this fundamental question necessitates a larger discussion and legislative resolution (through appropriate amendments, for instance, to the definition of an ‘intermediary’ under the IT Act).

b) The exact scope and contours of SGI – the proposed definition is broad enough to encompass even minimal or surface-level modifications to content through algorithmic filters, yet that does not seem to be the harm that MeitY is concerned with and seeks to redress through these amendments (as per its own Explanatory Note). This ambiguity, in turn, will undoubtedly cause operational uncertainty and hamper ease of doing business for various entities / platforms which, to some extent, incorporate or provide algorithmic tools for users to modify content.

c) The proposed amendments impose overbroad and vague verification obligations on SSMIs. No clarity is provided on what ‘reasonable’ or ‘proportionate’ measures to ensure labelling of SGI are, the level of accuracy required in suo motu verification of SGI, etc. In this sea of uncertainty, and faced with onerous penalties or ramifications for non-compliance, intermediaries may opt to err on the side of caution and indulge in over-labelling and/or excessive removal of unlabelled content which may be synthetically generated, thereby impairing the end-user’s ability to receive and enjoy such content and/or resulting in a chilling effect whose adverse impact on free speech far outweighs the purported harm caused by the availability of SGI.

On a more nuanced level, the proposed amendments also raise concerns regarding the violation of authors’ moral rights (for instance, an author may allege that superimposing a prominent label covering at least 10% of the surface area of an image created by the author, and embellished through algorithmic filters, amounts to mutilation of the image), subjective assessments of what kinds of SGI may be construed as misleading viewers as to their truth (and thereby requiring action from intermediaries), whether such SGI is purposefully satirical or hyperbolic (and would thereby not be perceived as true or mislead viewers), etc.

In conclusion, while MeitY’s proposed amendments to the IT Rules 2021 take a well-intentioned step towards curbing the undisputed misuse of SGI and AI-generated content, they are beset by significant substantive and procedural drawbacks, and rife with practical ambiguities. Ultimately, meaningful reform in this framework requires an open, deliberative legislative process, clear statutory guidance, and a balanced consideration of both technological realities and constitutional rights, rather than rushed rule-making that may overstep delegated authority and stifle innovation.

Views expressed here are personal.

Image source: here