The government recently released new rules to govern the digital media industry, including social media platforms, over-the-top (OTT) platforms and digital news portals. This has sparked an industry-wide debate on whether these rules would benefit the country’s social media ecosystem or instead curtail the freedom to access the internet in the country. Industry leaders share their views on the recently notified IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, the impact of these guidelines on the OTT and social media ecosystem, and the key challenges that stakeholders would face in ensuring compliance…
What are your views on the recently notified IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021?
We are of the view that these rules violate the fundamental rights of speech, expression and access to information. We believe that these rules will fundamentally change the way the internet is used by the Indian population. It is also a matter of concern that the rules were framed without public consultation and were issued in excess of the constitutional authority of the government.
The IT rules are aimed at controlling third-party-generated content over which social media intermediaries have negligible power, thereby affecting society at large. In the absence of an EU-style data protection law or a data protection authority in India, the rules also provide users of social media and OTT platforms with a complaint mechanism in the event the rules are violated.
Like Australia and Canada, the Indian government has also touched upon the publishing of news and current affairs on these platforms by invoking the norms of journalistic conduct of the Press Council of India and the programme code under the Cable Television Networks Act. Further, to the relief of all concerned, fees for news/current affairs content have not been proposed.
In many respects, both social media intermediaries and OTT platforms should breathe easy, as the 2021 rules are less stringent than the path envisaged in the 2018 draft. Among the definite pluses are the user and public transparency requirements, no (or limited) upload filter requirements, due process checks and balances around traceability requests, and the “good Samaritan” principle for voluntary enforcement of platform guidelines.
Dr Mahesh Uppal
The guidelines seek to address problems facing end users as well as government agencies. While the problems are real and important, the solutions on offer seem inadequate and disturbing. Few will deny that rogue players frequently abuse unregulated social media to hurt, cheat, defame, threaten and harm innocent users. Similarly, political extremists propagate hatred and violence that would inevitably worry any government agency. Regulation is clearly necessary. However, it is not always easy to classify online players or their behaviour as potentially harmful to users or the country. Boundaries of free speech and legality are especially difficult to set. The elaborate norms and strict timelines for removal of content are intended to protect users and the government. However, it is unclear how the norms will be implemented at scale, within reasonable time.
Nuance, proportionality, fairness and independence are key to effective regulation of online behaviour. This is where the current rules leave gaps. The approach to end-to-end encryption seems especially worrying since it seriously undermines privacy. The rules should have been announced after thorough consultation with stakeholders, including industry and civil society, who could have pointed out the pitfalls and challenges of implementing poorly conceived regulation. There is also a concern that the rules are intended to apply to players such as news media, which provide curated content and are not intermediaries at all.
How will these rules impact the OTT and social media ecosystem in the country?
These rules will enlarge government oversight over online content uploaded by OTT platforms and by individual users on social media websites, and provide much wider and stricter “takedown” powers to the government than ever. We are likely to see an atmosphere of self-censorship by OTT platforms and excessive censorship by social media platforms. All this, in turn, will have a massive impact on our fundamental rights.
The 2021 rules will impact three key functions of the internet in India: social media, digital news and OTT streaming services.
While the provisions for due diligence obligations and internal grievance redressal expand on the earlier intermediary rules, it is the disabling of access to content within 24 hours of receipt of complaints that may put pressure on service platforms.
While the new rules have addressed concerns regarding a one-size-fits-all approach, I find the distinction between social media intermediaries and significant social media intermediaries intriguing, as is the way the government differentiates the due diligence obligations of the two. Obviously, prominent social media intermediaries will be under a more exacting scanner, making it crucial for the less prominent ones to rethink their strategy. It would also be beneficial if Indian residents were appointed to the posts of chief compliance officer, 24×7 nodal contact person and grievance officer.
Another concern spawned by the 2021 rules is that end-to-end encryption may be put to the test by the requirement to identify the first originator of information. Social media platforms such as Signal, Telegram and WhatsApp will need to identify the “originator” of a message, even though this undermines end-to-end encryption (essential to ensure that online communications remain private). The 2021 rules allot a three-month time frame for compliance, and much of this time is likely to be spent in active negotiations between the government and social media intermediaries.
The rules provide for one or more self-regulating bodies to be formed without deliberating on whether a complaint can be filed only before the self-regulating body of which the publishing company is a member, or before several self-regulating bodies. If it is the latter, what happens if they reach conflicting decisions? Further, there is no embargo on members of the publishing company being part of the self-regulating body, which can give rise to a conflict of interest. It is encouraging that the current provision enabling an intermediary to act on receiving actual knowledge in the form of a court order, or on being notified by the appropriate government or its agencies through an authorised officer, has been retained. In terms of OTT platforms, the focus on self-classification of content and censorship follows a global trend. While there may be debate on whether Facebook, for instance, would be considered a news intermediary, at least there is clarity on the applicable rules. The classification of content follows what several OTT platforms already use – U (Universal), U/A 7+, U/A 13+, U/A 16+ and A (Adult).
Dr Mahesh Uppal
Social media applications, despite their numerous pitfalls, deliver enormous value. There is virtually no community that does not use social media in a significant way. Increasing innovation by players in the ecosystem has consistently added to the potential role – both positive and negative – of social media in people’s social and work lives. The proposed rules have significantly increased the level of reporting and compliance relating to traceability. This could discourage free expression and encourage companies to play safe whenever they perceive risks to their businesses, employees or profits. The rules could also drive fringe users to less mainstream apps that may not have the user base to be classified as significant social media intermediaries.
What have been the global trends, precedents and developments in this area?
To take one example, the UK regulator, OFCOM, has recently implemented a Broadcasting Code, which is supplemented by a set of statutory rules and non-binding guidance for OTT service providers. These rules and guidelines impose similar obligations on service providers, in that they place several a priori restrictions on content with the aim of reducing prohibited material. A key difference here, however, is that this framework is much “softer” in approach than MeitY’s, since OFCOM’s guidelines give the government significantly less discretionary power. Additionally, the UK regulatory body, while approved by the government, is independent of governmental oversight and may therefore allow much freer artistic expression.
India has followed the global trend of allowing self-regulation of OTT platforms. This is undoubtedly encouraging.
On regulating social media, governments across the world are trying to understand (and limit) the power of big tech. The issues being discussed worldwide are the same: illegal and harmful content on the internet and the lack of sufficient accountability. The UK has proposed a two-hour window to take down extremist content, placing unreasonable burdens on tech platforms. Germany’s NetzDG law is an example of a national law that regulates online speech with potentially negative consequences for global freedom of speech. This is a difficult line to tread, where minority groups and political dissidents may become, as one might put it, “collateral damage”. Ultimately, the Indian situation is unique; blindly following “progressive” strategies can prove detrimental, and each must be considered on its own merits.
Dr Mahesh Uppal
Concerns have been expressed in many countries about the abuse of social media, especially the live streaming of extreme and violent acts. Broadly speaking, liberal democracies such as those in Western Europe, North America and Australia have relied on self-regulation, with some exceptions. The focus is on the takedown of illegal content, especially content related to extremism, child pornography, violence or breach of copyright. The Chinese government exercises far more stringent control over social media and does not permit companies such as Facebook, Twitter and WhatsApp to operate in the country. Pakistan and Bangladesh have rules in place to control social media; it is unclear whether these have been enforced or have been effective.
What challenges are OTT content providers, social media platforms, digital/online news portals and telcos expected to face while ensuring compliance with the new rules?
The Intermediaries Rules require compliance with a varied set of obligations, depending on the nature and size of the subject. All intermediaries will be expected to appoint grievance officers and put in place measures for grievance redressal in relation to content posted on their website/mobile application. Large social media intermediaries will be required to appoint two other officers residing in India and develop artificial intelligence tools to automate censorship in certain serious cases, while messaging-related intermediaries will have to develop protocols for the identification of the first originator of all messages sent and delivered through the services. Digital news media and OTT platforms will have to subject themselves to a three-tier content review mechanism, which will be headed by officers of various ministries under the central government.
While challenges are specific to each service provider/platform, certain broad consequences can be recounted. These include:
- Traceability: The thrust is on law enforcement to trace the “first originator” of any message. Several messaging services deploy end-to-end encryption and may not store metadata, the purpose being to enhance the security of their systems and guarantee users their privacy. When the first originator is from overseas, the (significant) intermediary is required to identify the first originator within India. This requirement is overwhelming for service providers. Would they need to store additional sensitive information or break end-to-end encryption? Any of these actions would require a reassessment of the overall impact on user privacy and system security.
- Takedown time: The 2021 rules’ allotment of 24/36 hours for content takedown is a short timeline, as is 72 hours for the sharing of user data. For smaller players in particular, would the allotted time be adequate to analyse requests or seek clarifications? Obviously, taking down and sharing content indiscriminately affects users’ rights, some of which may be guaranteed by the constitution or recognised by courts.
- User verification: In what may be regarded as a good intention gone wrong, user verification may have unintended consequences. Some users may require anonymity, and such a move may put them at risk of becoming targets. Profiling of individuals, cyberstalking and loss of privacy are easy to imagine in a scenario where sensitive data is collected to verify users.
- News regulation: Henceforth, online news websites, online news aggregators and curated audio-visual platforms may require government registration and content takedown compliance. It is possible that the “affected parties” may not really be intermediaries. This seems like an extension of the “intermediary rules”.
Dr Mahesh Uppal
The time and scale of response required from social media companies will be a major challenge. The three months given to companies to comply with the new rules seem short. Grievance redressal will be a major logistical and legal challenge, given the large number of users and potential grievances. Another complication will be the provision in the rules that unresolved grievances will go to government bureaucrats instead of the judiciary.
Are there any other issues that you would like to highlight?
We believe that any rules made in respect of intermediaries and users of such intermediaries should be consistent with constitutional values. We have recommended that these rules be withdrawn and a fresh set of guidelines be issued after public consultation with stakeholders while following proper parliamentary procedure.
Safe harbour protections have fuelled the growth of the internet. Technology has created an open and secure architecture, which facilitates communication and commerce. The 2011 rules that stopped at a “due diligence” obligation for the intermediary triggered the growth of the tech sector during the past decade. The new era of digital media faces new challenges. Will end-to-end encryption end now? Will social media be subjected to increased surveillance? Such burning questions are rife.
Dr Mahesh Uppal
India has extremely high stakes in the digital economy, both as a user and a major player. It cannot afford shortcuts that could hurt its varied interests and aspirations in an area where it has huge strength and resources. It must follow global best practices and resist the temptation to overregulate a key pillar of the economy.