The government's recent introduction of new rules under the IT Act requires 'offensive' material on any website to be removed within 36 hours of a complaint. Did the state just arm everyone to shoot the messenger, online?
Let's say a newspaper publishes a contentious piece that deserves to be challenged in a court of law. What would happen if, instead of suing the journalist who wrote the article or the editor who published it, we decided to sue the newspaper boy who delivered the paper? Irrational? According to bloggers and digital media experts, new rules notified under the Information Technology Act 2008 have armed everyone to do exactly that: shoot the messenger, online. Will this challenge our fundamental freedom of speech and expression, and the fabric of the Internet as we know it?
In April this year, the Department of Information Technology (DIT) introduced a new set of rules called Intermediary Due Diligence. Under these rules, every citizen has the right to complain about any piece of digital content to the website hosting it, the ISP serving it, the cybercafe from which it was accessed, and so on (all legally referred to as intermediaries). In other words, any website that carries content (Twitter, Facebook, YouTube, blogs and even newspaper websites) can be sued for the content it carries, even if a third party wrote it.
So, even though over 190 million users worldwide publish over a billion comments a week on the social broadcast medium Twitter, if someone were to find a particular tweet offensive (even if it hasn't been written by an Indian), they can ask Twitter to remove it, failing which they can sue the site.
According to the rules, every intermediary (read: website) is now required to appoint a grievance officer, to whom the offended party can send their complaint. The website then has 36 hours to remove the comment, post or content, failing which it is liable to judicial action.
The website owner no longer possesses the discretion to ignore complaints and uphold the freedom of speech of his site's users without risking liability himself.
Is the rule unconstitutional?
According to Pranesh Prakash, programme manager at the Centre for Internet and Society, Bengaluru, the new rules are unconstitutional.
The pre-existing Section 79 of the IT Act states that intermediaries (that is to say, websites) are not liable for third-party information (such as comments, posts or tweets) as long as they are mere conduits, observe 'due diligence' and don't encourage criminal activity. The new rules were meant to clarify what 'due diligence' meant. A draft of the rules was released in February, and the final rules were added to the IT Act in April.
"The rules have gone far beyond mere clarification. The Department has imposed rules that insist that intermediaries play the role of a judge and executioner on mere complaint, without any opportunity for the other side to be heard," says Prakash.
In a press release issued on May 11 this year, the DIT stated, "The Government adopted a transparent process for formulation of the Rules under the Information Technology Act. The draft rules were published on the Department of Information Technology website for comments [in February] and were widely covered by the media. None of the Industry Associations and other stakeholders objected to the formulation, which is now being cited in some section of media."
However, media analysts disagree. "The DIT was expected to create a public listing of comments submitted. From what we've seen on their website, they haven't," says Nikhil Pahwa, editor of Medianama, a website that offers analyses of news on various forms of media, including the Internet.
Interestingly, some Members of Parliament registered their protest against the draft rules.
Rajya Sabha member Rajeev Chandrashekar registered his protest against the draft rules during Zero Hour on March 22, and received the support of three other MPs: Kumar Deepak Das, P Rajeev and Mahendra Mohan.
His argument was also published on his website: "The execution of these rules could result in a shutting down of the Internet, which is the main form of expression for a growing number of Indians, if the information posted is found inconvenient to Government, institutions or individuals. This would also take away the right to freedom of expression of bloggers and other Internet users in the country. The Government must call for transparent public consultation/public opinion."
Restricting freedom of speech
What's more, say lawyers, the ground on which a person can find a comment offensive is vague and open to interpretation.
Apar Gupta, a lawyer associated with the Software Freedom Law Centre in New Delhi, says, "The grounds to block content are arbitrary. In a situation like this, any intelligent critique, discourse etc can be deemed offensive and no one can do anything about it."
Nor is it mandatory for the website to inform the person who posted the allegedly offensive comment before removing it.
"In a case like this, the so-called violator does not even have the opportunity to be heard or defend himself, which is a violation of the principles of natural justice," adds Prakash.
Websites no longer have the final say in regulating content, as they are legally bound to remove material that has been found offensive. The removed content can be reinstated if the website wins a lawsuit against the complainant, if it chooses to go through with one in the first place.
"Suppose you do not like what I have posted on Twitter, and file a complaint with the grievance officer.
Twitter has two options ufffd remove the content and be safe or keep it and be liable. What do you think is easier for Twitter? Obviously, it wouldn't want to be party to hundreds of lawsuits," explains Gupta.
Before the rules were notified, a police complaint could be registered, a civil suit filed, or a 'nodal officer' (required to be designated in all government departments) approached, which would be followed by a judicial probe. If the content was eventually found offensive, the website would be asked to remove it.
"Now, websites will lose protection from law if they don't take down 'offensive' content. They have no incentive to uphold the freedom of speech of their users. Instead, they have been provided incentives to take down all content about which they receive complaints without applying their minds," Prakash points out.
Then again, in our country where anything from a paragraph in a history textbook to a 15-second jig by a politician can be deemed offensive, analysts fear that the rules can be rampantly misused.
"The rule will be used by conservatives and not liberals. Lots of organised people (political parties, bureaucrats etc) will take down all content against them. People could end up using the rule to challenge a website and making money by agreeing for an out of court settlement," fears Gupta.
"It can become a tool for harassment," Shivam Vij, a member of radical critique blog Kafila, adds tersely.
The new clause has the potential to immediately address truly offensive material, such as child pornography, online grooming of young girls and boys by paedophiles (such as the recent case of Paul Wilson, who was convicted in Birmingham for grooming 20 minors online) and videos taken on the sly (one such case led Rutgers University student Tyler Clementi to commit suicide, after a video of him having sex with a fellow male student was posted online). However, it is a double-edged sword that calls for further debate on what can be posted and what can be removed.
One way to do this is to make information public. If a site is blocked or content removed, there should be a public notice issued and a list should be maintained of all requests for removals or blocks. Also, the reason for removing or blocking a piece of content, and the authority responsible for taking that decision should be made public. When a user visits a blocked site, there should be a notice about the block, and a specific recourse mentioned for getting the block removed.