
Facebook moderators getting wrong interpretation of Indian law: NYT

Updated on: 28 December,2018 08:30 PM IST  |  San Francisco
IANS |

One of these documents tells moderators that any post degrading an entire religion violates Indian law and should be flagged for removal


The moderators at Facebook who carry out the critical task of removing dangerous content from its platform rely on material based on incorrect interpretations of certain Indian laws, The New York Times has reported.


The documents that are used to guide Facebook's more than 7,500 moderators span more than 1,400 pages which often contain inaccuracies and outdated information, said the report on Thursday.


One of these documents tells moderators that any post degrading an entire religion violates Indian law and should be flagged for removal.


But the law prohibits such posts only in certain conditions, such as when the intention of the users is to inflame violence, Chinmayi Arun, a legal scholar, told NYT.

Another document for moderators instructs them to "look out for" the phrase "Free Kashmir" - though the slogan, common among activists, is completely legal, the report said.

The moderators are even warned that ignoring posts that use the phrase could get Facebook blocked in India.

According to data compiled by Statista, India has the highest number of Facebook users - 294 million (as of October 2018) - 90 million more than the US, which has the second-biggest user base for the social network.

As Facebook faces severe criticism for allowing extremist content in some countries and censoring legitimate posts in others, the job of the moderators, who must monitor billions of posts every day in over 100 languages, is critical.

Those moderators, at times relying on Google Translate, have mere seconds to recall countless rules and apply them to the hundreds of posts that dash across their screens each day, the NYT report said.

An examination of the documents for moderators by the paper revealed numerous gaps, biases and errors, which are not limited to the interpretation of Indian laws.

For example, the moderators were once told to remove fund-raising appeals for volcano victims in Indonesia just because a co-sponsor of the drive was on the social network's internal list of banned groups.

Similarly, in Myanmar, a paperwork error allowed a prominent extremist group, accused of fomenting genocide, to stay on the platform for months, the report said.

While admitting that perfection was not possible, Facebook's Head of Global Policy Management, Monika Bickert, said that the company had been largely successful in achieving its primary goal of preventing harm.


This story has been sourced from a third-party syndicated feed. Except for the change in the headline, the story has been provided "AS-IS" and "AS AVAILABLE," without any verification or editing from our side. Mid-day accepts no responsibility or liability for the dependability, trustworthiness or reliability of the text. Mid-day management/mid-day.com reserves the sole right to alter, delete or remove (without notice) the content at its absolute discretion for any reason whatsoever.

"Exciting news! Mid-day is now on WhatsApp Channels Subscribe today by clicking the link and stay updated with the latest news!" Click here!


Mid-Day Web Stories

Mid-Day Web Stories

This website uses cookie or similar technologies, to enhance your browsing experience and provide personalised recommendations. By continuing to use our website, you agree to our Privacy Policy and Cookie Policy. OK