The report quoted a senior content moderator as saying that Genpact employees were informed they could lose their jobs if they didn't come to the office.
As Covid-19 cases surged in India, third-party content moderators working for Facebook were allegedly pressured to return to the office, a report from the non-profit publication 'Rest of World' has claimed.
In Hyderabad, at least 1,600 people are employed by global professional services firm Genpact to do content moderation for Facebook.
"This summer, even as Covid-19 cases were surging in India, Genpact moderators said they felt pressured by their employer to come back to the office," the report said on Tuesday.
"While most of Facebook's full-time employees remain safe at home, these workers have been forced to choose between their health and their livelihoods," it claimed.
Rest of World said it spoke with four current and former Genpact employees.
"They said moderators were asked -- in some cases as early as July -- to return to the office to tackle sensitive content, including posts involving child exploitation, suicide, and other matter that could lead to real-world harm," the report mentioned.
In a statement given to the publication, Genpact asserted that moderators are being asked to come to the office only on a voluntary basis.
"To make this manageable, safe and clear, employees need to sign a weekly form that asks them to voluntarily agree to this," a company spokesperson told Rest of World.
Facebook responded: "Our focus for reopening any office is on how it can be done in a way that keeps our reviewers safe. To do this, we are putting strict health and safety measures in place, making sure they're followed, and addressing and disclosing any confirmed cases of illness".
The report quoted a senior content moderator as saying that Genpact employees were informed they could lose their jobs if they didn't come to the office.
"The operations team told them these are important orders," said the moderator. "There's a threatening factor behind (it)."
Facebook has more than 30,000 employees working on safety and security -- about half of whom are content moderators.
The social networking giant in May agreed to pay $52 million to third-party content moderators who developed post-traumatic stress disorder (PTSD) and other mental health issues after scanning scores of disturbing images of rape, murder and suicide in order to keep such content off the platform.
According to The Verge, in a preliminary settlement in San Mateo Superior Court, Facebook agreed to pay damages to 11,250 US-based moderators and provide them with additional counselling.
Facebook has hired several firms like Accenture, Cognizant, Genpact and ProUnlimited to help it moderate and remove harmful content in the aftermath of the 2016 US presidential election and Cambridge Analytica data scandal.
Last year, several moderators told The Verge that they had been diagnosed with PTSD after working for Facebook.
Cognizant later announced that it would quit the content moderation business and shut down its sites earlier this year. The company also developed "the next generation of wellness practices" for affected content moderators.