Meta will start removing sensitive ad-targeting options related to health, race or ethnicity, political affiliation, religion or sexual orientation from January 19, 2022. According to the company, it has heard concerns from experts that targeting options like these could be used in ways that lead to negative experiences for people in underrepresented groups.
Facebook's parent company Meta has announced that it will remove sensitive ad-targeting options related to health, race or ethnicity, political affiliation, religion or sexual orientation starting January 19, 2022.
It is important to note that the interest-targeting options being removed are not based on people's physical characteristics or personal attributes, but on things like their interactions with content on the platform.
"Starting January 19, 2022 we will remove Detailed Targeting options that relate to topics people may perceive as sensitive, such as options referencing causes, organisations, or public figures that relate to health, race or ethnicity, political affiliation, religion, or sexual orientation," the company said in a statement.
According to the company, it has heard concerns from experts that targeting options like these could be used in ways that lead to negative experiences for people in underrepresented groups.
"Some of our advertising partners have expressed concerns about these targeting options going away because of their ability to help generate positive societal change, while others understand the decision to remove them," the firm added.
Meta said earlier this month that it had removed over 30 million pieces of content in September on Facebook and Instagram in India, as it faces intense scrutiny over user data privacy.
The social network acted upon 26.9 million pieces of content across 10 policies for Facebook and over 3.2 million pieces of content across 9 policies for Instagram in compliance with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, the company said in its monthly report.
"This report will contain details of the content that we have removed proactively using our automated tools and details of user complaints received and action taken," a Meta spokesperson said in a statement.
"In accordance with the IT Rules, we have published our fourth monthly compliance report for the period for 30 days (01 September to 30 September)," the spokesperson added.