Meta Removes Over 18 Million Pieces of Bad Content in India

Meta's decisive action in November 2023 to remove over 18 million pieces of 'bad content' in India highlights its commitment to digital safety and compliance with IT Rules 2021.

Meta, the parent company of Facebook and Instagram, has taken significant strides in content moderation across its platforms in India, removing over 18 million pieces of content in November 2023. This action falls under Meta’s commitment to comply with India’s IT Rules 2021, which require large digital and social media platforms to publish monthly compliance reports.

The takedowns adhere to the digital ethics framework outlined by the Indian government. Alongside them, the company engaged actively with the Indian grievance mechanism, receiving tens of thousands of user reports and resolving a substantial share of them directly by providing users with tools to manage their concerns.

Meta’s report details the removal of over 18.3 million pieces of content from Facebook and more than 4.7 million from Instagram. These figures reflect Meta’s content moderation policies, which span 13 policy categories for Facebook and 12 for Instagram. The company’s efforts are part of a larger strategy to combat the spread of misinformation, hate speech, and other content that violates its community standards.

The process includes a comprehensive grievance mechanism that allows users to report content they find objectionable. In November alone, Facebook received over 21,000 reports through this channel, and Meta provided resolutions in over 10,700 cases, whether by guiding users through self-remediation flows, offering tools to download their data, or resolving hacked-account issues. For Instagram, out of over 11,000 reports received, solutions were provided in more than 4,200 instances.

These figures, along with detailed compliance reports, reflect Meta’s proactive approach in not just removing harmful content but also empowering users with tools and resources to safeguard their digital presence. By publishing these monthly reports, Meta reaffirms its commitment to transparency and compliance with local regulations, ensuring a safer and more responsible online ecosystem.

This crackdown on harmful content is part of Meta’s ongoing effort to maintain a safe online environment for users, showcasing its technological prowess and dedication to ethical digital governance. The initiative underscores the importance of collaboration between tech giants and regulatory bodies in ensuring the digital sphere is free from harmful content.


About the author

Mahak Aggarwal

Mahak’s passion for technology and storytelling comes alive in her articles. Her in-depth research and engaging writing style make her pieces both informative and captivating, providing readers with valuable insights.
