In a country as culturally and linguistically diverse as India, ensuring safe, respectful digital spaces is no small task. At Social & Media Matters, we spent a few months last year deeply examining one critical question: Is content moderation really working in India?
Our report — based on surveys and interviews with 49 content moderators across the country — reveals a sobering reality. Content moderation in India is far from a neutral, mechanical process. It's messy, emotional, and riddled with gaps that platforms urgently need to address.
Here's what we found:
Our Recommendations?
We propose a new roadmap: localized moderation guidelines, context-sensitive AI, real-time user reporting improvements, transparency in decision-making, and meaningful support for moderators' mental health.
"Content moderation should not just be about what fits into a global rulebook — it must reflect the realities, languages, and lived experiences of users on the ground," — Social & Media Matters, 2024 Our call to action is simple:
The responsibility for change cannot rest with platforms alone. It demands collaboration across sectors — tech companies, policymakers, educators, and civil society must work together to reimagine moderation for India's complex realities.
We need platforms to recognize that one-size-fits-all models will not protect India's vibrant digital public sphere. We need stronger accountability mechanisms that prioritize people over profit. And we need to create an ecosystem where moderators, the invisible backbone of our internet safety, are treated with the dignity and care they deserve.
This report is not just a set of findings — it’s an urgent invitation to rethink, to rebuild, and to center human dignity in the future of digital governance. The time for surface-level solutions has passed. It's time for India to lead with empathy, context, and courage.
Read the Full Report Here: Content Moderation Report.
Have thoughts to share? Connect with us at