Our report — based on surveys and interviews with 49 content moderators across the country — reveals a sobering reality. Content moderation in India is far from a neutral, mechanical process. It's messy, emotional, and riddled with gaps that platforms urgently need to address.
Here's what we found:
- Moderators are bearing the emotional burden: exposed daily to violent, graphic, and abusive content, many moderators reported significant psychological distress, yet mental health support remains patchy or absent across companies.
- Language barriers are undermining moderation quality: while Hindi and English dominate moderation efforts, India's internet users communicate in dozens of languages. This linguistic complexity leads to inconsistencies and missed harmful content.
- Context matters but is often ignored: what's offensive in one region might be normal in another, yet many platforms apply blanket rules without adapting to India's cultural nuances, leading to user dissatisfaction and distrust.
- Users are losing faith in reporting systems: many Indians feel their content reports are ignored or mishandled. Delayed or inadequate responses breed cynicism and discourage future reporting, weakening the very trust moderation systems rely on.
- Profit often trumps people: platform decisions frequently prioritize community guideline technicalities or revenue considerations over user harm or distress, as illustrated by cases where sensitive videos were reinstated after takedown appeals.
We propose a new roadmap: localized moderation guidelines, context-sensitive AI, improved real-time user reporting, transparency in decision-making, and meaningful mental health support for moderators.
"Content moderation should not just be about what fits into a global rulebook — it must reflect the realities, languages, and lived experiences of users on the ground," — Social & Media Matters, 2024
Our call to action is simple:
- Center the user
- Support the moderator
- Respect the context
- Prioritize mental health
- Demand transparency
We need platforms to recognize that one-size-fits-all models will not protect India's vibrant digital public sphere. We need stronger accountability mechanisms that prioritize people over profit. And we need to create an ecosystem where moderators, the invisible backbone of our internet safety, are treated with the dignity and care they deserve.
This report is not just a set of findings — it’s an urgent invitation to rethink, to rebuild, and to center human dignity in the future of digital governance. The time for surface-level solutions has passed. It's time for India to lead with empathy, context, and courage.
Read the full report here: Content Moderation Report.
Have thoughts to share? Connect with us by email.