Facebook’s very real dilemmas and very human Community Standards

Harassment and Hate Speech are not permitted on Facebook under its Community Standards. When user content is taken down for violating the Community Standards, Facebook notifies the user. If the user feels the decision was unfair, he or she can click on the notification to appeal for a review of the decision. Facebook handles these requests by arranging an external review of the disputed content. If the external review finds in the user's favour, Facebook restores the content to the site.

Under its Community Standards on Hate Speech, Facebook distinguishes between Hate Speech posted by a user and the same content shared by others to raise awareness of Hate Speech. In the latter case, Facebook does not treat the content as a violation of the Community Standards, since it is shared with the intention of educating people about Hate Speech. Monika Bickert, Facebook's Vice-President of Global Policy Management, admits that the context of Hate Speech is difficult to determine. A user who shares another user's Hate Speech content must therefore make clear that the content is being shared to raise awareness of Hate Speech, not to perpetrate it.

Facebook also works proactively to take down graphically violent content containing terrorist propaganda, because violent content can be filtered using technology. Hate Speech, by contrast, is context-dependent and difficult to police proactively, so Facebook relies on user reports to take such content down. Facebook faces a similar grey area with Fake News: as a social media platform that generates no content of its own, it cannot police real news from fake. What Facebook does instead is remove the fake accounts that spread fake news.

In her interview with CNBC, Bickert noted an interesting fact: more than 85 percent of Facebook users are outside the United States, and standards of what counts as profanity on social media differ from country to country. To address this, Facebook engages organizations in countries where it is widely used to understand what constitutes abuse, safety and healthy discourse in each country.

Social Surfing 3.0 Workshop

The Centre for Social Research has been affiliated with Facebook since 2015. We represent India on its global Safety Advisory Board and have taken part in many discussions over the past three years. The Centre for Social Research also conducts #SocialSurfing workshops supported by Facebook. Through these workshops, we have reached out to over 300 colleges in over 27 states of the country, talking to young people about using social media for social change and employing counter speech for healthy online discourse.
