Bull’s Eye on a Moving Dartboard – The Morals of Social Media #Sextortion #RevengePorn


Facebook Inc. employs around 4,500 moderators who sift through harmful content shared on the social media platform globally. Harmful content is either reported by users or is evident enough without being reported. Sextortion and revenge porn fall in the latter category; cases of online abuse and fake social media profiles fall in the former. There are no official statistics on the creation of fake profiles, though India has seen an estimated 168% rise in them.

As part of its ethics and principles, Facebook has decided to take down or otherwise act against harmful content directed at someone else, but not at the self. The company feels that self-inflicted harmful content that goes viral on the platform should be allowed in the larger public interest: self-expression is a way of healing and of bringing phenomena to public notice. Facebook’s underlying principle is to be a morally neutral technology. It will not act as a platform that decides what constitutes harmful, offensive, or unethical content; it will only act on users’ complaints, or when a particular video, image, or text is shared with the intention of harming another individual. To that end, Facebook constantly revamps its safety features and updates its reporting tools, not only to cover user protection comprehensively but also to make them user-friendly.

Facebook’s reporting tools promote community well-being. If a person feels that someone on their friends’ list is posting disturbing updates that may signal harmful behavior towards themselves or others, they can, as a good Samaritan, bring it to Facebook’s notice and to the attention of those close to that person in real life. Facebook’s safety tools are also helpful as precautionary measures: the platform gives every user the freedom to control their social media visibility and the extent of interaction with others they feel comfortable with.

On 22nd May 2017, The Telegraph published an article voicing concern about Facebook’s legal under-reporting of revenge porn and sextortion. The article also questioned the competency of the platform’s moderators, who at times wrongly take down content as harmful or offensive to human dignity. Content mistakenly flagged as harmful has included breastfeeding images, child nudity, female nipples, naked statues, burn victims, plus-sized women, graphic pictures depicting menstruation, and images of the LGBTQ community. In another article, published on 21st May 2017, The Telegraph reported that Facebook would treat handmade art depicting sexual activity as permissible while banning digital art on the same theme. The same article covered Facebook’s criteria for which threats are serious enough to be treated as tantamount to having committed a crime, or to having clearly implied one. Not all threats are serious, and people should feel free to express frustration online; venting is healthy when you know someone is listening.

Facebook Inc. does not generate any of the content we see on it. All of it is user-generated, so attacking the social media platform makes little sense. Identifying this as the biggest space for societal reformation, the Centre for Social Research initiated SocialSurfing and TweeSurfing as movements for social change. The ideology behind these movements is to help people reform, since they are the ones who use social media platforms. When you have healthy people, you have healthy interactions, both online and offline; democracy is ensured, and so is freedom of expression.

Censorship of harmful content on social media platforms should be restricted to a protective role. As a society, we have a tendency to regress and take the moral high ground on issues by proclaiming it an act of societal reformation. Civilizations do not need to be told what is right or wrong; they only need to know that they cannot harm anyone, physically or otherwise. Everything else is left to personal discretion. This is something Facebook Inc., in its nascent stage of existence, is trying to take a stand for.
