Facebook Content Moderation

The most popular social network in the world, Facebook Inc. (NASDAQ:FB), employs more than 30,000 people to ensure the safety and security of its users. Nearly half of them work on moderating content, filtering out gore, pornography, violence and misleading messages.

But quite recently, one of these moderators filed a lawsuit against Facebook itself over psychological trauma, claiming she suffered from viewing graphic images and videos for long durations, day after day. The incident has drawn attention to the unpleasant and overlooked behind-the-scenes reality faced by these workers, who work dedicatedly to maintain the appeal of these platforms to users as well as advertisers.

The Unsavoury Workplace Experiences

For years, social sites were allowed to get away with almost anything their users shared. As more and more people around the world got hooked on a growing number of platforms, laws were enacted to make these sites responsible for the information disseminated through them. By the end of 2018, Facebook had reached 2.3 billion monthly active users (MAUs).

Facebook has partnered with companies like Cognizant, Genpact and Accenture to review its content. Most of Facebook’s content moderators work at sites managed by these partner firms.

A recent report in The Verge drew attention to the stress and trauma experienced by approximately 1,000 Cognizant employees moderating Facebook content at a site in Phoenix, Arizona. The report also highlighted that the employees had been directed not to talk about the “emotional toll their job takes on them, even with loved ones, leading to increased feelings of isolation and anxiety”.

Under the immense psychological pressure of flagging undesirable content, the people entrusted with protecting the well-being of billions of site users have resorted to smoking weed, ranting about conspiracy theories, and engaging in sexual activities in parking lots, garages, stairwells and rooms meant for lactating mothers.

Though the social titan provides round-the-clock psychological support to those working in this capacity, the recent report has made it obvious that what it has done thus far isn’t enough.

Facebook, too, has acknowledged that it needs to do more to ensure the psychological well-being of its moderators. The company, widely regarded as one of the best employers in the world, has agreements with its partners to provide good facilities, wellness breaks and resiliency support for those working with them.

Justin Osofsky, Facebook’s Vice-President of Global Operations, agrees.

“We are putting in place a rigorous and regular compliance and audit process for all our outsourced partners to ensure they are complying with the contracts and care we expect,” he pointed out in a post shared with employees over the weekend.

This would include, among other things, more regular focus groups with vendor employees. Osofsky also mentioned that his company encourages all partnering firms’ employees to raise any concerns with their employers’ HR teams.

“Additionally, they can anonymously raise concerns directly to Facebook via our whistleblower hotline and Facebook will follow up on the issue appropriately,” he clarified.

Despite these clarifications, critics remain neither impressed nor convinced.

Moderators are bombarded daily with graphic images, videos and live-streamed broadcasts of sexual abuse, rape, torture, suicide, beheadings, self-harm, murder and other gory content, an exposure that can leave those working in this capacity with some form of post-traumatic stress disorder (PTSD).

“Almost everyone I know who has encountered those things regularly has experienced some form of PTSD,” shares Gossage, who worked at the social-blogging platform LiveJournal and, from 2011 to 2014, as a community manager at Reddit. “These things don’t always emerge right away; they can come up later. The anxiety, the sleeplessness: it’s insidious. It’s a job that certainly changes your life.”

The situation is made worse by miserable pay. Hired as contract labour (because that costs the company less), Facebook content moderators earn, on average, $28,800 annually. This is far below the median Facebook annual salary of $240,000.

Being exposed to heaps of graphic and disturbing content, like a person being stabbed repeatedly by an assailant while screaming for help, or a child being sexually assaulted, can indeed have devastating consequences. These moderators cope with their workplace stress by sharing dark jokes about committing suicide, sobbing in corners and smoking weed to numb their emotions.

Sad. But true nevertheless.