Facebook Content Moderators in Kenya Describe Job as ‘Torturous’


Nathan Nkunzimana, overcome with emotion, recounted watching disturbing videos as a content moderator for a Facebook contractor. The job required him to view graphic content that left some of his overwhelmed colleagues screaming or crying. Now, according to AP, Nkunzimana and nearly 200 former employees in Kenya are suing Facebook and local contractor Sama over the working conditions they faced. The court challenge, the first of its kind outside the United States, could have implications for social media moderators worldwide. The group was employed at Facebook’s outsourced content moderation hub in Nairobi, Kenya’s capital, where they screened user posts, videos, messages, and other content from across Africa.

The moderators, hailing from various African countries, are seeking a compensation fund of $1.6 billion, citing poor working conditions, including inadequate mental health support and low pay. They were laid off earlier this year by Sama, which is exiting the content moderation business, and they claim the companies ignored a court order to extend their contracts until the case is resolved. Facebook and Sama have defended their employment practices. After being accused of allowing hate speech to circulate in countries such as Ethiopia and Myanmar, where conflicts have claimed thousands of lives, Facebook invested in moderation hubs around the world so that harmful content posted in local languages could be caught.

Content moderators hired by Sama in Kenya were chosen for their fluency in various African languages, but they soon found themselves exposed to graphic content that hit close to home. Fasica Gebrekidan, who worked as a moderator for two years, had fled the war in her native Ethiopia’s Tigray region, where hundreds of thousands of people died and where many Tigrayans, like Fasica, knew little about the fate of their loved ones. “You run away from the war, then you have to see the war,” she said. “It was just a torture for us.” She blames Facebook for failing to provide adequate mental health care and fair compensation, and accuses the local contractor of exploiting her and then terminating her employment. “Facebook should be aware of what’s happening,” she emphasized. “They should prioritize our well-being.”


This type of work has the potential to cause significant psychological damage, but job seekers in lower-income countries may be willing to take the risk in exchange for a job in the tech industry, according to Sarah Roberts, an expert in content moderation at the University of California, Los Angeles. What sets the Kenya court case apart, she points out, is the moderators’ collective effort to organize and challenge their working conditions. In the United States, settlement is typically the preferred tactic in such cases, but Roberts believes that companies may not find it as easy to settle if similar cases are brought elsewhere.
