Company Regrets Accepting Facebook Moderation Contract in East Africa
A firm that had been contracted to oversee Facebook post moderation in East Africa has acknowledged that, in hindsight, it should never have taken on the responsibility.

Former staff members of Sama, an outsourcing company based in Kenya, have revealed that exposure to distressing graphic posts had left them deeply traumatized.

Several of these individuals are now pursuing legal action against the company within the Kenyan legal system.

Wendy Gonzalez, the CEO, stated that Sama will no longer undertake projects involving the moderation of harmful content. Some ex-employees recounted their experiences of distress after encountering videos depicting beheadings, suicides, and other explicit content at the moderation hub, which had been operated by the firm since 2019.

According to former moderator Daniel Motaung, the first distressing video he encountered was “a live video of someone being beheaded.”

Motaung is currently suing both Sama and Meta, Facebook's parent company. Meta asserts that it requires all of its partner companies to provide constant support to moderators, while Sama maintains that certified wellness counselors were always available.

In an interview with the BBC, Ms. Gonzalez shared that the moderation work, which never constituted more than 4% of the company’s business, was a contract she would decline if given the opportunity again. The company declared its intent to terminate this contract in January.

Reflecting on the situation, she stated, “In retrospect, if I had the knowledge I possess now, including the impact on our primary operations and resources, I would not have agreed to it.”

She highlighted that there were crucial lessons to be learned and the company had since adopted a policy to abstain from assignments involving harmful content moderation. Furthermore, the firm pledged not to engage in artificial intelligence (AI) projects that support weaponry or police surveillance.

When asked about the claims of harm made by employees who had been exposed to graphic content, Ms. Gonzalez refrained from commenting due to ongoing legal proceedings. On the broader issue of whether moderation work could be inherently harmful, she acknowledged it as “a novel domain that necessitates thorough research and resources.”
