– Three complaints –
Since May last year, three lawsuits have been filed against Meta and California-based Sama, a company contracted by the Silicon Valley behemoth to moderate content on Facebook between 2019 and 2023.
When contacted by AFP, both Sama and Meta — which owns Facebook, WhatsApp and Instagram — declined to comment on the specifics of the cases.
Two of the cases were filed by content moderators employed by Sama — formerly known as Samasource — in the Kenyan capital Nairobi.
Their job was to review and remove Facebook posts that were violent, incited hatred or spread misinformation in sub-Saharan Africa.
The first complaint was filed in May 2022 at Nairobi’s Employment and Labour Relations Court by Daniel Motaung.
In his petition, the South African accused the companies of “inhumane” working conditions, deceitful hiring methods, irregular and insufficient remuneration, and a lack of mental health support.
He also claimed he was fired after trying to form a union. The case is ongoing.
In March this year, 184 content reviewers filed a lawsuit at the same court against Meta and Sama, claiming they were unfairly dismissed when the latter closed its Nairobi content moderation hub.
They are also seeking compensation for “damage caused to their mental health and general wellbeing as a result of the constant exposure to toxic content”.
Earlier this month, the Kenyan court suspended the mass dismissals and ordered Meta and Sama to “provide proper medical, psychiatric and psychological care for the petitioners and other Facebook content moderators”.
Meta and Sama said they would appeal the ruling.
A third complaint, filed in December in another Nairobi court, accuses Meta of failing to act against online hate speech in Africa and calls for the creation of a $1.6 billion fund to compensate victims, including the family of a murdered university professor in Ethiopia.
AFP is involved in a partnership with Meta providing fact-checking services in Asia-Pacific, Europe, the Middle East, Latin America and Africa.
– Meta’s responsibility in question –
The cases represent the first major litigation on the issue of content moderation since a class action lawsuit launched in 2018 in the United States.
That case ended with a court settlement in 2020, when Facebook agreed to pay content reviewers $52 million as compensation for the trauma resulting from constant exposure to graphic, violent imagery.
Critics of Meta say the Nairobi cases are aimed at exposing subcontracting practices used by the company to dodge its responsibility for moderators’ mental health.
In addition to Sama in Kenya, Meta outsources Facebook content moderation to companies operating in more than 20 locations around the world, which together review more than two million posts daily, according to data provided by the company to AFP.
Lawyers for Meta told the Nairobi labour court the company could not be tried in a country where it did not directly employ people.
But the court ruled on June 2 that Meta was the “owner of the digital work and the digital workspace”, establishing the court’s jurisdiction to hear the case.
– ‘Dark side of social media’ –
“These cases are pulling back the curtain on the true dark rooms of content moderation,” said Brandie Nonnecke, director of the Berkeley Center for Law and Technology and the CITRIS Policy Lab at the University of California, Berkeley.
“The general public doesn’t realise how vitriolic (and) awful the content can be and the true human cost of moderation,” she told AFP.
Cori Crider, director of UK-based legal activist firm Foxglove, which is supporting the complaints filed in Kenya, said “the main objective in all of these (cases) is to reform the way the work is done”.
The Ethiopian murder case and the content reviewers’ lawsuits are “two sides of the same coin”, she said, arguing that unhappy working conditions lead to poor moderation, which can have deadly consequences.
The complaints reveal “the dark side of social media in general”.
Nonnecke said the issues raised were the symptom of a deeper problem.
“The platforms built systems that are a powder keg for sharing harmful content… And there’s limited accountability.
“We need to focus on forcing them to design their platforms in a way that does not incentivise the posting and sharing of harmful content,” she added.
“Doing so can help to stop the posting of harmful content in the first place.”