Following years of backlash and legal disputes, Meta has decided to discontinue a key agreement with Kenyan outsourcing firm Sama. According to Sama, the decision will result in the loss of employment for over 1,100 workers.
The American tech company initially partnered with Sama in 2019 to help moderate Facebook content across sub-Saharan Africa, focusing on identifying and removing violent, harmful, and hateful material.
Over time, both organisations have faced growing scrutiny over allegations of poor labour conditions and workforce reductions.
In 2023, nearly 200 content moderators dismissed by Sama filed a lawsuit claiming unfair termination. They further accused the company of subjecting Kenyan workers to harsh conditions, including forced labour practices and inconsistent pay.
A separate grievance was lodged in 2022 by a former employee based in South Africa.
Employees have reported that constant exposure to graphic and disturbing content has taken a toll on their mental well-being. They have also sought compensation, arguing that their pay does not reflect the psychological risks of the job.
Sama has rejected these claims, maintaining that staff are provided with fair wages, benefits, and access to professional mental health support.
Meta stated that the partnership was being ended because Sama failed to meet required performance standards. The company added that future content moderation work will increasingly rely on artificial intelligence and machine learning systems.