Inside the intriguing fight for Meta content moderation jobs

Mercy Mutemi

Kenyan lawyer Mercy Mutemi (centre) speaks to the media after filing a lawsuit against Meta, Facebook’s parent.

Photo credit: File | Nation Media Group

The fate of about 500 staff is at stake as two outsourcing companies clash in court in a matter involving social media giant Meta, which owns Facebook and Instagram.

The matter was sparked early this year by a decision by Samasource Kenya EPZ Ltd (Sama) to sever ties with the tech giant and shut its hub in Nairobi, which serves the Eastern and Southern Africa region.

Sama then announced plans to lay off over 200 Facebook content moderators, but the employees went to court and obtained orders stopping the process, arguing that the decision was unlawful.

And as Sama was announcing plans to lay off the employees, another firm, Majorel Kenya, was busy recruiting content moderators to take over the jobs of those being fired.

In a landmark decision on June 2, Employment and Labour Relations Court judge Byram Ongaya ruled that Meta was the primary owner of the job and that it did not matter whether the arrangement was direct or indirect.

“The court returns that it is the 1st and 2nd respondents (Meta Platforms, Inc and Meta Platforms Ireland Ltd) who owned the job and, therefore, had the obligation to provide the job.”

The judge said Meta was the primary or principal employer of the content moderators and that Sama was merely the agent, foreman, and manager of Meta as defined in the Employment Act, 2007.

Majorel had opposed the case, arguing that an adverse outcome would disproportionately prejudice the outsourcing firm, causing it financial ruin and possibly forcing redundancies among the other content moderation staff it had already hired.

The company said it had heavily invested in the project awarded by Meta, having completed a recruitment exercise, relocated some of the new hires to Kenya, and paid their salaries, benefits, accommodation and sustenance in readiness for them to take up the jobs.

Sh200 million

The firm revealed that it had leased office space for the content moderation project, completed the fit-out, and was paying rent and service charges.

It also informed the court that it had purchased and set up all the necessary infrastructure and acquired other office resources such as computers, desks, chairs, printers, and other support kits.

Majorel said it had invested approximately Sh200 million in recruitment, payroll, accommodation, immigration permits, capital expenditure, and other set-up costs.

Sven Alfons A De Cauter, a director of the outsourcing company, said that before the case was filed in March, the firm had already completed its own recruitment of 230 members of staff for its content moderation project.

The court was informed that all these members of staff were employed by Majorel directly.

“Having employed these staff, it had, as per normal employment practice, on-boarded them as Majorel employees and commenced their training to perform the content moderation work the 4th respondent expected to undertake for its customer,” Mr De Cauter said in an affidavit filed in court.

The firm said it was fully within its rights to conduct its employee onboarding and training programmes as it waited to take on its customer's content moderation work. The company would have started the work from April 1, 2023, or soon after completing training, when its employees would have begun the content moderation assignments.

The company said the freeze on employment was putting a financial squeeze on the firm, jeopardising the investment it had made and putting the suppliers, vendors, and staff retained to support the project at risk of losing the economic benefits they had sought to derive from it.

The content moderators, through lawyer Mercy Mutemi, told the court that 260 of them were set to lose their jobs if the court did not intervene. The petitioners, including Kiana Monique Arendse, Fasica Berhane Gebrekidan, and Mahlet Yilma, said they were issued with termination letters indicating that their employment would end from March 31 on account of redundancy.

Toxic work environment

According to Ms Mutemi, the criteria used in the termination did not take into account statutory considerations such as the seniority in time, skill, ability and reliability of individual moderators.

Further, the severance pay due had not been computed and communicated to each of them, and its payment was made conditional on the signing of non-disparagement documents, for which no consideration had been offered.

Most of them, she said, are foreigners and the termination would see them kicked out of the country. During the hearing of the application, the content moderators revealed the toxic work environment in which they worked.

Some of them said they came across mutilated or dismembered bodies, sadistic videos depicting manslaughter, and the burning of persons. The content they encountered was graphic; some of it was nude and depicted sexual activity.

One of the moderators, who was assigned to the Tigrinya/Amharic market, said she remembered watching a manslaughter video and being traumatised. The matter was reported to her team leader, who referred her to a counsellor.

“That sadly, the counsellor could not reverse what I saw, nor could he help me in any way,” she said.

Sama, on its part, said it had ceased its content moderation undertaking and had no income in that respect to pay the moderators if they remained in employment. The company revealed that the termination of the contracts would affect its more than 3,000 staff.

The firm, whose business is in data annotation and computer vision, said it had employed over 13,000 individuals in Kenya in the past 15 years.

In the ruling, the judge noted that evidence on record showed that all the graphic, disturbing, toxic, dangerous, and harmful videos the moderators watched were part of the work, which was served or provided directly by Meta.

The judge directed the government and human rights bodies to review the status of law and policy on the protection of employees and on occupational health and safety in the sector of digital work, digital workspaces, and digital workplaces, to recommend improvements to the applicable policy and law, and to report to the court.

Strict liability

“While making the finding the court has examined the evidence and returns that the applicants and the 3rd respondent are in agreement that the work of content moderators is inherently hazardous with likely serious and adverse mental impact,” the judge said.

The judge said Meta was the principal employer of the moderators, as the arrangement appeared to fall outside the well-known systems of outsourcing in which the owner of the job or work and the outsourced contractor clearly share the obligations owed to employees.

“The court observes that some of such obligations would be statutory and impose strict liability upon such owner of work and workplace and also upon the outsourced contractor as an employer,” the judge said.

Meta, through senior counsel Fred Ojiambo, had opposed the case, arguing that the two companies are foreign corporates not registered in Kenya.

Ms Joanne Redmond of Meta Ireland maintained that there was no contract between the Facebook content moderators and the social media giant.

She said there was no basis for the court to allow the petitioners to summon the companies, as they are foreign corporates not registered in Kenya.

“That as a corollary to that proposition, it would be absurd for this court to arbitrarily and forcibly subject the 1st (Meta Platforms Inc.) and 2nd (Meta Platforms Ireland Ltd) respondents to the jurisdiction and laws of Kenya when the said foreign corporates are not present or trading in Kenya,” she said.