Facebook raises salaries of content moderators in Kenya after exposé


For hours on end, content moderators around the world view and filter some of the most repulsive content on the internet for Facebook, yet they do not work for the social media giant directly.

In Kenya, the workers, employed by outsourcing agencies, claim they are paid peanuts, gagged by non-disclosure agreements and given next to no support for the damage caused by looking at such content.

They work for Sama, a California-based company sub-contracted by Facebook, and are some of Facebook’s lowest-paid workers anywhere in the world.

Sama pays them about Sh250 ($2.20) per hour for a nine-hour working day, and that is only after their salaries were raised following an exposé by Time magazine last month.

By comparison, outsourced content moderators for Facebook in the United States are paid a starting salary of $18 (Sh2,000) per hour.

The Kenyan workers were initially paid Sh170 ($1.50) per hour. Even with the pay rise, Sama employees remain some of Facebook’s lowest-paid workers in the world.

When the Nation reached out to Meta, the owner of Facebook, for comment on why it underpays Kenyan employees, a representative said: “We take our responsibility to the people who review content for Meta seriously and require our partners to provide industry-leading pay, benefits and support.”

The spokesperson added: “We also encourage content reviewers to raise issues when they become aware of them and regularly conduct independent audits to ensure our partners are meeting the high standards we expect of them.”

Reacting to the Time article, Sama said it contained false and misleading characterisations of the company and its team.

Falsehoods and inaccuracies

“Sama's business model is designed to meaningfully improve employment and income for those with the greatest barriers to it via training, benefits, and work. Sama is a longstanding and trusted employer in East Africa. To date, Sama’s work has helped lift more than 59,000 individuals out of poverty,” it said in a statement.

It added that its impact model has been validated by respected third-party organisations such as the Massachusetts Institute of Technology.

“We were disappointed to see these falsehoods and inaccuracies published, and we’d like to share the facts. Sama values its employees, and we are proud of the longstanding work we have done,” the company said.

Given content moderation is a tough job, “we pay employees wages that are consistently [three times] the minimum wage and [two times] the living wage in Kenya as a recognition of the work.

“Today, new team members see an average 3.6x increase in their earnings when joining the company, and receive healthcare, pension plans, travel expenses, food subsidies, and other benefits that cover essential expenses for employees and their dependents in each of our working regions.”

Habel Kamau, a human resources director at Sama’s Nairobi office, told Time that the salary changes were not a result of the magazine’s article. “The truth is that this conversation was still going to happen with these events occurring or not,” he said.

A moderator’s job is to review posts on Facebook that can contain graphic violence, exploitation, extremism, abuse and suicide.

Once employed, workers are not allowed to speak to their friends or family about the things they see at work, due to a non-disclosure agreement included in their contracts.

Meta says the NDAs are standard practice, and reviewers can discuss any aspect of their job with doctors and counsellors. Staff can discuss the general challenges and rewards of their jobs with family and loved ones, but not specific details of the content they review.

Graphic violence

A moderator can process around 100 “tickets” a day – videos, images or text posts on the platform that often contain graphic violence, suicide, exploitation and abuse.

Facebook says psychological help is available to all its moderators 24 hours a day.

Some Facebook moderators are asked to sign a disclaimer before starting work, accepting that the content they will see could lead to poor mental health and PTSD (post-traumatic stress disorder).

Facebook uses a combination of machine-learning algorithms and human moderators to review content, and it hopes the algorithms will eventually reduce the number of human moderators needed.

The Wall Street Journal once described being a Facebook moderator as “the worst job in technology”. Moderators have disclosed watching hours of child abuse, gory violence, murders and suicides.

Current and former moderators took on Facebook in a class-action lawsuit over the mental health implications of their jobs. The lawsuit was settled in 2020, and Facebook agreed to pay $52 million to tens of thousands of affected workers.