- Sputnik International, 1920

Meta Kenyan Moderator Abuse Case 'a Strong Precedent' for Other Countries, Expert Says

© AP Photo / Matt Rourke: The Facebook logo is displayed on an iPad in Philadelphia, May 16, 2012
07.02.2023
On Monday, a court in Kenya ruled that Meta* could be sued in the country on charges related to forced labor, human trafficking, and freedom of association. The ruling came in a lawsuit filed against Meta and its outsourcing partner by a former outsourced moderator over those allegations, as well as over insufficient mental health support for employees.
The ruling by the Kenyan labor court has opened a pathway and established a strong precedent for employees to sue Meta in countries where the company is not based but to which it outsources its activities, says Barbara Lazarotto, a PhD researcher in law and Marie Curie Actions fellow at Vrije Universiteit Brussel, in an interview with Sputnik.

"Hopefully, this will impact Meta's modus operandi in human-run operations for content moderation, which has been a target of criticism owing to poor working conditions and its consequent adverse effects on the mental health of workers," the expert notes.

The legal case was launched in May 2022 by Daniel Motaung, a former outsourced Facebook content moderator, who made his accusations against Facebook’s owner and its Kenya-based outsourcing partner Sama.
In January 2023, Sama announced that it had decided to quit content moderation operations for Meta.
Meta initially argued that it could not be sued in Kenyan courts because it is a foreign corporation and does not operate in the African country. The multinational corporation asked that its name be removed from the lawsuit.
Now that the labor court has declared Facebook a party to the case, the process will continue.

"We are extremely pleased Facebook have been found to be proper parties to this case and we look forward to the day when Facebook will face justice for exploiting content moderators like Daniel. We think it’s right that this trial be heard in Kenya, where the abuses happened," said Cori Crider, director of Foxglove, a London-based tech justice legal non-profit organization that supports Motaung's lawsuit, in a press release obtained by Sputnik.

According to her, Meta CEO Mark Zuckerberg profits from advertising his services to Kenyan users, but at the same time refuses to invest enough resources to provide safety to Kenyan workers and treat them "with the dignity and humanity they deserve."

"Daniel's win today should send a message to Facebook, and by proxy, all of Big Tech in Africa. Kenyan justice is equal to any tech giant, and the giants would do well to wake up and respect Kenyan people – and their law," Crider noted.

Amnesty International Kenya deemed the ruling "a significant step," ensuring the authority of the country's courts to "protect and enforce fundamental human rights" and "the first time that Meta Platforms Inc. will be significantly subjected to a court of law in the global south."
A key part of Motaung's case was the issue of the mental state of workers. The claimant stressed that together with his colleagues, he had suffered psychological injuries after repeated exposure to disturbing content as a moderator.
According to da Rosa Lazarotto, the debate over the need for human content moderation is a complex one: despite having evolved considerably, artificial intelligence mechanisms are currently unable to fully substitute for humans.
"The need for human workers, coupled with the lack of interest from companies in the well-being of content moderators results in the degradation of their mental state. This is also aggravated by the fact that the adverse mental health effects of shock content are widely underplayed and also accentuated by the fact that most of the people working in content moderation are workers from the Global South, with little in the way of resources to process this shock-content induced trauma and next to no political capital to get international visibility and help for their poor working conditions," she notes.
In the expert's opinion, the victims of the company will not be able to receive true justice, as some psychological injuries are irreparable, especially considering the gravity of the content the majority of the workers have seen.
"However, hopefully, this decision will be the first step in a true cultural change in the company regarding workers, with the construction of a better work environment for content moderation in Meta," the scholar underlines.
According to her, companies such as Meta are not fully committed to spending revenue on human content moderation, since it is an invisible job and often not considered cost-justifiable by the company. This results in the hiring of workers in the Global South, where relatively lenient labor regulations allow for poor working conditions and consequent psychological injuries, da Rosa Lazarotto adds.

"Kenya, other African countries and countries of the Global South, in general, are often targets of companies that seek cheap, under-regulated labor which is very much regulated in other sections of the world. It is essential to develop labor legislation that adequately protects content moderator workers in these environments. Frequent psychological checks and short shifts are essential to maintain the general health of these workers. I believe that holding companies accountable in those countries is also a big step in this direction. As a global company, Meta should be held accountable globally as well," the expert stresses.

She also notes that it is Meta's enormous size as a corporation that leads to the company becoming a magnet for controversy.
"Meta is one of the biggest tech companies in the world, what was only a social media company that now has been transformed into a major tech conglomerate. This development comes with a cost. I believe the lawsuits are a result of the size of the company and the recent development of internet and platform regulation in Europe and in the world," da Rosa Lazarotto says.
The decision comes at a time when Meta faces another case in Kenya's High Court for promoting violence-inciting hate speech that was filed following the murder of an Ethiopian professor during the Tigray conflict. The son of the victim, who is ethnic Tigrayan, said that it was Facebook's algorithms that led to his father's demise by promoting content calling for violence against Tigrayans.
In 2021, the firm was subject to a similar lawsuit related to violence-inciting content posted during the Rohingya genocide in Myanmar.
In March 2022, Meta became the first-ever public company to be outlawed as an extremist organization in Russia. The decision followed after the firm allowed calls for violence on its platforms targeting Russians in the context of the Ukraine conflict. Posts included content advocating for the death of Russian President Vladimir Putin and Belarusian President Alexander Lukashenko. In April, Russia imposed sanctions on Mark Zuckerberg.
*Meta is banned in Russia as an extremist organization.