Nairobi, Kenya – (African Boulevard News) – Meta, the parent company of Facebook, Instagram, and WhatsApp, faces legal trouble as three complaints have been filed against the tech giant and its subcontractor, Sama. The complaints come from content moderators who allege that the two companies have failed to protect them from the harmful impact of their jobs.
The complaints, filed in Nairobi’s Industrial Court, allege that Meta and Sama caused the moderators physical, psychological, and emotional harm. The plaintiffs say they were exposed to disturbing content that left them with post-traumatic stress disorder (PTSD), depression, and anxiety.
The content moderators are demanding compensation for the harm they have suffered and are asking the court to order Meta and Sama to implement measures addressing the physical and psychological effects of their work.
According to the moderators, they were exposed to graphic material, including violent videos and images of child abuse and suicide, without adequate support, resources, or equipment to cope with the psychological toll of the job.
“They did not give us anything like psychological support, and when things get out of hand, they fire us without considering the negative psychological effect the job has on us,” said one of the moderators.
This is not the first time Meta has faced criticism and legal action over its content moderation practices. In 2020, content moderators working for Facebook in the United States pursued legal action against the company, alleging that they had developed PTSD from their work and had not been provided with the necessary support.
While Meta has since introduced measures such as expanded mental health support for moderators, the company has been criticized for failing to adequately address the issue.
Content moderation is a critical function of social media platforms, but it is also a job that carries significant risks. As the complaints against Meta and Sama show, moderators are often exposed to disturbing content without adequate support or resources to cope.
The case against Meta and Sama raises important questions about the responsibility of tech giants to protect their workers and the need for greater regulation of content moderation practices. As the case unfolds, it is clear that the issues of content moderation and worker protection will continue to be a major challenge for social media companies.