The Dark Side of Social Media: Hidden Toll of Content Moderation Revealed

Social media is often presented as a platform for self-expression and connection, but behind the scenes a very different reality exists. More than 140 Facebook content moderators in Kenya have been diagnosed with severe post-traumatic stress disorder (PTSD) after prolonged exposure to graphic and disturbing content, including murders, suicides, child sexual abuse, and terrorism.

The Human Cost of Content Moderation

These moderators, employed by a company contracted by Facebook, worked grueling 8- to 10-hour days reviewing images and videos that would give most people nightmares. The consequences of this exposure were devastating: many developed PTSD, generalized anxiety disorder (GAD), and major depressive disorder (MDD). Dr. Ian Kanyanya, head of mental health services at Kenyatta National Hospital in Nairobi, has been treating these moderators and has seen the impact of their work firsthand.

A Toxic Work Environment

The allegations against Facebook and its contractor, Samasource Kenya, paint a dire picture of a work environment that is both physically and emotionally draining. Moderators reported fainting, vomiting, screaming, and running away from their desks because of the graphic content they were forced to view. It is hard to imagine the psychological toll of such a job, where the line between reality and the virtual world becomes increasingly blurred.

A Familiar Pattern

This is not the first time Facebook has faced criticism for its treatment of content moderators. In 2020, the company agreed to pay $52 million to settle a lawsuit brought by American moderators who developed PTSD on the job. Yet the company appears not to have learned from its past mistakes, continuing to outsource content moderation to low-cost third-party contractors that prioritize profits over people.

The Need for Change

It’s time for companies like Meta to take responsibility for the well-being of the people who moderate their platforms. Instead of outsourcing this critical work to underpaid and overworked contractors, they should invest in moderators’ mental health and provide the resources needed to cope with the trauma the job inflicts. A living wage, access to counseling, and a safe work environment are the least these moderators deserve.

A Call to Action

As we move forward in the digital age, it’s essential that we reckon with the human cost of technology. Instead of focusing on AI that can generate explicit content, we should be developing AI that can identify and remove harmful content, freeing humans from the burden of this toxic work. It’s time for companies like Meta to put people over profits and create a safer, more compassionate work environment for all.
