OPINION: Content moderation floors are the gruesome sweatshops of the 21st century


Facebook, Inc.’s new logo under its rebranded name, “Meta.” Facebook is one of the world’s largest employers of content moderators (Source: Trusted Reviews).

The internet is already a very ugly place. While scrolling through my social media feeds, I, like most of us, have been subjected to my fair share of unwanted and disturbing pictures and videos. After encountering the occasional gory video or pornographic picture, I used to thank my lucky stars that a robot behind the screen was cleaning up at least some of that content to safeguard my mental wellbeing.

It was not until I took Georgetown’s “Hate Groups and Social Media” course that I learned how wrong I was. The class showed me the true horrors of content moderation, in which roughly 150,000 workers from third world countries stand on the front lines of the battle to keep the internet safe. Technology giants are heralding a new age of sweatshops: outsourcing their most mentally taxing work to some 20 countries, including the Philippines and India, and subjecting employees to abysmal working conditions, meager pay, and post-traumatic stress.

In the three minutes it will take the average reader to finish this article, Facebook will experience an influx of close to ten million pieces of content and will take down over 60,000 pieces in the same period. And those are only the “takedown decisions,” discounting the vast amount of other content moderators decide to “leave up.”

Three weeks ago, Facebook rebranded itself as Meta, shifting from social apps to an augmented and virtual reality platform and unlocking a whole new realm of content moderation chaos. Even Meta’s Chief Technology Officer, Andrew Bosworth, has acknowledged that content moderation in the metaverse “at any meaningful scale is practically impossible.”

That’s where content moderators come in, laboring on Facebook’s content moderation floors under conditions that echo those of sweatshops. Long hours and cramped working and living quarters could not be more fitting descriptions of Facebook’s largest outsourced content moderation hub: the Philippines.

Lester is one of the many workers there who fuel Facebook’s content moderation. After signing a stringent non-disclosure agreement, Lester became an employee of the California-based outsourcing firm oDesk. Earning a meager $1 per hour and working on the top floors of a mall in Manila, he spent up to nine hours each day reviewing horrific content on social media.

The similarities between content moderation and sweatshop work do not stop at physical conditions; they extend to the extreme toll on employees’ mental health. Lester spent his hours deciding whether a child’s genitals were being touched accidentally or on purpose, or whether a knife slashing someone’s neck depicted a real-life killing. Even worse, moderators go through 2,000 photos and videos per hour, leaving them less than two seconds to decide whether an image violates one of the bullet points in their 17-page abuse standards manual. Reports describe employees chain-smoking to numb their emotions and having sex in stairwells, desperate for a rush of dopamine amid their suffering.

“I can remember the first one that was just like a wow moment,” one ex-Facebook moderator recounted to Vice. “And that was when we first started. I got this super up close shot of someone just jerking off. But I didn’t want to reach forward, put my hand into that space to delete it, to press the buttons to delete it…Thinking about that later, I realized that we really do feel touched by what you see.”

Although Facebook has put some mental health resources in place for its outsourced employees, they are purely window dressing to preserve the company’s image. Vice’s anonymous Facebook employee noted that none of the “wellbeing” experts available to them were mental health professionals, having never received formal training in psychotherapy or psychology. Consequently, instead of customized care for the trauma he experienced, Lester was offered “finger painting” in response.

Lester is not alone, yet Facebook’s leadership dismisses accounts like his. In leaked audio from a company meeting, Facebook’s CEO, Mark Zuckerberg, claimed one content moderator’s testimony was “a little overdramatic.”

It is time for Mark Zuckerberg to visit these shop floors. It is time for him to see the thankless, mentally ruinous work that these people do day in and day out, and to acknowledge that he has a problem to solve. 

As Facebook heads into its next phase as Meta, we in the developed world have an opportunity to speak out and stop the next wave of sweatshop work before it is too late. Content moderation is not the Third World’s problem; it is Facebook’s. Whether by bringing all content moderators in-house or by making them full-time employees, Facebook must change its practices and begin treating its essential workers as human.

 