“What would you do if you weren’t afraid?”

In the current information age, our priority should be building the infrastructure of law, healthcare, and human rights needed to navigate the novel digital space. People have become resources that are infamously misused, and I argue that technological innovation has trapped us in a realm of exploitation and unhappiness.

What is content moderation? It is the safeguarding of the internet from harmful user-generated content, a job that is notoriously detrimental to the individual performing it. It resembles modern-day slavery: people are closely monitored, made to keep their work confidential, and as a result discouraged from seeking external assistance. Facebook, for instance, allows its moderators only nine minutes of wellness time per day. Given the content being reviewed, the compensation is insufficient to cover adequate mental health support, so many moderators instead seek comfort in drugs and other coping mechanisms. This stands in stark contrast to the benefits offered to Facebook's software developers, highlighting an inconsistency in how different roles are valued: those building the technology are well compensated, while those responsible for keeping users safe are not given the same recognition.

Who are content moderators? They are typically individuals from varied backgrounds, many of whom are motivated by financial need, since the pay is usually just above minimum wage. In recent years, Facebook has increasingly relied on contract labor for this role. While some may argue that individuals can simply choose not to apply for these jobs, the reality is that many people must accept whatever work is available in order to sustain themselves. Because contract workers often face poor working conditions, the problem persists, relegating these individuals to a precarious lifestyle with limited security and rights.

Why can't we use artificial intelligence?
Moderation requires cultural context to assess region-specific content, and automating it demands sophisticated natural language processing to understand meaning. Because norms around violence are not universal, there is no generalizable set of rules to follow. Despite Facebook's efforts in this direction, artificial general intelligence remains a far-fetched notion, meaning that fully automated moderation is not feasible in the near future.

How are we trapped? Moderators recognize the critical role they play in keeping Facebook a safe platform, given its vast user base. Even though users are unaware of who performs this work, moderators strive to do their best, because interacting with the content they shield users from makes its importance clear. Though artificial intelligence is not a viable solution, Facebook does have the ability to improve the working conditions of its content moderators. Doing so would require regulations that safeguard both ghost workers and content moderators, yet Facebook has little incentive to pursue such measures. As a result, this remains a hopeful but unrealistic proposition.

What kind of society are we creating? Individuals find themselves trapped by these companies with regard to their employment, well-being, freedom of expression, and more. The information age has introduced a new challenge: while artificial intelligence is rapidly reshaping our world, it has yet to develop the capacity to protect users from its own shortcomings. As long as this disparity remains, we will continue to be trapped.

References:

  1. https://techcrunch.com/2012/07/25/facebook-designers-ask-what-would-you-do-if-you-werent-afraid/
  2. https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona
  3. https://www.computerweekly.com/feature/The-one-problem-with-AI-contentmoderation-It-doesnt-work
  4. https://www.theverge.com/2019/2/27/18242724/facebook-moderation-ai-artificial-intelligence-platforms