Big Tech Moderators: Mental Health & Trauma Support

by Ethan Brooks

Content Moderators Demand Mental Health Support Amid Rising Online Trauma

A growing coalition of content moderators working for tech giants is urgently calling for increased mental health support as they grapple with the psychological toll of filtering a surge in disturbing online content. The individuals tasked with shielding users from harmful material on platforms like Meta Platforms and TikTok are reporting severe health consequences, ranging from insomnia and anxiety to suicidal ideation. This escalating crisis highlights the hidden human cost of maintaining a safe online environment.

The demand for better support is gaining momentum across multiple countries, including the Philippines and Turkey, where a significant portion of content moderation work is outsourced. The moderators, often working remotely, are exposed to a constant stream of graphic violence, hate speech, and other deeply disturbing imagery.

Did you know? Content moderation as a profession has only existed for about 15 years, emerging with the rise of social media platforms.

The Invisible Scars of Online Safety

The psychological impact of this work is profound. One Filipino content moderator, speaking anonymously out of fear of reprisal from their employer, described a dramatic decline in their sleep quality.
