Working for Facebook can cause post-traumatic stress disorder (PTSD).

The Financial Times recently received documents showing that Accenture, a global professional services firm that provides content moderation for Facebook in Europe, asked its employees to sign a waiver acknowledging that screening content for the social media company could result in PTSD.

Facebook claims that it didn't know about or ask Accenture to distribute the waiver to moderators in Warsaw, Lisbon, and Dublin. But it is well aware that sorting through flagged content on the site can be bad for one's health. Facebook is facing lawsuits in California and Ireland from former employees who say they've suffered severe psychological damage working as content moderators.

Content moderators aren't talked about much - they're part of the invisible workforce that makes modern digital platforms go. When we scroll through our algorithmically generated feeds, we don't realize that these content streams are being managed by real people working tirelessly to ensure our screens remain (generally) horror-free.

"Horror" is the only word to describe the messages, images, and videos that content moderators see all day, every day at work: unspeakable child abuse, cruelty to animals, murder and other instances of gruesome violence, hate crimes and virulent racism - not to mention an endless stream of pornography, much of it deeply misogynistic.

Facebook has roughly 1.6 billion "daily active users," so staying on top of the posting proclivities of the worst humanity has to offer is a gargantuan task. The social media giant employs an estimated 15,000 content moderators in countries around the world, both directly and indirectly through subcontractors.

And Facebook isn't the only company that employs content moderators. Google, Microsoft, Twitter, Pinterest, and many other tech companies rely on these people - many of whom are paid a paltry wage - to maintain their family-friendly image.