On Facebook, illegal or disturbing content is sorted out by hand. A former employee is now suing because the job made her ill.
So-called “content moderators” are responsible for checking posts on Facebook and removing disturbing or illegal content. According to a recent lawsuit filed by a woman who did this work for nine months, these employees are not sufficiently protected from the potential psychological impact of their job. This was reported by, among others, the “New York Post”.
“Every day, Facebook users publish millions of videos, photos, and live streams of child sexual abuse, rape, torture, bestiality, beheadings, suicides, and murders,” according to the former employee’s complaint. The company, however, has not done enough to protect its moderators from “psychological trauma”.
As if paralyzed
The woman now suffers from “debilitating post-traumatic stress disorder”, whose symptoms can be triggered merely by touching a computer mouse or entering a cold building. According to the complaint, the ex-employee and her colleagues had to review more than 10 million posts with potentially harmful content every week.
In an e-mailed statement to the tech website “Mashable”, a Facebook spokesman said the allegations would be reviewed. The company is aware that “this work can often be difficult” and therefore takes “the support of our content moderators incredibly seriously”. According to the spokesman, this includes appropriate training, workplace benefits, and access to psychological help for every employee.