- Facebook is being sued by a former content moderator in a lawsuit seeking class action status.
- The moderator alleges she experienced PTSD because of her job and that Facebook failed to provide a safe working environment.
- Content moderators are tasked with policing illegal and unsavoury content online, and can be exposed to child abuse imagery, graphic violence, and other dark material.
Facebook is facing another legal headache – this time over work conditions for staffers tasked with keeping the site free of disturbing images and other content.
The Silicon Valley social networking firm is being sued by a former content moderator, who alleges that her exposure to “toxic” material while working for the firm gave her PTSD.
The complaint was filed in California state court in San Mateo on Friday, and was first reported on by Motherboard. The former employee, Selena Scola, is attempting to launch a class action case against Facebook, with her lawyers’ court documents citing a need to “protect herself and all others similarly situated from the dangers of psychological trauma resulting from Facebook’s failure to provide a safe workplace for the thousands of contractors who are entrusted to provide the safest environment possible for Facebook users.”
The lawsuit comes as Facebook faces legal challenges as a result of the misuse of its user data by Cambridge Analytica.
In a statement, director of corporate communications Bertie Thomson said: “We are currently reviewing this claim. We recognise that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources.
“Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling – available at the location where the plaintiff worked – and other wellness resources like relaxation areas at many of our larger facilities.”
Content moderation was described by the Wall Street Journal in 2017 as “the worst job in technology.” Facebook – and other public platforms like it – have thousands of moderators (typically contractors rather than employees) who scour the social network for content that violates the company’s rules or is outright illegal. In the course of their work, they can be exposed to child abuse imagery, graphic violence, terrorist propaganda, and other ugly material.
“From her cubicle in Facebook’s Silicon Valley offices, Ms. Scola witnessed thousands of acts of extreme and graphic violence,” the court documents read. “As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, Ms. Scola developed and suffers from significant psychological trauma and post-traumatic stress disorder (‘PTSD’).”
The documents allege: “Facebook ignores the workplace safety standards it helped create. Instead the multibillion-dollar corporation affirmatively requires its content moderators to work under conditions known to cause and exacerbate psychological trauma. By requiring its content moderators to work in dangerous conditions that cause debilitating physical and psychological harm, Facebook violates California law.”
Scola lives in San Francisco, and worked for Facebook between June 2017 and March 2018, as a contractor working for the firm Pro Unlimited. “Ms. Scola’s PTSD symptoms may be triggered when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled,” the documents say.
Scola’s lawyers are asking for the lawsuit to be certified as a class action, to require Facebook and Pro Unlimited to create a fund to finance “a testing and treatment program” for moderators, and for unspecified damages to be paid.