Content Moderators Are Suing Facebook for Not Properly Protecting Them From Mental Trauma

Including PTSD

Content moderators are regularly exposed to graphic imagery. Getty Images

Facebook is facing a class-action lawsuit filed on behalf of content moderators who work on a contract basis, claiming that those workers are not being properly protected against psychological trauma and post-traumatic stress disorder caused by the content they view.

The Joseph Saveri Law Firm, Burns Charest and the Law Office of William Most filed the suit in the Superior Court of the State of California on behalf of lead plaintiff Selena Scola of San Francisco.

According to the suit, Scola worked as a public content contractor at Facebook’s offices from approximately June 19, 2017, through March 1, 2018, employed by staffing company Pro Unlimited. Scola claimed in the lawsuit that she was formally diagnosed with post-traumatic stress disorder after experiencing symptoms including fatigue, insomnia and social anxiety, and that those symptoms are triggered by acts including touching a computer mouse, entering a cold building, seeing violence on television, hearing loud noises or being startled.

Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to during her tenure as a content moderator.

An attorney with knowledge of the case, who asked to remain anonymous, said further details about Scola’s experiences will not be made available publicly until the court grants its permission to amend the complaint, in order to avoid violating a non-disclosure agreement that she signed with Facebook.

Facebook director of corporate communications Bertie Thomson responded to the lawsuit with this statement: “We are currently reviewing this claim. We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources. Facebook employees receive these in-house, and we also require companies that we partner with for content review to provide resources and psychological support, including on-site counseling—available at the location where the plaintiff worked—and other wellness resources, like relaxation areas at many of our larger facilities.”

Burns Charest partner Korey Nelson said in a release announcing the lawsuit, “Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job.”

The lawsuit cited content such as “videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder.”

The lawsuit also alleged that Facebook does not follow the established industry standards for training, counseling and supporting content moderators that were put in place by the social network and other internet platforms more than a decade ago.

The Technology Coalition’s “Employee Resilience Guidebook for Handling Child Sex Abuse Images” was cited in the suit. Facebook is a member of the group, along with other major internet companies.

Published in January 2015, the guidebook recommends informational interviews with candidates for content moderator positions and encourages employers to limit the time those employees are exposed to child pornography. It also calls for mandatory group and individual counseling sessions administered by professionals with specialized training in trauma intervention, permitting moderators to “opt out” of viewing child pornography, encouraging employees to work on other projects, ensuring that moderators do not view child pornography in the hour before they leave work and permitting moderators to take time off to recover from trauma.

The lawsuit also cited the National Center for Missing and Exploited Children’s guidelines, which recommend changing the color or resolution of images, superimposing grids over images, changing the direction of images, blurring portions of images, reducing the size of images and muting audio.

Attorney William Most of the Law Office of William Most said in Burns Charest’s release, “Facebook has an obligation to provide its content moderators with a safe workplace. Other tech companies implement safeguards to mitigate trauma to content moderators. Facebook must be held to the same standard.”

According to the suit, Scola is seeking an injunction requiring Facebook and Pro Unlimited to provide mandatory onsite and ongoing mental health treatment and support, and to fund medical monitoring programs to facilitate the diagnosis and adequate treatment of PTSD and other symptoms.

Joseph Saveri Law Firm attorney Steve Williams said in the release, “Our client is asking Facebook to set up a medical monitoring fund to provide testing and care to content moderators with PTSD. Facebook needs to mitigate the harm to content moderators today and also take care of the people that have already been traumatized.”


David Cohen (david.cohen@adweek.com) is editor of Adweek's Social Pro Daily.