Facebook Hit with Class Action Over Psychological Trauma Suffered by Content Moderators [UPDATE]
Last Updated on July 23, 2021
Garrett et al. v. Facebook, Inc. et al.
Filed: March 12, 2020 | Case No. 8:20-cv-00585
A class action alleges Facebook falls far short in providing help for content moderators experiencing psychological trauma related to viewing graphic images and videos.
July 22, 2021 – Judge Tosses Content Moderators’ Lawsuit
The judge overseeing the case detailed on this page has granted the defendants’ motions to dismiss after finding that the plaintiffs failed to adequately plead their claims.
In a May 14 order, U.S. District Judge Kathryn Kimball Mizelle dismissed the claims of the Arizona plaintiffs, finding the court did not have specific jurisdiction over Facebook and Cognizant with regard to plaintiffs in that state.
As for the claims that Cognizant concealed the dangers of content moderators’ jobs, the judge noted that the plaintiffs failed to point to any specific information that the defendant hid from them, including “who should have warned them, when they should have been warned, and where they should have been warned.” Moreover, Judge Mizelle pointed out that the plaintiffs’ other allegations contradict their claim that Cognizant concealed the dangers of their jobs because they note that “[i]t is well known that exposure to images of graphic violence can cause debilitating injuries, including PTSD” and point to numerous studies and available research on the topic.
“By Plaintiffs [sic] own allegations, any danger was then fully accessible to them through due diligence,” the judge wrote.
The order further stated that the plaintiffs had failed to establish that Cognizant had a duty to disclose the aforementioned dangers due to a special relationship of trust or that the individuals would not have accepted the content moderator job had they known about the associated risks.
With regard to the plaintiffs’ claims that Facebook negligently caused them to be at risk of developing serious mental health injuries, the judge ruled that the individuals had failed to establish that they were physically impacted. Per the order, Florida law requires that those seeking damages for emotional distress caused by negligence must show that the emotional distress stemmed from injuries sustained in a physical impact.
The judge also threw out the plaintiffs’ claims under the Florida Deceptive and Unfair Trade Practices Act because that law applies to consumers, not employees, and does not cover personal injury.
Finally, Judge Mizelle tossed the plaintiffs’ claims for medical monitoring, ruling that the amended complaint did not establish a valid claim for such.
The case was therefore dismissed and administratively closed for all except three plaintiffs whose claims were stayed pending arbitration.
At issue in the case is not only the “debilitating physical and psychological harm” the plaintiffs say is suffered by Facebook content moderators but also the alleged practice by the company and its contractor, Cognizant Technology Solutions U.S. Corporation, of cutting the workers loose upon the expiration of their contracts. Dubbed in the suit as “the first responders of the internet,” content moderators are hired on a contract basis and laid off at the end of their term in what the lawsuit claims is an attempt by Facebook and Cognizant to shirk accountability for the workers’ mental health.
The lawsuit explains that as a result of “constant and unmitigated” exposure to “highly toxic and extremely disturbing images” that go through Facebook’s content review systems, content moderators have developed and suffered from “significant psychological trauma and/or post-traumatic stress disorder.”
Compounding matters for Facebook content moderators is what the suit describes as the failure of the company and Cognizant to implement workplace safety improvements of their own creation. The lawsuit says that while Facebook, “in an effort to cultivate its image,” has drafted employee-focused measures concerning pre-hiring psychological screening, robust counseling, working in pairs rather than alone, reducing emphasis on efficiency and productivity and providing additional wellness breaks, the company and Cognizant “ignore the very workplace standards they helped create.”
“Instead, the multibillion-dollar corporations affirmatively require their content moderators to work under conditions known to cause and exacerbate psychological trauma,” the lawsuit out of Florida alleges.
The plaintiffs allege that Facebook and Cognizant’s practice of requiring content moderators to work under conditions that cause debilitating physical and psychological harm, and then laying the workers off at the end of their contracts, violates Florida and Arizona law. The plaintiffs assert in the case that without the court’s intervention, Facebook and Cognizant will “continue to breach the duties they owe” to content moderators.
“Plaintiffs and other content moderators, at a minimum, deserve the same protections as other first responders, which includes workers’ compensation/health coverage for the PTSD caused by the working conditions,” the complaint reads.
Through their lawsuit, which has been removed to federal court, the plaintiffs look to ensure Facebook and Cognizant put in place safe tools, systems and mandatory ongoing mental health support; establish a medical monitoring fund; and provide compensation to thousands of current and former moderators.