A Jane Doe plaintiff has filed a proposed class action against YouTube over what she alleges is the Google-owned streaming giant’s failure to implement workplace safety standards to protect content moderators from psychological trauma resulting from frequent exposure to graphic and objectionable videos.
As a result of content moderators’ unmitigated exposure to “highly toxic and extremely disturbing images” through YouTube’s proprietary “single review tool,” the workers have developed and suffered from “significant psychological trauma,” including anxiety, depression and symptoms associated with post-traumatic stress disorder (PTSD), the 46-page lawsuit, filed September 21 in San Mateo County Court, alleges.
“Instead, the multibillion-dollar corporation affirmatively requires its Content Moderators to work under conditions it knows cause and exacerbate psychological trauma,” the lawsuit says, alleging that such conditions violate California law.
The case emphasizes that while the aforementioned safety standards might not eliminate entirely the risk of content moderators developing job-related psychological disorders, the standards could at least reduce the risk and mitigate any harm.
“By requiring its Content Moderators to review graphic and objectionable content, YouTube requires Content Moderators to engage in an abnormally dangerous activity,” the lawsuit says. “And by failing to implement the workplace safety standards it helped develop, YouTube violates California law.”
Further, the suit alleges the non-disclosure agreement YouTube requires content moderators to sign “exacerbates” the harm experienced by the workers.
According to the lawsuit, YouTube content moderators are tasked with keeping disturbing content off the streaming platform, described in the complaint as millions of uploads that include “graphic and objectionable content such as child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.” The workers are exposed every day not only to videos of extreme and graphic violence and assault but also, repeatedly, to “conspiracy theories, fringe beliefs, and political disinformation,” including content related to Holocaust denial, COVID-19 hoaxes and doctored videos of elected officials.
“This type of content has destabilized society and often features objectionable content,” the suit says.
Per the complaint, YouTube content moderators, of which there are thousands across the country, are required to view “hundreds of thousands if not millions” of potentially rule-breaking videos uploaded to the platform each week. YouTube relies on users to “flag” inappropriate content before it comes before the eyes of a moderator, the suit relays.
The plaintiff goes on to allege that YouTube, during the hiring process for prospective content moderators, failed to properly inform the individuals about the nature of the work or the effect reviewing graphic content could have on their mental health. Though prospective moderators are told they might have to review graphic content, they are neither provided examples of such content nor told that they would be required to view it daily, the suit claims. Prospective hires are also never asked about their experience with viewing graphic content or told that it could have negative consequences for their mental health, the lawsuit says, stating that training for content moderators begins only after they sign a non-disclosure agreement.
As the lawsuit tells it, the training process falls short of adequately preparing YouTube content moderators for the hazards of the position and for the job itself. From the complaint:
“During the training process, YouTube failed to train Content Moderators on how to assess their own reaction to the images, and YouTube failed to ease Content Moderators into review of graphic content through controlled exposure with a seasoned team member followed by counseling sessions.
Instead, Content Moderators are provided a two-week training where an instructor presents PowerPoints created by YouTube. The PowerPoints covered various categories of content, including graphic violence, child abuse, dangerous organizations, solicitation, porn, animal abuse, regulated products, fraud, and spam. Each category was covered by 60–80 slides. For each category, the PowerPoint began with a brief description of the applicable Community Guidelines, and then dozens of examples of content, applying the Community Guidelines.”
During training, “little to no time” is spent on wellness and resiliency, the case says. According to the suit, many YouTube content moderators, who while on the job are subject to strict quantity and accuracy quotas, remain in the position for less than a year due to “low wages, short-term contracts, and the trauma associated with the work.”
The plaintiff asks the court for, among other damages, medical monitoring for YouTube content moderators that provides specialized screening, assessment and treatment not generally given to the public at large.