YouTube Content Moderators Protection from Trauma, PTSD Class Action

Users of YouTube, Inc.’s YouTube platform are able to post millions of videos without any prescreening by the company. Some of this content, says the complaint for this class action, is truly horrible. YouTube therefore employs content moderators to look at videos and remove any that violate the platform’s terms of use. According to the complaint, however, YouTube has not implemented its own workplace standards designed to mitigate the psychological trauma suffered by content moderators. The plaintiff in this case is one such moderator who claims to have developed anxiety, depression, and symptoms of PTSD.

The class for this action is all persons who acted as Content Moderators for YouTube at any time up to the present.

Content Moderators are exposed to conspiracy theories, disinformation, and other content that the complaint claims has “destabilized society and often features objectionable content.” That, apparently, is the easier part of the job. The complaint says that Content Moderators have “witnessed thousands of acts of extreme and graphic violence and sexual assault” that include everything from “genocide in Myanmar to mass shootings in Las Vegas and Christchurch to videos of children being raped and animals being mutilated…”

Because of all this, the plaintiff in this case, who is referred to as Jane Doe, “developed and suffers from significant psychological trauma including anxiety, depression and symptoms associated with PTSD [post-traumatic stress disorder].”

YouTube and its parent company, Google LLC, have created safety standards meant to “mitigate the negative psychological effects that viewing graphic and objectionable content has on Content Moderators.” These include “providing Content Moderators with robust and mandatory counseling and mental health support; altering the resolution, audio, size, and color of trauma-inducing images and videos; and training Content Moderators to recognize the physical and psychological symptoms of PTSD, anxiety, and depression.”

However, the complaint alleges, YouTube has not implemented these measures.

This causes a triple injury, the complaint says. First, Content Moderators are required to “engage in an abnormally dangerous activity.” Second, by failing to implement the workplace safety standards it helped develop, YouTube violates California law. Third, by imposing non-disclosure agreements, YouTube exacerbates the harm it causes to Content Moderators.

The complaint asks the court to require three things from YouTube:

  • Compensating Content Moderators exposed to “graphic and objectionable content…”
  • Ensuring that Content Moderators are given “tools, systems, and mandatory ongoing mental health support to mitigate the harm reviewing graphic and objectionable content can cause.”
  • Providing mental health screening and treatment to current and former moderators “affected by YouTube’s unlawful practices.”

The counts include several negligence claims and a violation of California’s Unfair Competition Law.

Article Type: Lawsuit
Topic: Injury

Most Recent Case Event

YouTube Content Moderators Protection from Trauma, PTSD Complaint

October 24, 2020

Tags: Causes Pain or Injury, Employment Violations, Negligence, Psychological Injury