
TikTok Content Moderators Harm from Extreme Videos Class Action

TikTok allows millions of people to create short videos, edit them, add background music and special effects, and upload them to its platform to share with others. Unfortunately, not all the content uploaded is so benign, and TikTok, Inc. and its owner, Bytedance, Inc., employ content moderators to sift through the uploaded content and remove objectionable material. This class action brings suit against the two companies, alleging that they “failed to implement workplace safety standards” for the moderators and instead require them “to work under conditions they know cause and exacerbate psychological trauma.”

The class for this action is all individuals in the US who performed content moderation for Bytedance’s TikTok app at any time up to the present.

The plaintiff for this class action, Candie Frazier, works for Telus International, which provides content moderators for TikTok and which is not a defendant in this suit.

Frazier spends twelve hours per day viewing material that includes horrendous “acts of extreme and graphic violence including sexual assault, genocide, rape, and mutilation.” She has viewed footage of “the genocide in Myanmar, mass shootings, children being raped, and animals being mutilated.”

In addition, the complaint alleges that content moderators are exposed “to conspiracy theories (including suggestions that the Covid-19 pandemic is a fraud), distortions of historical facts (like denials that the Holocaust occurred), fringe beliefs, and political disinformation (like false information about participating in the census, lies about a political candidate’s citizenship status or eligibility for public office, and manipulated or doctored videos of elected officials).” The complaint claims, “This type of content has destabilized society…”

Because of the “constant and unmitigated exposure” to these things, Frazier says she is suffering from anxiety, depression, and PTSD.

The complaint alleges that TikTok and Bytedance know that the work of content moderators has harmful psychological effects, yet, it claims, they have not put in place safety standards that are known to mitigate this kind of harm.

The complaint alleges that the companies require content moderators to take part in “abnormally dangerous activities,” and that by not putting safety standards into place, they violate California law. Furthermore, the complaint alleges, the companies intensify the harm by requiring that content moderators sign nondisclosure agreements.

The complaint alleges that TikTok’s popularity and ability to attract a young demographic depend on the work of content moderators.

The complaint asks for three things:

  • That the companies compensate content moderators who were exposed to “graphic and objectionable content” on the TikTok platform;
  • That they make sure that content moderators are given “tools, systems, and mandatory ongoing mental health support to mitigate the harm reviewing graphic and objectionable content can cause”; and
  • That they “provide mental health screening and treatment” to current and former content moderators who have been affected.

Article Type: Lawsuit
Topic: Injury

Most Recent Case Event

TikTok Content Moderators Harm from Extreme Videos Complaint

December 23, 2021

Case Event History

TikTok Content Moderators Harm from Extreme Videos Complaint

December 23, 2021

Tags: Content Moderation or Content Moderators, Health or Safety Standards for Employees, PTSD