TikTok, Atrium, and others are at the center of a lawsuit filed by former content moderators over alleged psychological trauma.
Two former TikTok content reviewers are suing the company over allegations that it “failed to adequately support them as they engaged in the deeply disturbing work of removing objectionable videos from the social network.” The suit was filed earlier this month in federal court and names Ashley Velez and Reece Young as the plaintiffs.
According to the lawsuit, both former employees “did moderation work for TikTok on contract through third-party companies — Canadian tech firm Telus International and a New York-based company called Atrium.” The plaintiffs are seeking class-action status, which would allow other content moderators with similar complaints to join the lawsuit.
So why was the suit filed? For starters, Young and Velez allege that “TikTok and ByteDance violated California labor laws by failing to provide them with adequate mental health support in spite of the mental risks of the abnormally dangerous activities they were made to engage with on a daily basis.” The suit further argues that the defendants “pushed moderators to review high volumes of extreme content to hit quotas and then amplified that harm by forcing them to sign NDAs so they were legally unable to discuss what they saw.”
The suit further states:
“Defendants have failed to provide a safe workplace for the thousands of contractors who are the gatekeepers between the unfiltered, disgusting, and offensive content uploaded to the App and the hundreds of millions of people who use the App every day.”
In addition, the plaintiffs allege that “in spite of knowing the psychological risks of prolonged exposure to such traumatic content, TikTok and ByteDance made no effort to provide appropriate ameliorative measures to help workers cope with the extreme content after the fact.” The suit goes on to describe how Young and Velez often worked 12-hour days where they would review “extreme, disturbing content including child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.”
Beyond that, the plaintiffs note in their suit that they were also regularly exposed to hate speech and other harmful content that took a toll on their mental health.
A similar lawsuit was filed in December 2021 by another TikTok content moderator, and Facebook faced a comparable suit in 2018. The social media giant ultimately settled that case, agreeing to pay $52 million to “more than 11,000 moderators who struggled with mental health as a result of the content they were tasked with sorting through on a daily basis.”