A former content moderator is suing TikTok, claiming the company failed to protect her mental health as she spent hours upon hours watching graphic videos.
The proposed lawsuit alleges that parent company “ByteDance provides inadequate safeguards to protect moderators’ mental health against a near-constant onslaught of traumatic footage.”
The proposed class-action lawsuit was filed by Candie Frazier in the U.S. District Court for the Central District of California. According to Frazier, she spent “12 hours a day moderating videos uploaded to TikTok for a third-party contracting firm named Telus International.” During that time, she claims she witnessed “thousands of acts of extreme and graphic violence, including mass shootings, child rape, animal mutilation, cannibalism, gang murder, and genocide.”
Describing her job, Frazier said that “in order to deal with the huge volume of content uploaded to TikTok daily, she and her fellow moderators had to watch between three and ten videos simultaneously, with new videos loaded in at least every 25 seconds.” Moderators like herself, she noted, were only allowed 15-minute breaks “in the first four hours of their shift, and then additional 15 minute breaks every two hours afterward.” She added that ByteDance “monitors performance closely and heavily punishes any time taken away from watching graphic videos.”
The suit argues that TikTok, its parent company, and their partners failed to “meet industry-recognized standards intended to mitigate the harms of content moderation.” These standards include “offering moderators more frequent breaks, psychological support, and technical safeguards like blurring or reducing the resolution of videos under review.” As a result of the graphic videos she had to watch, Frazier alleges she suffered “severe psychological trauma including depression and symptoms associated with anxiety and PTSD.” The suit further states that Frazier now has “trouble sleeping and when she does sleep, she has horrific nightmares…She often lays awake at night trying to go to sleep, replaying videos that she has seen in her mind…She has severe and debilitating panic attacks.”
The allegations in Frazier’s suit match reports from content moderators at other big tech companies, including Facebook, Google, and YouTube. According to employees at those companies, the “terrible working conditions facing these moderators — a labor force that is absolutely crucial in maintaining the profitability of some of the world’s biggest companies — has become increasingly scrutinized.”
Commenting on the allegations, TikTok spokesperson Hilary McQuaide said:
“We strive to promote a caring working environment for our employees and contractors…Our Safety team partners with third-party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally.”