The Sad Reason TikTok Moderators Are Suing The Platform

TikTok is one of several popular social media apps that have been scrutinized by parents, internet safety advocates, and the government for not doing enough to protect young users from harmful content. The scrutiny is so intense that attorneys general from several states, including California, Florida, Massachusetts, and New Jersey, opened an investigation into TikTok in early March 2022 over what detrimental effects the app may have on the mental health of children, teens, and young adults. Ironically, the short-form video sharing platform hasn't considered the mental health of one particular group: its own moderators.

Two former moderators, Ashley Velez and Reece Young, are suing TikTok and its parent company ByteDance over the alleged emotional trauma they suffered as a result of viewing the graphic content they policed. As reported by NPR, the two women filed the federal lawsuit against the platform on Thursday, March 24, 2022. Although each of them worked for TikTok through a different third-party company, both alleged they put in 12-hour workdays reviewing and moderating content that violated the platform's terms of service.

TikTok moderators weren't given trigger warnings nor mental health support

According to the lawsuit, TikTok violated California labor law by exposing Velez and Young, as well as other content moderators, to high volumes of graphic content ranging from child abuse to killings to suicides, while failing to provide adequate mental health resources to cope with the emotional distress such content inflicted. Much of the disturbing content Velez saw involved bestiality and violence against children. Young also reviewed videos containing bestiality, and she saw a video of a 13-year-old girl being executed by cartel members.

To make matters worse, the suit alleged that TikTok tied moderators' pay to content moderation performance, pushing them to watch multiple videos at once and spend no more than 25 seconds on each, despite some uploads now running as long as 10 minutes. The suit also said they were required to sign non-disclosure agreements, forcing them to stay silent about what they saw.

This class-action lawsuit comes two years after Facebook reached a $52 million settlement with a group of former moderators who sued the social media behemoth in 2018 over the psychological trauma they suffered from viewing violent imagery posted on the platform, including murders, suicides, and beheadings. Velez and Young are seeking damages and legal fees, as well as a requirement that TikTok establish medical screenings for them and other content moderators.
