Former TikTok Staff Launch Legal Action Over Bullying and Union Busting Claims

In a significant development for workplace rights in the technology sector, former TikTok employee Lynda Ouazar has come forward with serious allegations against the social media giant. Alongside four of her former colleagues, she is initiating legal proceedings against TikTok, marking the second such court case brought by former UK-based employees against the company in recent months.

Allegations of a Toxic Work Environment

Lynda Ouazar, who worked as a moderator and later in quality control at TikTok, describes an environment rife with bullying, harassment, and exclusion from team projects. "There was lots of bullying, harassment, exclusion from the team, from projects. A lot of things were going on," she reveals. The psychological toll was severe, with Ouazar reporting difficulty sleeping, flashbacks, fatigue, and loss of motivation.

The Strain of Content Moderation

Initially finding her role rewarding, Ouazar's experience deteriorated when she was assigned to handle some of the most extreme content on the platform. This included graphic material involving child sexual assault, abuse against women, self-harm, and pervasive hate speech. "It affected me," she states, highlighting the emotional burden of daily exposure to such disturbing content.

Despite TikTok's official policies encouraging breaks and offering mental health support, Ouazar and other moderators interviewed by Sky News claim they felt unsupported in practice. Instead, they report feeling constant pressure to work faster and harder, regardless of content severity. "You are monitored by AI all day long," Ouazar explains, describing an environment where performance metrics took precedence over wellbeing.

Performance Pressure and User Safety Concerns

The pressure to meet targets had tangible consequences, both for employees and platform users. "Moderators find themselves pressurised to deliver, so they have to carry on, even if you see something which really affects you and you feel like you have tears in your eyes," says Ouazar. She notes that this pressure could lead to moderation errors, potentially allowing harmful content to remain on the platform.

This stands in contrast to TikTok's transparency report, which claims the platform removes over 99% of harmful content before it's reported. Data collected for the EU's Digital Services Act also indicates TikTok has the lowest error rates and highest accuracy in moderation among major social media platforms.

Union Involvement and Alleged Retaliation

The situation escalated when Ouazar joined the United Tech and Allied Workers union and became a union representative. She believes her union activities led to targeted bullying and harassment. "It took me some time, I would say a few months, to see the pattern," she recalls.

Ouazar reports that her performance ratings were downgraded from the highest to the lowest level without adequate explanation, even after she raised formal grievances. Other employees she helped recruit to the union began experiencing similar treatment, suggesting a pattern of retaliation against union members.

Restructuring and Redundancies

During TikTok's major restructuring of content moderation operations last year, Ouazar's team was notified they were at risk of redundancy. Of the 24 individuals identified, 11 ultimately lost their positions. According to the legal claim, all those dismissed had been openly involved in union activities at TikTok.

Stella Caram, head of legal at Foxglove, which is representing the former workers, states: "In this case specifically, we want compensation for the workers. They have been unlawfully dismissed because they were engaging with union activities. We wanted to make this a precedent because we've seen a lot of this happening across the world."

TikTok's Response

TikTok has firmly rejected the allegations. In a statement to Sky News, the company said: "We strongly reject these baseless and inaccurate claims. We have made ongoing enhancements to our safety technologies and content moderation, which are borne out by the facts: a record rate of violative content removed by automated technology (91%) and record volume of violative content removed in under 24 hours (95%)."

This legal action represents a significant challenge to TikTok's UK operations, raising questions about workplace culture, union rights, and content moderation practices in the technology industry. The outcome could set important precedents for how social media companies manage their workforce and respond to employee organisation efforts.