Hundreds of content moderators in Germany, who remove harmful material from platforms such as Facebook and TikTok, are calling on politicians to improve their working conditions, citing tough performance targets and mental health problems.
The Bundestag's Digital Council will hear from Cengiz Haksöz, a content moderator at TELUS International, on Wednesday afternoon. He is expected to tell lawmakers that reviewing harmful content left him “mentally and emotionally drained.”
TELUS International moderates content for Facebook, among other platforms.
Facebook owner Meta (META.O) and ByteDance's TikTok rely on thousands of content moderators worldwide to remove harmful content, including child sexual abuse material and graphic violence.
Haksöz will submit a petition signed by more than 300 content moderators in Germany, calling for better mental health support, a ban on non-disclosure agreements (NDAs), and improved pay and benefits.
“I was told the company had mental health support, but it doesn't. It's more like coaching,” Haksöz told Reuters in an exclusive interview ahead of his Bundestag appearance.
“This is a serious job, and it has serious consequences for workers. This job has changed me,” he said, adding that outsourcing firms allow tech giants to evade their responsibilities.
The working conditions of Meta's content moderators have come under scrutiny before. In 2020, the company agreed to pay $52 million to settle a lawsuit brought by content moderators in the United States who developed long-term mental health problems on the job.
“Without us, social media companies would collapse overnight,” reads the petition, which was seen by Reuters. “Social media is not safe until our workplaces are safe and fair.”
Meta and TELUS International declined to comment.