LONDON – Many Facebook content moderators who work at outsourcing firms like Accenture, CPL, Hays, and Voxpro have been leaving to take in-house roles at TikTok, according to LinkedIn analysis by CNBC.
The outsourcing firms are under contract to Facebook, and the social media giant refers to them as contractors.
Content moderation has become one of the biggest challenges for social media companies. Firms like Facebook, Instagram, TikTok, Twitter, YouTube and Quora use a combination of software and thousands of humans to spot and remove videos, photos and other content that breaches their rules. That includes everything from posts by President Donald Trump to hate speech, terrorism, child abuse, self-harm, nudity, and drug abuse.
Over 25 people have left roles where they worked on Facebook content to join TikTok, according to LinkedIn analysis. Their reasons for leaving are unclear as they did not respond to a CNBC request for comment.
A Facebook spokesperson told CNBC: “Our content reviewers play an important role in keeping our platform safe for billions of users. That’s why we ensure our partners provide competitive pay and benefits, high-quality work environments, and the training, coaching, guidance and support necessary to successfully review content.”
TikTok declined to comment on this story, while CPL did not immediately respond to a request for comment.
Moderating Facebook content
Dublin-based Chris Gray, 54, moderated Facebook content for 10 months while at CPL, which helps global tech firms to establish their operations in Ireland and recruit tech workers in that country and across Europe. He told CNBC that it was a “terrible job,” adding that TikTok looks “a lot better,” partly because there isn’t as much extreme content being uploaded there yet.
While at CPL, Gray had to deal with the “nasty stuff” on a night shift that ran from 6 p.m. until 2 a.m. He was paid 12.98 euros ($15.39) up until 8 p.m., when the pay went up 25%. “That’s not a lot of money,” he said, adding that Dublin is one of the most expensive cities in Europe. Employees were fed food that had been shipped in from the nearby Facebook office and reheated, Gray said.
The CPL office itself was bright and airy with six-foot yellow emojis painted on the wall. “It all seems very cool and hip but after a while you just sink into this morass of despair,” Gray said.
During his shift, he’d review about 100 pieces of content an hour that had been reported as violent, disturbing, racist, or hateful. “You make your decision on that and as soon as you hit the button, the next piece of content loads,” said Gray, who now works as a tour guide.
“It could be people being unloaded from a truck somewhere in the Middle East and lined up by a trench and machine gunned or it might be Dave and Dorine have broken up and they’re having a bit of a spat and making claims about who’s a junkie and who is a slut,” Gray continued, adding that some people use the reporting tool “as a weapon against each other.”
Gray is one of dozens of contractors from across the EU who are in the process of suing Facebook and CPL for post-traumatic stress disorder (PTSD). He wants a judge to rule that “Facebook didn’t take care of people and that they have been willfully blind to what was going on.”
TikTok “hire content moderators in-house, not through a staffing agency,” Gray said. “Plus, they may have a proper system in place to mitigate it (PTSD) as the problem emerges. Facebook is very much in denial about the dangers.”
In order for outsourcing companies to moderate Facebook content, Facebook requires that they provide access to on-site counseling during all hours of operation. It also requires them to provide access to a staffed 24-hour help hotline for in-the-moment psychological support.
Facebook is also looking at technical solutions that can limit exposure to graphic material as much as possible, including tools that blur graphic images by default before they’re reviewed by a moderator.
TikTok’s rapid growth
Theo Bertram, TikTok’s director of government relations and public policy in Europe, told British politicians in September that TikTok now has over 10,000 people working on trust and safety worldwide.
Owned by China’s ByteDance, TikTok has recently set up what it calls “trust and safety hubs” in San Francisco, Singapore, and Dublin. Moderators in these offices are responsible for keeping inappropriate content off the app, which has been downloaded over 2 billion times, according to app tracking firm SensorTower.
With millions of pieces of content uploaded to TikTok every week, it’s a big job that requires lots of people.
“If there’s one company that knows how to ruthlessly poach staff from rivals it’s ByteDance,” said Matthew Brennan, a China-based social media analyst who has just written a book on TikTok and ByteDance.
“They won’t think twice about swooping in to take advantage of Facebook’s difficulties. All’s fair in love, war and business,” he told CNBC.
Dublin hub
Many of those who have left Facebook moderating roles to join TikTok are located in Dublin, where both companies have large moderation teams.
TikTok announced Wednesday that it is planning to hire at least 200 people in Ireland over the next three months, taking its total headcount in the country from 900 to over 1,100 by January 2021.
The company opened a trust and safety hub in Dublin at the start of 2020. At launch, there were fewer than 20 people in the office, but there are now almost 600 people based there, said Cormac Keenan, head of trust and safety at TikTok, in a blog post.
“Today, over 100 million people in Europe are active on TikTok every month and we want to ensure that as this community continues to grow, we are doing everything we can to keep TikTok a safe space,” he said.
Remote working
Last month, The Guardian reported that Facebook moderators at CPL were being forced to work in a Dublin office despite a high-tier lockdown, while Facebook’s own employees worked from home.
Gray said he used to review Facebook content from home and that there’s no technical reason to prevent people from working remotely. “I think they accept internally that this stuff carries a risk,” he said. “If you’re sitting at home viewing child porn, as opposed to being in the office supervised with access to their wellness team, then there’s a higher risk that they’re going to be accused of endangering people. But they can’t admit that because that means that they jeopardize their legal defense.”
In a statement to The Guardian, CPL said: “Our employees carry out extremely important work, keeping the Facebook platform safe. They are positively contributing to society in the work that they do in ensuring the safety of our online communities, and their roles are deemed essential.”
“The health and safety of our employees is our top priority and we review each employee’s situation on a case-by-case basis. Our employees work in a state of the art office which is operating at 25% capacity to facilitate strict social distancing. We are providing private transport to and from the office, so employees do not need to take public transport.”
CPL announced to the stock market last week that it has been acquired by Japanese firm Outsourcing Inc. for 318 million euros.
Source: CNBC