SEPTEMBER 17, 2019
The task of moderating Facebook continues to leave psychological scars on the company’s employees, months after efforts began to improve conditions for its thousands of contractors, the Guardian has learned.
A group of current and former contractors who worked for years at the social network’s Berlin-based moderation centres has reported watching colleagues become “addicted” to graphic content and hoard ever more extreme examples for a personal collection. They also said others were pushed towards the far right by the amount of hate speech and fake news they read every day.
They describe being ground down by the volume of the work, numbed by the graphic violence, nudity and bullying they have to view for eight hours a day, working nights and weekends, for “practically minimum pay”.
A little-discussed aspect of Facebook’s moderation was particularly distressing to the contractors: vetting private conversations between adults and minors that have been flagged by algorithms as likely sexual exploitation.
Such private chats, of which “90% are sexual”, were “violating and creepy”, one moderator said. “You understand something more about this sort of dystopic society we are building every day,” he added. “We have rich white men from Europe, from the US, writing to children from the Philippines … they try to get sexual photos in exchange for $10 or $20.”
Gina, a contractor, said: “I think it’s a breach of human rights. You cannot ask someone to work fast, to work well and to see graphic content. The things that we saw are just not right.”
The workers, whose names have been changed, were speaking on condition of anonymity because they had signed non-disclosure agreements with Facebook. Daniel, a former moderator, said: “We are a sort of vanguard in this field … It’s a completely new job, and everything about it is basically an experiment.”
John, his former colleague, said: “I’m here today because I would like to avoid other people falling into this hole. As a contemporary society, we are running into this new thing – the internet – and we have to find some rules to deal with it.
“It’s important to create a team, for example in a social network, aiming to protect users from abusers, hate speech, racial prejudice, [pornography], etc. But I think it’s important to open a debate about this job. We need to share our stories, because people don’t know anything about us, about our job, about what we do to earn a living.”
Some of the moderators’ accounts echoed problems reported in other countries. Daniel said: “Once, I found a colleague of ours checking online, looking to purchase a Taser, because he started to feel scared about others. He confessed he was really concerned about walking through the streets at night, for example, or being surrounded by foreign people.
“Maybe because all this hate speech we have to face every day affects our political view somehow. So a normal person, a liberal person, maybe also a progressive person, can get more conservative, more concerned about issues like migrants for example. Indeed, many of the hate speech contents we receive on a daily basis are fake news … which aim to share very particular political views.”
In February, the technology site the Verge produced one of the first behind-the-scenes reports from a US Facebook contractor. Like their Berlin colleagues, the American moderators reported that “the conspiracy videos and memes that they see each day gradually led them to embrace fringe views”, and that a former moderator “now sleeps with a gun at his side” after being traumatised by a video of a stabbing.
Others were dealing with trauma by self-medicating. Just as the Arizona moderators were reportedly turning to drugs and alcohol, so were those in Germany. “I saw a lot of big consumer drugs in the company,” Daniel said. “We don’t have any way to destress. The company, technically, is against drugs.”
When they sought a more legitimate form of help, the American moderators complained about the psychological support that was provided. “The on-site counsellors were largely passive,” the Verge reporter Casey Newton wrote, “relying on workers to recognise the signs of anxiety and depression and seek help.”
Berlin moderators were also critical of the counselling services provided, suggesting the on-site support leaned too heavily on the state’s universal healthcare system.
Daniel said: “In the end, we didn’t have proper psychological support. We had some colleagues who went to the [counsellor], and when they showed that they had real problems, they were invited to go outside the company and find a proper psychologist.”
The Verge report appeared to trigger reforms. Moderators in Berlin said after the article was published there had been immediate interest from Facebook’s head office in their workload. Previously, they had been required to moderate 1,000 pieces of content a day – more than one every 30 seconds over an eight-hour shift.
In February, an official from Facebook’s Dublin office visited, John said. “This person after this meeting decided to take off the limit of 1,000. We didn’t have any limit for a while, but now they have re-established another limit. The limit now is between 400 and 500 tickets.” The new cap was about half the previous one, but it still required workers to handle roughly a ticket a minute. That volume of work, however, was what their American colleagues had faced before the reforms.
Berlin moderators have discussed whether to seek help from the unions, but say the nature of the work makes it difficult. Gina said: “I wouldn’t say no one is interested, but no one has the possibility to do something for real.”
John added: “They are so tired.”
While the moderators agreed such work was necessary, they said the problems were fixable. For John, the solution was simple: “hire more people”.
In a statement, Facebook said: “Content moderators do vital work to keep our community safe, and we take our responsibility to ensure their wellbeing incredibly seriously. We work closely with our partners to ensure they provide the support people need, including training, psychological support and technology to limit their exposure to graphic content.
“Content moderation is a new and challenging industry, so we are always learning and looking to improve how it is managed. We take any reports that our high standards are not being met seriously and are working to look into these concerns.”
Courtesy/Source: The Guardian