A wave of criticism has broken out about the trauma content reviewers endure on the job
In the wake of the 2016 presidential election, Facebook Inc. rushed to expand efforts to police its platforms, trying to keep political misinformation, graphic violence, terrorist propaganda, and revenge porn off its products. The effort has entailed both new technology and thousands of new workers. Facebook now has about 15,000 content reviewers, almost all of whom work not for Facebook itself but for staffing firms like Accenture and Cognizant.
The company’s decision to outsource these operations has been a persistent concern for some full-time employees. After a group of content reviewers working at an Accenture facility in Austin, Texas, complained in February about not being allowed to leave the building for breaks or answer personal phone calls at work, a wave of criticism broke out on internal messaging boards. “Why do we contract out work that’s obviously vital to the health of this company and the products we build?” wrote one Facebook employee. (Bloomberg News viewed dozens of messages about the topic, on the condition that it not publish the names of people involved; Business Insider first reported the internal criticism earlier this month.)
A Facebook spokeswoman said there has been no change in policies at the facility in Austin, and that the company has been working with Accenture to ensure practices there comply with Facebook policies. Accenture referred questions about the program to Facebook.
The pressure on the company doesn’t seem likely to subside. Over the years, a stream of media reports has detailed one of the Internet’s most dystopian jobs. The most recent example came on Monday, when The Verge published a lengthy account from several Cognizant employees working in Phoenix. They described the trauma of being presented with an endless procession of graphic violence and disturbing sexual activities, and said the restrictive working conditions further aggravated their stress.
Legally, Facebook believes it is insulated from much of what goes on in outsourcing centers like the ones in Austin and Phoenix. Selena Scola, a content moderator working for a company called Pro Unlimited, sued Facebook in September, saying the company was responsible for her post-traumatic stress disorder. In a court filing, Facebook responded that Scola had no right to sue because she was an independent contractor. It argued that any harm she suffered was either her own fault or the fault of unnamed third parties. The case is pending.
But legal cover isn’t the only consideration. Over the weekend, Facebook circulated an explanation on internal message boards, trying to dispel employee concerns and detailing how it planned to address questions about how staffing companies treat their employees. The message, posted publicly to Facebook’s blog on Monday, was written by Justin Osofsky, Facebook’s vice president of global operations. It said that outsourcing content review was the only way the company could scale quickly enough to meet its needs. “Given the size at which we operate and how quickly we’ve grown over the past couple of years, we will inevitably encounter issues we need to address on an ongoing basis,” he wrote.
Facebook says it plans to review its contracts with staffing firms, requiring them to provide adequate facilities and mental health resources. It also plans regular site visits to make sure staffing firms are adhering to the requirements. The company plans a wider review of its relationships with contracting firms, starting with a summit with vendors in April. “We will regularly evaluate these roles, our needs going forward, the risks, location, mix of the workforce, and many more areas,” wrote Osofsky.
Even in the best of conditions, content review is a brutal job. Bloomberg News spoke with three current content moderators employed by Accenture, all of whom requested anonymity because they had signed non-disclosure agreements. “It’s a strain. I don’t know what I’m going to see,” one said. “I don’t have a problem with nudity – that’s what I signed up for – but then there are random beheadings.” Another said that moderators periodically discuss their daily “body counts,” the number of dead bodies they see per shift.
Facebook and Accenture both provide access to “wellness” resources to help moderators deal with the stress. Facebook is also running a pilot program at one facility that offers virtual reality-based treatments for stress, a strategy that has proven effective in other contexts.
All three content moderators who spoke to Bloomberg News said they were reluctant to come forward for help. “Because of the whole contractor situation, they can fire you for any reason,” said one of them. One said he tried meditation to deal with the stress of the job; another said he regularly visited Reddit pages featuring cute animal photos. Many reviewers try to pay as little attention to the content as possible, listening to music or streaming movies while they work. Alcohol and marijuana use is common both on and off the premises, they said.
A Facebook spokesperson said the company was reviewing how Accenture provided resources to deal with stress, and added that drug and alcohol use on the job violates its contracts.
The company sees hiring all content reviewers directly as a bad use of resources, said two people familiar with Facebook’s strategy who requested anonymity because they weren’t authorized to speak publicly. They added that using staffing firms would allow the company to avoid layoffs in the future.
One argued that people hold an idealized view of the career path once available at companies like Kodak, where janitors could rise through the ranks to become vice presidents, while forgetting that Kodak’s unsustainable cost structure, in this person’s view, left it unable to respond to changes in the market. Eventually, this person said, Facebook plans to offload an increasing proportion of content moderation to automated systems. This echoes comments Chief Executive Officer Mark Zuckerberg has made when employees have raised content moderation at staff meetings.
How much human intervention will be needed in the long run is a matter of debate. One content reviewer who has worked for Facebook through Accenture for several months said that even over that span there has been a noticeable difference in how well automated filters catch fake accounts that behave in particular ways. But new behaviors have already emerged to fool the computers. “There’s no way you could train a robot to understand those complexities,” he said.
On Facebook’s message boards, employees said that the specter of automating away content review jobs should make the company more sensitive to the lives of the humans doing the job in the meantime. “It seems like the least we could do is treat people well before they get replaced,” one of them wrote.
Copyright Bloomberg News