(Reuters) - Facebook Inc CEO Mark Zuckerberg said on Wednesday that the company was moving moderation of its most sensitive content, including self-harm and counterterrorism material, from contractors to full-time employees during the coronavirus outbreak.
He said on a call with reporters that the goal was to have contractors, who work under third-party vendors, moderate less emotionally taxing content while working at home.
Zuckerberg said it would take time to set guidelines for contractors working from home, such as providing mental health support for moderators and putting data privacy protections in place. He also said Facebook would increase the number of full-time employees doing this work.
The social media giant has drawn public criticism for encouraging its employees to work from home while its content moderation contractors were forced to come into offices in cities around the world. On Monday, Facebook said it planned to rely more on automated tools to take down offensive content and work with contract vendors to send content reviewers home with pay.
Social media companies face security and privacy issues when content is reviewed in less secure environments, like private homes.
The major shift in Facebook’s review operations occurs as the social media company faces pressure to police content violations. Zuckerberg acknowledged that role changes and reliance on automation would likely reduce the company’s effectiveness at moderating certain types of content.
Facebook content moderators working for third-party vendors, including Genpact Ltd and Accenture PLC, told Reuters this week that their employers had been scrambling to decide whether moderators should come into the office and how they might access review tools remotely.
A contractor in Hyderabad, India, who moderates counterterrorism content, told Reuters earlier this week that they had been told it was “business as usual” from home. They said some Genpact contractors had been working remotely since Wednesday, while others were waiting for company-provided laptops to access the content moderation tool via VPN.
Facebook said this week that more videos and other content might be erroneously removed for policy violations, as it relies more on automated takedown software. But one Accenture contractor speaking to Reuters earlier on Wednesday said there was still a fear “that three weeks into the coronavirus lockdown, somebody’s going to come up with an AI that can do it all and we’ll get an email saying you don’t have to come back to work because you don’t have a job.”
Reporting by Munsif Vengattil in Bengaluru, Elizabeth Culliford in London and Katie Paul in San Francisco; Editing by Arun Koyyur, Greg Mitchell and Cynthia Osterman