(Reuters) - Deciding its first-ever cases, Facebook Inc’s oversight board ruled on Thursday that the social media company was wrong to remove four of five pieces of content the board reviewed, including posts Facebook took down for violating rules on hate speech and harmful COVID-19 misinformation.
The first rulings will be scrutinized to see how independent the board appears from the world’s largest social media platform and how it may rule in the future, particularly ahead of its high-profile decision on whether Facebook was right to suspend former U.S. President Donald Trump.
Facebook blocked Trump’s access to his Facebook and Instagram accounts over concerns of further violent unrest following the Jan. 6 storming of the U.S. Capitol by the former president’s supporters. The board said the Trump case would be opened for public comment on Friday and that he had not yet provided a statement to the board.
Facebook said it would abide by the board’s decisions. The group, which was created by Facebook in response to criticism of the way it treats problematic content, also called for the company to be clearer about its rules on what is allowed on its platforms.
Here is the full list of the board’s rulings:
* A post from a user in Myanmar with photos of a deceased child that included commentary on a perceived inconsistency between Muslims’ reactions to killings in France and to China’s treatment of Uighur Muslims.
* An alleged quote from Nazi propaganda minister Joseph Goebbels that Facebook removed for violating its policy on “dangerous individuals and organizations.”
* A post in a group claiming certain drugs could cure COVID-19, which criticized the French government’s response to the pandemic. This case was submitted by Facebook, rather than a user.
* Instagram photos showing female nipples that the user in Brazil said aimed to raise awareness of breast cancer symptoms. Facebook had also said this removal was an error and restored the post.
* A post that purported to show historical photos of churches in Baku, Azerbaijan, with a caption that Facebook said indicated “disdain” for Azerbaijani people and support for Armenia.
Facebook now has seven days to restore the pieces of content that the board ruled should not have been taken down. The board said it would shortly announce one more decision from its first batch, as well as the next round of cases.
The board also issued nine nonbinding policy recommendations - for example, that Facebook should tell users the specific rule they have violated and better define its rules on issues like dangerous groups and health misinformation. Facebook does not have to act on these, but it must respond publicly within 30 days.
“We can see that there are some policy problems at Facebook,” said board member Katherine Chen in an interview. “We want their policy to be clear - especially those policies involved in human rights and freedom of speech. They have to be precise, accessible, clearly defined,” she added.
In a blog post responding to the decisions, Facebook said it would publish updated COVID-19 misinformation policies. However, it said it would not change its approach to removing misinformation during the global pandemic.
Facebook has long faced criticism for high-profile content moderation issues.
The board said on Thursday that it had received 150,000 appeals since it started accepting cases in October. It will rule on a limited number of controversial decisions.
The board has 20 members including former Danish Prime Minister Helle Thorning-Schmidt and Nobel Peace Prize laureate Tawakkol Karman.
Some Facebook critics and civil rights groups slammed the board’s rulings. A group dubbed The Real Facebook Oversight Board said the decisions showed “deep inconsistencies and troubling precedent for human rights.”
Eric Naing, a spokesman for Muslim Advocates, also a member of the group, said that “instead of taking meaningful action to curb dangerous hate speech on the platform, Facebook punted responsibility” and that the board’s ruling had reinstated “a dangerous, anti-Muslim post in Myanmar.”
The board hears cases from users who have exhausted the company’s appeals process over content removed from Facebook’s platforms, not content that has been left up. The board’s limited remit has been the subject of criticism. Facebook itself can ask the board to review a wider range of content problems.
Facebook has pledged $130 million to fund the board for at least six years.
Reporting by Elizabeth Culliford; Editing by Kenneth Li, Cynthia Osterman and Steve Orlofsky