BERLIN (Reuters) - The German parliament approved a plan on Friday to fine social media networks up to 50 million euros ($57 million) if they fail to remove hateful postings promptly, despite concerns the law could limit free expression.
Germany has some of the world’s toughest laws covering defamation, public incitement to commit crimes and threats of violence, with prison sentences for Holocaust denial or inciting hatred against minorities. But few online cases are prosecuted.
The law gives social media networks 24 hours to delete or block obviously criminal content and seven days to deal with less clear-cut cases, with an obligation to report back to the person who filed the complaint about how they handled the case.
Failure to comply could see a company fined up to 50 million euros, and the company’s chief representative in Germany fined up to 5 million euros.
The Central Council of Jews in Germany, in a statement, hailed the law as “the logical next step for effectively tackling hate speech since all voluntary agreements with the platform providers have been virtually unsuccessful.”
German Justice Minister Heiko Maas said the measure to “end the internet law of the jungle” was long overdue and dismissed suggestions that it would infringe freedom of speech.
The issue has taken on more urgency amid concerns in Germany that proliferating fake news and racist content, particularly targeting migrants, could sway public opinion in the run-up to a national election due on Sept. 24.
But organizations representing digital companies, consumers and journalists have accused the government of rushing a law through parliament that could damage free speech.
Facebook, which has 29 million active users in Germany - more than a third of the total population - has said it is working hard to remove illegal content, deleting 3,500 posts per week in Germany in the past two months.
“This law as it stands now will not improve efforts to tackle this important societal problem,” a spokesman said, adding Facebook did not think it had been consulted enough.
Facebook noted that in May it had announced plans to add an extra 3,000 workers around the world over the next year to monitor reports of inappropriate material, in addition to 4,500 people already reviewing posts.
In Berlin, Facebook’s partner Arvato will employ up to 700 staff for “content moderation” by the end of the year.
A German government survey has shown that Facebook deleted just 39 percent of content deemed criminal and Twitter only 1 percent, even though they had signed a code of conduct including a pledge to delete hate speech within 24 hours.
However, Facebook says it has significantly improved its processes since then and is now removing 87 percent of posts reported by German non-governmental organizations.
Twitter has also made a number of changes, including adding new filtering options, putting limits on accounts it had identified as engaging in abusive behavior and stopping those users from creating new accounts.
In response to criticism of the draft law, the government softened the legislation by excluding email and messenger providers and opening up the option of creating joint monitoring facilities to make decisions about what content to remove.
It also made clear that a fine would not necessarily be imposed after just one infraction, but only if a company systematically refuses to act or fails to set up a proper complaint management system.
Reporting by Thorsten Severin; Writing by Emma Thomasson and Andrea Shalal; Editing by Thomas Escritt and Gareth Jones