LONDON (Reuters) - Britain proposed new online safety laws on Monday that would slap penalties on social media companies and technology firms if they fail to protect users from harmful content.
Easy access to damaging material, particularly among young people, has caused growing concern worldwide and came into the spotlight in Britain after the death of 14-year-old schoolgirl Molly Russell, which her parents said came after she had viewed online material on depression and suicide.
Internet companies could face big fines, with bosses also held personally accountable, under rules to be policed by an independent regulator.
In the most serious cases companies could also be banned from operating in Britain if they do not do everything reasonably practical to eradicate harmful content.
“We are putting a legal duty of care on these companies to keep users safe; and if they fail to do so, tough punishments will be imposed,” Prime Minister Theresa May said in a video posted online.
“The era of social media firms regulating themselves is over.”
Media Secretary Jeremy Wright said the proposed legislation - the toughest in the world - would apply to any company that allowed users to share or discover content or interact online, such as social media sites, discussion forums, messaging services and search engines.
Governments globally are wrestling over how to better control content on social media platforms, often blamed for encouraging abuse, the spread of online pornography and for influencing or manipulating voters.
Global worries were stoked by the live streaming in March of the mass shooting at a mosque in New Zealand on one of Facebook’s platforms, after which Australia said it would fine social media and web-hosting companies and imprison executives if violent content is not removed “expeditiously”.
techUK, an industry trade group, said the white paper was a significant step forward, but one that needed to be firmed up during its 12-week consultation. It said some aspects of the government’s approach were too vague.
“It is vital that the new framework is effective, proportionate and predictable,” techUK said in a statement, adding that not all concerns could be addressed through regulation.
Facebook said it was looking forward to working with the government to ensure new regulations were effective, repeating founder Mark Zuckerberg’s line that regulations were needed to have a standard approach across platforms.
Rebecca Stimson, Facebook’s head of UK public policy, said any new rules should strike a balance between protecting society and supporting innovation and free speech.
“These are complex issues to get right and we look forward to working with the government and parliament to ensure new regulations are effective,” Stimson said in a statement.
Prime Minister May said that while the internet could be brilliant at connecting people, it had not done enough to protect users, especially children and young people.
“We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe,” she said in a statement.
The duty of care would make companies take more responsibility for the safety of users and tackle harm caused by content or activity on their services. The regulator, funded by industry in the medium term, will set clear safety standards.
A committee of lawmakers has also demanded that more is done to make political advertising and campaigning on social media more transparent.
“It is vital that our electoral law is brought up to date as soon as possible, so that social media users know who is contacting them with political messages and why,” said Damian Collins, a Conservative MP who chairs the parliamentary committee for digital, culture, media and sport.
“Should there be an early election, then emergency legislation should be introduced to achieve this.”
Reporting by Elizabeth Piper and Paul Sandle; Editing by David Holmes and David Goodman