UK Government outlines ‘duty of care’ for social media companies
Social media companies could face massive fines or be blocked from the UK altogether if they fail to remove harmful or illegal content from their platforms, under new measures outlined by the UK Government.
In a new white paper, the Prime Minister said the proposals would ensure firms act to stop child abuse and terrorist content from being spread in order to make the UK “the safest place in the world to be online”.
Content that encourages suicide or that is deemed disinformation or cyber-bullying will also be targeted, while new measures to stop children from accessing inappropriate material will be brought in.
Among the rules proposed in the initial 12-week consultation is a mandatory ‘duty of care’, which will force companies to take “reasonable steps” to keep users safe and tackle illegal and harmful activity on their platforms.
A new regulator with enforcement tools is set to be given the power to impose “substantial fines” or, in the most serious cases, to block access to sites and potentially take action against the companies themselves.
Sites would also be obliged to hand over “annual transparency reports” on the amount of harmful content on their platforms and what they are doing to fix the issue.
Codes of practice issued by the regulator will meanwhile outline a responsibility for companies to “minimise the spread of misleading and harmful disinformation with dedicated fact checkers”, particularly during election periods.
Furthermore, firms will be obliged to respond to users’ complaints quickly.
Announcing the plans, Prime Minister Theresa May said that companies had “for too long” neglected to protect users, especially children and young people.
“That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.
“Online companies must start taking responsibility for their platforms, and help restore public trust in this technology.”
File hosting sites, public discussion forums, messaging services, and search engines will also fall under the remit of the new rules.
Digital Secretary Jeremy Wright said of the plans: “The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.
“Tech can be an incredible force for good and we want the sector to be part of the solution in protecting their users. However, those that fail to do this will face tough action.”
Home Secretary Sajid Javid added: “Despite our repeated calls to action, harmful and illegal content – including child abuse and terrorism – is still too readily available online.
“That is why we are forcing these firms to clean up their act once and for all. I made it my mission to protect our young people – and we are now delivering on that promise.”
But Shadow Digital Secretary Tom Watson said the plans set out in the white paper would not address “market failure”.
The Labour frontbencher warned: “Labour have been calling for a new regulator with tough powers to bring social media companies into line for the last year. The public and politicians of all parties agree these platforms must be made to take responsibility for the harms, hate speech and fake news they host.
"The concern with these plans is that they could take years to implement. We need action immediately to protect children and others vulnerable to harm."
He added: “These plans also seem to stop short of tackling the overriding data monopolies causing this market failure and do nothing to protect our democracy from dark digital advertising campaigns and fake news.
"This is a start but it’s a long way from truly reclaiming the web and routing out online harms."