MPs tear into social media giants for failing to take down online hate
Social media companies have “shamefully” failed to police content online, according to MPs
Social media companies have “shamefully” failed to police content online and should face fines if they do not take down hateful or illegal material, MPs have said.
The Home Affairs Committee said internet giants had failed to remove child abuse images, terrorist propaganda, and anti-Semitic material from their sites, even after they were reported.
Describing their response so far as a “disgrace”, the MPs called for legislation to introduce a system of “escalating sanctions”, including fines, to force the companies to take down illegal content within a strict timeframe.
Today’s report also suggested requiring the businesses to contribute to the cost of running the Metropolitan Police’s counter-terrorism internet referral unit, as it compared the situation to football clubs paying towards the cost of policing matches.
And it called for a wider government review of the legislation around hate speech, harassment and extremism online.
The committee heard evidence from Facebook, Twitter and Google as part of its inquiry.
Yvette Cooper, the Labour MP who chairs the committee, said the Government could no longer “afford to turn a blind eye” to the problem.
“Social media companies’ failure to deal with illegal and dangerous material online is a disgrace,” she said.
“They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful.
“These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people's lives. This isn't beyond them to solve, yet they are failing to do so.
“They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe.”
Tory Home Secretary Amber Rudd said she welcomed the report, but declined to commit to any specific action at this stage.
“We have made it very clear that we will not tolerate the internet being used as a place for terrorists to promote their vile views, or use social media platforms to weaponise the most vulnerable people in our communities,” she added.
“Last month I convened a meeting with the social media companies to ask them to go further in making sure this kind of harmful material is not available on their platforms, and an industry-led forum has now been set up to more robustly address this.
“We will continue to push the internet companies to make sure they deliver on their commitments to further develop technical tools to identify and remove terrorist propaganda and to help smaller companies to build their capabilities. I expect to see early and effective action."