MPs tear into social media giants for failing to take down online hate
Social media companies have “shamefully” failed to police content online, according to MPs
Social media companies have “shamefully” failed to police content online and should face fines if they do not take down hateful or illegal material, MPs have said.
The Home Affairs Committee said internet giants had failed to remove child abuse images, terrorist propaganda, and anti-Semitic material from their sites, even after they were reported.
Describing their response so far as a “disgrace”, the MPs called for legislation to introduce a system of “escalating sanctions”, including fines, to force the companies to take down illegal content within a strict timeframe.
Today’s report also suggested requiring the businesses to contribute to the cost of running the Metropolitan Police’s counter-terrorism internet referral unit, as it compared the situation to football clubs paying towards the cost of policing matches.
And it called for a wider government review of the legislation around hate speech, harassment and extremism online.
The committee heard evidence from Facebook, Twitter and Google as part of its inquiry.
Yvette Cooper, the Labour MP who chairs the committee, said the Government can no longer “afford to turn a blind eye” to the problem.
“Social media companies’ failure to deal with illegal and dangerous material online is a disgrace,” she said.
“They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful.
“These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people's lives. This isn't beyond them to solve, yet they are failing to do so.
“They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe.”
Tory Home Secretary Amber Rudd said she welcomed the report, but declined to commit to any specific action at this stage.
“We have made it very clear that we will not tolerate the internet being used as a place for terrorists to promote their vile views, or use social media platforms to weaponise the most vulnerable people in our communities,” she added.
“Last month I convened a meeting with the social media companies to ask them to go further in making sure this kind of harmful material is not available on their platforms, and an industry-led forum has now been set up to more robustly address this.
“We will continue to push the internet companies to make sure they deliver on their commitments to further develop technical tools to identify and remove terrorist propaganda and to help smaller companies to build their capabilities. I expect to see early and effective action.”