MPs tear into social media giants for failing to take down online hate

Written by Josh May on 1 May 2017 in News

Social media companies have “shamefully” failed to police content online, according to MPs

Social media companies have “shamefully” failed to police content online and should face fines if they do not take down hateful or illegal material, MPs have said. 

The Home Affairs Committee said internet giants had failed to remove child abuse images, terrorist propaganda, and anti-Semitic material from their sites, even after they were reported.

Describing their response so far as a “disgrace”, the MPs called for legislation to introduce a system of “escalating sanctions”, including fines, to force the companies to take down illegal content within a strict timeframe.


Today’s report also suggested requiring the businesses to contribute to the cost of running the Metropolitan Police’s counter-terrorism internet referral unit, as it compared the situation to football clubs paying towards the cost of policing matches.  

And it called for a wider government review of the legislation around hate speech, harassment and extremism online.

The committee heard evidence from Facebook, Twitter and Google as part of its inquiry. 

Yvette Cooper, the Labour MP who chairs the committee, said the Government could no longer “afford to turn a blind eye” to the problem.

“Social media companies’ failure to deal with illegal and dangerous material online is a disgrace,” she said.

“They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful.

“These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people's lives. This isn't beyond them to solve, yet they are failing to do so.

“They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe.”

Tory Home Secretary Amber Rudd said she welcomed the report, but declined to commit to any specific action at this stage.

“We have made it very clear that we will not tolerate the internet being used as a place for terrorists to promote their vile views, or use social media platforms to weaponise the most vulnerable people in our communities,” she added.

“Last month I convened a meeting with the social media companies to ask them to go further in making sure this kind of harmful material is not available on their platforms, and an industry-led forum has now been set up to more robustly address this.

“We will continue to push the internet companies to make sure they deliver on their commitments to further develop technical tools to identify and remove terrorist propaganda and to help smaller companies to build their capabilities. I expect to see early and effective action."
