Social Media (3)

Ask Parliament's Home Affairs Select Committee whether social media companies have a responsibility to tackle extremism, and you'll receive a simple answer: "Yes."

The group of MPs accused Facebook, Twitter and YouTube, which is owned by Google, of "consciously failing" to prevent groups using their platforms to promote extremist views.

The MPs said: "Networks like Facebook, Twitter and YouTube are the vehicle of choice in spreading propaganda and they have become the recruiting platforms for terrorism."

With such a vast number of people accessing these online platforms, it is widely accepted that social media companies need to take responsibility. Facebook has 1.71 billion monthly users, Twitter another 313 million, and 800 million people access YouTube every month.

However, the three firms not only accept that they have a part to play, they stress that they take an important and active role in the fight against extremism and terrorism. And industry body techUK suggested that the committee had painted "an inaccurate picture" of how much was being done.

The criticism comes after revelations surfaced during the retrial of radical cleric Anjem Choudary. During the trial, the court heard that British police had requested that content and accounts linked to Choudary be removed, but the social media companies did not act on all of the requests made.

Twitter said it has closed 360,000 accounts since last summer because of extremist content.

Speaking to the BBC, Simon Milner, director of policy for Facebook UK, said: "Terrorists and the support of terrorist activity are not allowed on Facebook and we deal swiftly and robustly with reports of terrorism-related content.

"In the rare instances that we identify accounts or material as terrorist, we'll also look for and remove relevant associated accounts and content."

A YouTube spokesperson said: "We take our role in combating the spread of extremist material very seriously. We remove content that incites violence, terminate accounts run by terrorist organisations, and respond to legal requests to remove content that breaks UK law.

"We'll continue to work with government and law enforcement authorities to explore what more can be done to tackle radicalisation."

techUK's Charlotte Holloway defended the social media firms, insisting that they had given "significant resources, a zero-tolerance approach, and decisive and fast action when needed".

She added: "Tech companies work proactively to deal with online extremism daily, in constructive and proven partnerships with a wide range of policy-makers, the police and security agencies, and wider civil society bodies."