The European Commission has warned Facebook, Google, YouTube, Twitter and other internet technology companies that they must do more to stem the spread of extremist content or face legislation.
Under growing pressure from European governments, the companies have made progress, significantly boosting the resources they dedicate to taking down extremist content as quickly as possible.
But on Wednesday the EU security commissioner Julian King said: “We are not there yet. We are two years down the road of this journey: to reach our final destination we now need to speed up our work.”
The remarks came at the closing of the third meeting of the EU Internet Forum, which brings together the commission, EU member states, law enforcement and technology companies.
The commission said that if it is not satisfied with further progress on the removal of extremist content by the technology companies, which are primarily based in the US, it will come forward with legislation next year to force the issue.
While an online hate speech law comes into effect in Germany on 1 January, the commission said it is keen to avoid a patchwork of national laws on the issue and favours a self-regulatory approach.
The Global Internet Forum, a group of technology companies, including Microsoft, Facebook, Twitter and YouTube, that pools resources to combat extremist content, said progress had been made with Europol. A shared database of known “terrorist” images and videos now contains more than 40,000 hashes, or digital signatures, allowing the content to be detected and removed more quickly, with the aim of doing so within one to two hours of it being uploaded.
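As an illustration of how hash-based matching of this kind can work, here is a minimal sketch in Python. It assumes simple cryptographic hashing (SHA-256) against a small in-memory set of known signatures; the shared industry database reportedly relies on more robust techniques, and the file names, hash values and function names below are hypothetical.

```python
import hashlib

# Hypothetical set of known content signatures (hex digests).
# In practice this would be the shared industry database, not a local set.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: str, chunk_size: int = 8192) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_content(path: str) -> bool:
    """Return True if the file's hash matches a known signature."""
    return sha256_of_file(path) in KNOWN_HASHES

if __name__ == "__main__":
    # Example: screen an uploaded file before it is published.
    print(is_known_content("upload.mp4"))
```

Exact cryptographic hashes like this only catch byte-identical copies; matching re-encoded or edited media requires perceptual hashing, which is one reason the companies pair the hash database with automatic detection technologies.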
“It is feasible to reduce the time it takes to remove content to a few hours,” said Dimitris Avramopoulos, EU home affairs commissioner. “There is a lot of room for improvement, for this cooperation to produce even better results, starting with the reporting from the companies, which must become more regular and more transparent.”
The commission wants companies to make greater use of automatic detection technologies and act faster on referrals from member states and Europol.
The Counter Extremism Project (CEP), a non-profit organisation, said efforts by the companies were encouraging but did not make a strong case for continued self-regulation.
“Instead, what the EU Internet Forum should aim to deliver are concrete, industry-wide policies on blocking or rapidly removing illegal content and consistent enforcement of those policies,” said David Ibsen, executive director of the CEP.