
The European Commission must amend the regulation on terrorist content online to protect fundamental rights
04 Dec 18
Berlaymont building in Brussels, Belgium, which houses the headquarters of the European Commission. Credit: Kevin White / Flickr

On 12 September, the European Commission published a proposal for a Regulation on preventing the dissemination of terrorist content online. The proposal is very problematic from a fundamental rights and free expression perspective. Index on Censorship joins others in highlighting these concerns.  

Dear Ministers,

The undersigned organisations are dedicated to protecting fundamental human rights, including the right to freedom of expression and information, both online and offline. We urge you to significantly amend the ‘Regulation on preventing the dissemination of terrorist content online’, proposed by the European Commission on 12 September 2018, to bring it in line with the Charter of Fundamental Rights, and to propose evidence-based measures that can better achieve the Regulation’s stated goals.

Preventing and countering terrorism, regardless of the ideological, political or religious motivations of the perpetrators, is a legitimate and important goal for European governments that seek to protect liberty and security for individuals and societies. EU Member States and institutions are taking numerous initiatives that aim to counter the threat of violence, including addressing content online that is perceived as promoting terrorism.

One such initiative is the Directive on Combating Terrorism, adopted in March 2017. This Directive has provisions which cover similar content to the Regulation currently being debated – notably in requiring Member States to ensure the “prompt removal of online content constituting a public provocation to commit a terrorist offence” – but its effectiveness has not yet been analysed due to a lack of implementation in all Member States. Without evidence to demonstrate that the existing laws and measures, and in particular the aforementioned Directive, are insufficient to address the harm of terrorist content online, the proposed Regulation cannot be deemed justified and necessary. EU institutions must always ensure that all legislation is evidence-based, appropriately balanced, and consistent with human rights requirements. The undersigned do not believe the proposed Regulation meets these criteria.

Several aspects of the proposed Regulation would significantly endanger freedom of expression and information in Europe:

  • Vague and broad definitions: The Regulation uses vague and broad definitions to describe ‘terrorist content’ which are not in line with the Directive on Combating Terrorism. This increases the risk of arbitrary removal of online content shared or published by human rights defenders, civil society organisations, journalists or individuals based on, among others, their perceived political affiliation, activism, religious practice or national origin. In addition, judges and prosecutors in Member States will be left to define the substance and boundaries of the scope of the Regulation. This would lead to uncertainty for users, hosting service providers, and law enforcement, and the Regulation would fail to meet its objectives.
  • ‘Proactive measures’: The Regulation imposes ‘duties of care’ and a requirement to take ‘proactive measures’ on hosting service providers to prevent the re-upload of content. These requirements for ‘proactive measures’ can only be met using automated means, which have the potential to threaten the right to free expression as they would lack safeguards to prevent abuse or provide redress where content is removed in error. The Regulation lacks the proper transparency, accountability and redress mechanisms to mitigate this threat. The obligation applies to all hosting services providers, regardless of their size, reach, purpose, or revenue models, and does not allow flexibility for collaborative platforms.
  • Instant removals: The Regulation empowers undefined ‘competent authorities’ to order the removal of particular pieces of content within one hour, with no authorisation or oversight by courts. Removal requests must be honoured within this short time period regardless of any legitimate objections platforms or their users may have to removal of the content specified, and the damage to free expression and access to information may already be irreversible by the time any future appeal process is complete.
  • Terms of service over rule of law: The Regulation allows these same competent authorities to notify hosting service providers of potential terrorist content that companies must check against their terms of service and hence not against the law. This will likely lead to the removal of legal content as company terms of service often restrict expression that may be distasteful or unpopular, but not unlawful. It will also undermine law enforcement agencies for whom terrorist posts can be useful sources in investigations.

The European Commission has not presented sufficient evidence to support the necessity of the proposed measures. The Impact Assessment accompanying the European Commission’s proposal states that only 6% of respondents to a recent public consultation have encountered terrorist content online. In Austria, which publishes data on unlawful content reports to its national hotline, approximately 75% of content reported as unlawful was in fact legal. It is thus likely that the actual number of respondents who have encountered terrorist content is much lower than the reported 6%. In fact, 75% of the respondents to the public consultation considered the internet to be safe.

The Regulation, as proposed, would introduce serious risks of arbitrariness and have grave consequences for freedom of expression and information, as well as for civil society organisations, investigative journalism and academic research, among other fields.

We urge Members of the European Parliament and Member State representatives to significantly amend the Regulation. In this regard, they should prioritise providing evidence for why this instrument is justified and necessary considering the recent adoption of the Directive on Combating Terrorism. If evidence proves the Regulation justified and necessary, it is imperative for the EU institutions to bring it in line with the Charter of Fundamental Rights, namely the right to privacy in Art. 7, to data protection in Art. 8 and to freedom of expression and information in Art. 11.

Signatories

Access Now

Apti

Bits of Freedom

Center for Democracy and Technology (CDT)

Chaos Computer Club

CILD

Committee to Protect Journalists (CPJ)

Dataskydd.net

Digitalcourage

Digital Rights Ireland

European Digital Rights (EDRi)

Electronic Frontier Finland

Electronic Frontier Foundation (EFF)

epicenter.works

Fitug

Free Knowledge Advocacy Group

Frënn vun der Ënn

Homo Digitalis

Human Rights Watch (HRW)

Index on Censorship

Initiative für Netzfreiheit

IT-Political Association of Denmark

Panoptykon

Reporters Without Borders

The Civil Liberties Union for Europe (Liberties)

Web Foundation

Wikimedia Foundation  

XNet

Signing in an individual capacity. Affiliation is for identification purposes only.

Daphne Keller
Director of Intermediary Liability
Center for Internet and Society
Stanford Law School

Joan Barata, PhD
Intermediary Liability Fellow
Center for Internet and Society
Stanford Law School