Content Regulation Update: Strict New Obligations for Online Platforms under the EU Regulation on Terrorist Content ((EU) 2021/784)

Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online (the “Regulation”) introduces strict new requirements for online ‘hosting service providers’ (“HSPs”) to remove flagged terrorist content within one hour, to provide reasons for the removal of content on request and to monitor their own compliance by submitting annual reports outlining the actions they have taken in this area. It will also introduce measures for Member States’ authorities to identify and ensure the quick removal of terrorist content online and to facilitate co-operation between Member States on this issue. It will apply from 7 June 2022.

What are the New Obligations for HSPs?

The Regulation requires HSPs to:

  1. remove terrorist content (including texts, images, sound recordings or videos, and live transmissions) as soon as possible and in any event within one hour of receiving a ‘removal order’ from a competent authority;
  2. in certain circumstances take specific measures regarding terrorist content on their platform;
  3. include certain items in their terms and conditions; and
  4. in certain circumstances reinstate removed content.

Content disseminated for educational, journalistic, artistic or research purposes is exempt.

The Regulation also requires HSPs to monitor their compliance by submitting annual reports to the responsible authorities outlining actions they have taken in this area.

What is Terrorist Content?

‘Terrorist content’ is defined broadly by the Regulation and mirrors the definition in Directive (EU) 2017/541 on combating terrorism:

"material that incites or solicits someone to commit, or to contribute to the commission of, terrorist offences, solicits someone to participate in activities of a terrorist group, or glorifies terrorist activities including by disseminating material depicting a terrorist attack"

Removal and Reasoning

A ‘competent authority’ within a Member State has the power under the Regulation to issue a decision requiring a HSP to remove terrorist content or disable access to it (a ‘removal order’). Where a HSP removes content, it must make information on the removal available to the content creator and, on the content creator’s request, inform them of the reasons for the removal and of their right to challenge the removal order, save in certain circumstances related to public security, including the investigation and prosecution of terrorist offences.

The Member State of the HSP’s ‘main establishment’ has jurisdiction under the Regulation. HSPs have the right to challenge a removal order both before the courts of the Member State whose competent authority issued the removal order and before the courts of the Member State having jurisdiction over the HSP. Content creators also have the right to challenge removal orders.

Complaint Process

The Regulation obliges HSPs to establish procedures allowing content creators to submit a complaint concerning the removal of their content and to request its reinstatement. The HSP must expeditiously examine all complaints received through that mechanism and reinstate the content, or access to it, “without undue delay” where its removal was “unjustified”. Where a complaint is rejected, the HSP must provide the complainant with the reasons for its decision.

Role of the Competent Authority

The Regulation does not introduce a general obligation on HSPs to monitor or filter content. However, where a competent national authority decides that a HSP is ‘exposed to terrorist content’, the HSP must take specific measures to prevent its dissemination. It is for the HSP to decide which specific measures to take, and there is no obligation to use automated tools. The Regulation also requires HSPs to publish annual transparency reports on the action they have taken to stop the dissemination of terrorist content.

A HSP may be deemed by a competent authority to be ‘exposed to terrorist content’ taking into account a number of objective factors, such as where the HSP has been the subject of two or more final removal orders in the previous 12 months. In these circumstances, the HSP must include in its terms and conditions, and apply, provisions to address the misuse of its services for the dissemination of terrorist content to the public. Examples of such measures given in the Regulation include appropriate technical and operational measures or capacities, such as appropriate staffing or technical means to identify and expeditiously remove or disable access to terrorist content, and easily accessible and user-friendly mechanisms for users to report or flag alleged terrorist content to the HSP.

The Regulation also obliges HSPs to set out clearly in their terms and conditions their policy for addressing the dissemination of terrorist content, including, where appropriate, a meaningful explanation of the functioning of specific measures, and the use of automated tools if applicable.

Penalties

Member States must notify the EU Commission by 7 June 2022 of the penalties they have laid down to deal with infringements of the Regulation. The Regulation does not stipulate what these penalties must be, but it does require Member States to ensure that a systematic or persistent failure to comply with removal orders is subject to financial penalties of up to 4% of the HSP’s global turnover in the preceding business year.

Next Steps

While this targeted regulation of terrorist content online is unsurprising, it introduces an onerous obligation on HSPs to remove flagged content within one hour, which may be operationally challenging. The Regulation also envisages HSPs providing the reasons for removal to content creators in certain circumstances, which, in some borderline cases, may require a level of legal analysis as to how the content falls within the definition of ‘terrorist content’. While the Regulation will not apply until 7 June 2022, HSPs should consider now how terrorist content is currently flagged and dealt with on their platforms and what changes will be required to comply with the Regulation.

Also contributed to by Aoife Gunning and Aaron McCarthy.

This document has been prepared by McCann FitzGerald LLP for general guidance only and should not be regarded as a substitute for professional advice. Such advice should always be taken before acting on any of the matters discussed.