Digital Services Act: Obligations of Very Large Online Service Providers, EU Transparency Database and Enforcement

Ireland is home to 13 of the 17 “Very Large Online Platforms” (“VLOPs”) and 2 “Very Large Online Search Engines” (“VLOSEs”)1, as designated by the European Commission2 (the “Commission”). The Digital Services Act (the “DSA”)3 will be fully effective from 17 February 2024, but these organisations have been required to comply with an enhanced set of obligations since 25 August 2023.4

In this briefing, we consider some of the key additional obligations for VLOPs and VLOSEs, the new EU Transparency Database, enforcement and interaction of the DSA with existing online safety legislation.

Implementation of the DSA

The DSA entered into force across the European Union on 16 November 2022, with a staggered timeline towards full implementation by 17 February 2024. The DSA creates a comprehensive framework to regulate the ecosystem of online content and to impose new transparency and accountability requirements on all online intermediary service providers (“OIPs”) that offer services within the EU, irrespective of their country of establishment. The DSA is an EU Regulation and is therefore directly applicable in Ireland (although certain aspects have been given effect through domestic legislation).

For a detailed overview of the DSA’s progression and full set of obligations, please see our previous briefings here, here and here.

Additional Obligations for VLOPs / VLOSEs

To address ‘systemic risks’ arising from online content, the DSA imposes additional obligations on two types of OIPs, VLOPs and VLOSEs, relating to four main areas:

1. User Privacy

  • Redesign of systems to ensure a high level of privacy and security
  • Provision of clear information on recommendations to users and the right to opt out of recommendation systems based on profiling
  • Requirement that recommendation systems cannot be based on users’ sensitive data

2. Protection of Minors

  • Prohibition on targeted advertising towards children
  • Special risk assessments to be conducted, including on negative mental health effects
  • Redesign of services, including user interfaces, recommendation systems, and terms and conditions, to mitigate systemic risks, including mental health effects, and to ensure the protection of minors

3. Content Moderation

  • Establishment of a user-friendly mechanism for users to flag illegal content, through which the platform can process notifications expeditiously
  • Provision of an easily understandable, plain-language summary of terms and conditions, in the languages of the Member States in which they operate
  • Terms and conditions must be enforced diligently and non-arbitrarily

4. Transparency and Accountability

  • Publication of transparency reports on content moderation decisions and risk management
  • Conduct an annual risk assessment of systemic risks
  • External and independent auditing of risk assessments
  • Labelling of all advertisements, informing users who is promoting them and explaining algorithms used for recommendations
  • Provide access to publicly available data for ‘specially vetted’ researchers

What are the Risk Assessment and Audit Obligations?

VLOPs and VLOSEs were required to carry out and submit their first annual risk assessments to the European Commission by 25 August 2023 (Article 34, DSA). These risk assessments demonstrate how the companies have dealt with various ‘systemic risks’ relating to:

  • the dissemination of illegal content online;
  • intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service;
  • negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the rights of the child, and ensuring a high level of consumer protection;
  • negative effects on civic discourse, electoral processes, and public security; and
  • negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.

When carrying out risk assessments, companies must factor in how their terms and conditions, their systems for selecting advertisements and the design of their recommender or algorithmic systems can influence systemic risks. They must also consider whether there is a risk of intentional manipulation of their services, including through the rapid spread of disinformation, a topic which has come into focus since the Russia–Ukraine conflict. VLOPs and VLOSEs must then use the assessments to inform and adopt “risk mitigating measures”, including implementing measures to discourage and limit the dissemination of illegal content.

Crucially, these assessments must be independently audited. The designated platforms are obliged, at their own expense, to complete such audits within 1 year (so, at the latest, by 25 August 2024). VLOPs and VLOSEs then have 3 months from receipt of the audit report to prepare a public report on the outcome of the risk assessment, mitigation measures, audit and audit implementation, which must be transmitted to the European Commission and the relevant Digital Services Coordinator.

The Commission published a draft delegated regulation specifying the procedures, methodology and template to be used for such independent audits and audit implementation reports (which is currently undergoing final scrutiny). This delegated regulation also sets out rules in relation to the scope of the audit and the level of assurance to be provided by the auditor.5

In 2024, the European Board for Digital Services (the ‘EBDS’) will publish its first annual report identifying the ‘most prominent and recurrent systemic risks’ reported by all VLOPs / VLOSEs, which will include guidelines on best practice to mitigate systemic risk. (The EBDS is an independent EU advisory group, which supports the Commission and helps coordinate the actions of national Digital Services Coordinators.)

EU Transparency Database

The Commission has created a new transparency tool, the EU Transparency Database, whose purpose is to collect all Statements of Reasons (excluding personal data) issued by online platforms.6 It is updated in an automated manner, allowing close to real-time updates. A Statement of Reasons must be provided by hosting services, including online platforms (a broader category than VLOPs / VLOSEs), to affected users to explain the basis of any content moderation decision7, for example, a block or restriction on online content.

The EU Transparency Database allows Statements of Reasons to be analysed using advanced search functions such as decision type, keyword, subject category, territorial scope, date and content type. It is not surprising that the Commission regards the EU Transparency Database as an important means of scrutinising how online platforms, including social media companies, apply their content moderation policies as well as a means of monitoring the spread of illegal and harmful content. 

Enforcement and Other Relevant Legislation

The European Commission has direct supervisory powers over the activities of VLOPs and VLOSEs and may investigate suspected breaches of DSA obligations.  

Oversight of other OIPs’ compliance with the DSA is carried out by the Digital Services Coordinator (“DSC”), which each Member State must appoint by 17 February 2024. Ireland established Coimisiún na Meán (the Media Commission) under the Online Safety and Media Regulation Act 2022 (the ‘OSMR’) with effect from 15 March 2023, and the Government has decided to appoint Coimisiún na Meán to act as Ireland’s DSC.8

The OSMR amends the Broadcasting Act 2009 to update the regulatory regime for broadcasting services and, inter alia, establish a new statutory online safety regime. Similar to the DSA’s objective of tackling ‘illegal content’, this regime aims to protect online users, and in particular minors, from harmful content. It prohibits offence-specific categories of online content, as well as other types of harmful content which meet a risk test, such as bullying and content which encourages or promotes eating disorders or self-harm (see our previous briefing on the OSMR here). Under this regime, ‘designated online services’ will be required to comply with online safety codes relevant to their type of service, once published.9

Neither the DSA nor the OSMR affects the existing liability exemptions for mere conduit, caching or hosting activities10 as regards online content, and neither imposes a general monitoring obligation on OIPs. Both frameworks, however, require OIPs to consider in depth the nature of content available on their services and to fulfil their obligations as regards mitigating the risks of illegal and harmful content. On 6 October 2023, Coimisiún na Meán published its e-Commerce Compliance Strategy (as required by Section 139ZF of the Broadcasting Act 2009), setting out how it intends to ensure that its online safety codes, online safety guidance materials and advisory notices are consistent with, and do not infringe, the liability exemptions within Articles 4, 5, 6 and 8 of the DSA.

If an OIP does not act in compliance with the DSA, it could face fines of up to 6% of its annual global turnover. This compares to fines of up to 4% of annual global turnover under the GDPR, and up to 10% of annual global turnover under the OSMR.

What’s next?

The impact of the DSA is evident in terms of the sheer volume of new regulatory obligations for relevant companies. The recent submission by VLOPs / VLOSEs of their first risk assessments to the Commission was a milestone but it will likely be Q4 2024 before VLOPs / VLOSEs publish their own reports on the outcome of these risk assessments and independent audits. In the meantime, the publicly available data within the EU Transparency Database provides significant insight into the everyday content moderation decisions made by online platforms.

The deadline for full compliance with the DSA approaches in February 2024. All OIPs should therefore pay close attention to their obligations and take steps to ensure all internal systems, procedures and terms and conditions are DSA-ready. If you have any questions in relation to the DSA, please contact a member of the McCann FitzGerald team.

Also contributed to by Stephen Lahert, David O’Keeffe Ioiart and Morris Hung


  1. In the first designation decision of 25 April 2023, VLOPs included Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube, and Zalando; and “Very Large Online Search Engines” included Bing and Google.
  2. On the basis that they reach more than 45 million monthly active users.
  3. Regulation (EU) 2022/2065.
  4. Legal challenges by two online platforms, Amazon and Zalando, in respect of their designations as VLOPs by the European Commission are still pending (see here). On 28 September 2023, Amazon obtained an interim ruling enabling it to postpone publication of its advertising repository information, as required by Article 39 DSA
  5. Commission Delegated Regulation (EU) …/... supplementing Regulation (EU) 2022/2065 of the European Parliament and of the Council, by laying down rules on the performance of audits for very large online platforms and very large online search engines (see here)
  6. In accordance with the requirement of Article 24 (5) of the DSA. Online platforms are a subset of OIPs.
  7. As required by Article 17, DSA
  8. As provided for at Head 6 of The General Scheme of the Digital Services Bill 2023.
  9. Coimisiún na Meán published a notice, effective on 11 September 2023, that video-sharing platform services (VSPS) within the State are a category of relevant online services, and it is consulting with stakeholders on the form of safety code for VSPS.
  10. As set out in Articles 4, 5, 6 and 8 of the DSA (replacing Articles 12-15 of the E-Commerce Directive 2000/31 (EC)).

This document has been prepared by McCann FitzGerald LLP for general guidance only and should not be regarded as a substitute for professional advice. Such advice should always be taken before acting on any of the matters discussed.