knowledge | 26 January 2022
The Digital Services Act Makes Progress Through the Legislative Process
The EU Parliament (the “Parliament”) has adopted proposed content moderation, fundamental rights and targeted advertising amendments to the Digital Services Act.
On 14 December 2021, the Internal Market and Consumer Protection Committee of the EU Parliament (the “Committee”) issued its report, adopting proposals for amendments to the Digital Services Act1, originally proposed by the European Commission on 15 December 2020. The EU Parliament voted to adopt this proposal on 20 January 2022, with a number of late changes introduced in the plenary vote. The draft text will also update parts of the E-Commerce Directive 2000/31/EC.
As outlined in our initial briefing, it is anticipated that the proposed regulation will modernise the current regime for the digital services market, not only regulating online content but also protecting users’ fundamental right to freedom of expression.
Protection of Fundamental Rights
A key feature of the Committee’s proposals is the reinforcement of fundamental rights enjoyed by digital service users.
The proposals preserve the 'safe harbour' principle, under which online intermediaries who host or transmit content provided by a third party are exempt from liability unless they are aware of the illegality and fail to act adequately to stop it. The notice-and-action obligations currently imposed on providers of hosting services are to be strengthened to encompass the removal of illegal content,2 bringing hate speech, images of child abuse and copyright violations within their scope. Upon receipt of such a notice, hosts would be required to act without undue delay having regard to the type of illegal content and the urgency of taking action, with liability for hosts with “actual knowledge” of such content that do not act expeditiously to remove or disable access to it.
Under the revised text of Article 7, providers of intermediary services will not be obliged to use automated tools for content moderation or for monitoring the behaviour of natural persons.3 The proposals aim to ensure that Member States cannot prevent intermediary services from offering end-to-end encrypted services,4 reinforcing the right to privacy of online users, particularly in relation to private online messaging services such as WhatsApp.
Under the current proposals, to increase transparency in online advertising, users will be entitled to opt out of content being recommended based on their internet history and behaviour. The text provides for more transparent and informed choice for all recipients of services, including information on how their data will be monetised.5 The Committee’s proposals provide greater protection to minors from direct marketing, profiling and behaviourally targeted advertising for commercial purposes. The revised text also reiterates that the Digital Services Act will be without prejudice to requirements regarding the processing of personal data, including for targeted advertising, under the GDPR. One of the changes introduced in the plenary vote was imposing further limitations on the use of sensitive personal data, such as sexual orientation, political and religious beliefs, in targeted advertising.
Under the proposals, very large online platforms (“VLOPs”) will be obliged to provide at least one recommender system which is not based on profiling, allowing users to see the criteria used by the platform to adapt content for them. The amended provisions, in particular Article 26, seek to create transparency, making online platforms more accountable for their design and processes. The current draft proposes that VLOPs will have to carry out mandatory risk assessments and take risk mitigation measures aimed at tackling harmful content and disinformation, and will be required to share data with authorities and researchers, giving others a chance to scrutinise the manner in which they operate and better understand any risks posed by online platforms.
Other substantive proposals include:
- As previously raised by Member States, the introduction of a clause prohibiting online platforms from using techniques, sometimes referred to as "dark patterns",6 that may distort or impair the decision-making of recipients of the service via the structure, design or functionalities of an online interface. However, this does not prevent providers from interacting directly with users or from offering new or additional services to them;
- Platforms on which pornographic content is uploaded will be obliged to ensure that users can only do so once properly verified by the operator, using a registered phone number and email address. Human review will have to be used to detect abusive content, and immediate removal of content will be available to individuals depicted without their consent;7
- Any authority orders (e.g. takedown orders) made against illegal content shall be limited to the territory of the Member State that orders the removal;8
- Further clarification is provided on the role of Digital Services Coordinators in Member States and the need for them to coordinate their work with the European Commission in enforcing obligations; and
- Micro, small and medium-sized enterprises (“SMEs”) will enjoy certain exemptions from obligations imposed by the legislation via a waiver system. The European Commission would be required to pay specific attention to SMEs and the position of new competitors when carrying out periodic revisions of the regulation.9
During the plenary vote, amendments to the original proposal were introduced and adopted by the Parliament. In addition to the amendments on targeted advertising mentioned above, various amendments regarding online users’ anonymity were adopted. Such users must, “wherever reasonable efforts can make this possible”, be allowed to use and pay for services anonymously. Another late change allows SMEs that have made a “reasonable effort” to obtain legal representation of their own to join a collective representation with other companies. Fundamental rights were also at the core of some changes to the requirements on terms and conditions: pursuant to the draft text, platforms must consider the rights enshrined in the Charter of Fundamental Rights, and any terms and conditions which violate users’ rights should not be binding on them.
Following its approval, the draft text will form the basis for the Parliament’s negotiations with EU governments. Five separate political trilogues between the Parliament, the European Commission and the EU Council are scheduled, with the first set for 31 January 2022.
Also contributed by Eoin Galligan
- Report on the Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC; COM(2020)0825 – C9-0418/2020 – 2020/0361(COD) (the “Report”).
- Article 5 of the Report.
- Article 7(1)(a) of the Report.
- Article 7(1)(b) and 7(1)(c) of the Report.
- Article 24(1)(a) of the Report.
- Article 13(a) of the Report.
- Article 24(b) of the Report.
- Article 8(2)(b) of the Report.
- Articles 16 and 43a and Recital 35 of the Report.
This document has been prepared by McCann FitzGerald LLP for general guidance only and should not be regarded as a substitute for professional advice. Such advice should always be taken before acting on any of the matters discussed.