The EU’s Digital Services Act: initial thoughts
This week, the European Commission published its proposal for a new Regulation, the Digital Services Act (DSA), which aims to update the EU’s legal framework for intermediary liability and introduce new obligations on online platforms around their content moderation policies and practices. The DSA would expand upon the EU’s existing legislation, the e-Commerce Directive, which was adopted over 20 years ago to regulate online services. The legislative package has been in development for over a year, and follows a public consultation which closed in September 2020 (read our input to it here).
With the full proposal now out, how does it measure up from a human rights perspective? GPD’s initial assessment is largely positive. We commend the EU for working to modernise the existing legal framework in a way which broadly ensures protection for the rights to freedom of expression and privacy, and welcome many elements of the proposed Regulation. However, there are still some aspects which could have adverse impacts upon these rights, and where further refinement is needed.
Retained provisions and new responsibilities
Intermediary liability regime
The DSA (through Chapter II) would maintain the core principles of the e-Commerce Directive’s liability regime: intermediary services (services that transmit communications, provide access to communication networks, or store and host information) would not be liable for illegal content posted by users, as long as they do not have actual knowledge of illegal activity and act swiftly to remove the content upon obtaining such knowledge or awareness. The DSA also maintains the prohibition on general monitoring obligations. Some new provisions include:
- The introduction of a “Good Samaritan” clause, establishing liability protections for the providers of intermediary services that carry out voluntary initiatives or investigations to detect, identify, or remove illegal content, or take necessary measures to comply with the DSA.
- Intermediary service providers would be compelled to remove illegal content, and to provide certain forms of information under their control, upon receiving a sufficiently detailed order from a national judicial or administrative authority.
We are pleased that the DSA would retain the core principles of the e-Commerce Directive’s liability regime and welcome the introduction of the Good Samaritan clause. We also welcome the greater clarity and transparency that would be provided around requests from governments for the removal of illegal content.
New obligations
The DSA would introduce new obligations for intermediary services to ensure a “transparent and safe online environment” (Chapter III). All service providers would have to establish points of contact and legal representatives based in an EU member state; the latter could be held liable for non-compliance with obligations under the DSA.
We welcome the new obligations on all service providers to provide clear terms of service detailing “information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review”, and to “act in a diligent, objective and proportionate manner in applying and enforcing the restrictions”, paying due regard to human rights when doing so. This obligation to provide clarity in content moderation policies and to respect freedom of expression when enforcing them would be the first of its kind in the world.
Under the proposals, online platforms would be required to establish notice and action mechanisms, cooperate with trusted flaggers, take measures against abusive notices sent by users, and deal with complaints. Particularly welcome is the requirement for platforms, when content is removed, to inform the user of the decision and clearly explain the reasoning behind it, as well as their options for seeking redress—whether via internal complaint-handling mechanisms, out-of-court dispute settlement, or judicial redress.
Very large platforms (those reaching 45 million users or more, equivalent to 10% of the EU population) would be subject to additional rules. These include a requirement to conduct annual reviews of “significant systemic risks stemming from the functioning and use made of their service”, which include “the dissemination of illegal content”, “negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child”, and “intentional manipulation of their service” which causes (or could cause) “negative effects upon public health, children, civil discourse, electoral processes and public security”. They would have to put in place “reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks” and undergo independent auditing.
The same platforms would also have to provide greater transparency on the recommender systems (machine learning systems that aim to predict users’ interests and recommend content) that they use, and allow users to choose between different recommender systems. They would also have additional transparency obligations for online advertising, and be required to establish a public repository of adverts. The DSA includes data access and scrutiny provisions to provide oversight of how ads are displayed and targeted.
Overall, we welcome many of these measures, particularly the focus on illegal content, as opposed to other forms of “harmful content”, and the greater transparency that would be required with respect to platforms’ content moderation policies and practices. We also welcome the fact that the obligations would only apply to publicly available content, with private (including encrypted) channels out of scope. We remain concerned, however, that platforms would still be required to make decisions about the legality of content. While users would have a right to challenge removals, and platforms would have to explain the reasons for them, the proposal still requires private entities to take on this quasi-judicial function.
Enforcement
The DSA would create a new enforcement mechanism that relies on national and EU-level cooperation. Each EU member state would have to appoint a Digital Services Coordinator, an independent authority that would supervise compliance with the Regulation. These Coordinators would have powers to investigate and impose fines for non-compliance, and could also request courts in EU member states to order the temporary restriction of access to a particular service where there is ongoing infringement that entails a serious criminal offence. Any measure ordered must be proportionate to the nature, gravity, recurrence and duration of the infringement, and must not unduly restrict access to lawful information by recipients of the service concerned. The proposed Regulation further specifies that member states would have to ensure that the maximum penalties imposed for failure to comply with these obligations do not exceed 6% of the annual income or turnover of the entity concerned.
Enforcement would also take place at the EU level, with the establishment of an independent advisory group called the European Board for Digital Services. The Board would be composed of the Digital Services Coordinators, and would be charged with advising on the consistent application of the proposed Regulation. The European Commission would also play a role in supervising and enforcing compliance for very large online platforms.
We are pleased that the proposed sanctions regime appears to reflect the requirements of proportionality and necessity. We also welcome that the levying of fines and other sanctions would be complemented by a number of procedural safeguards and oversight by independent judicial authorities.
Transparency and accountability
In addition to transparency over their content moderation policies (noted above), we welcome the DSA’s requirement for all but the smallest service providers to publish annual “easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period”. We are pleased that the proposed Regulation seeks to mitigate the risk of erroneous or unjustified restrictions on individuals’ right to freedom of expression through mandatory safeguards. Establishing effective internal complaint-handling systems, requiring user redress mechanisms, and mandating that information be made available to affected users would help individuals express themselves freely and provide them with access to an effective remedy, as required under international human rights law.
Next steps
The European Parliament and EU member states will now discuss the Commission’s proposal under the ordinary legislative procedure. If adopted, the final text will be directly applicable across all EU member states. We’ll be following the proposal’s progress closely: sign up to our monthly Digest for regular updates and analysis.