One of the main objectives of adopting the EU regulation known as the Digital Services Act (DSA) was to define the rules on content moderation by digital service providers. On the one hand, the DSA sets out the procedure for, and the consequences of, moderating or not moderating content; on the other hand, it lays down the rights of users whose content is moderated. Compliance with these requirements has direct implications, firstly, for the possible civil or criminal liability of intermediary service providers for disseminating illegal or harmful content provided by others, and secondly, for administrative liability under the Digital Services Act (e.g. a financial penalty imposed by the Digital Services Coordinator under Article 52(3) of the DSA).

Below are the ten most important points on content moderation under the Digital Services Act.

  1. The concept of moderation based on the DSA.

Examples of activities constituting content moderation are given in Article 3(t) of the DSA, and these include:

(a) detecting and identifying illegal content or information incompatible with the terms and conditions of providers of intermediary services, provided by recipients of the service (users),
(b) combating such content, including measures taken that affect the availability, visibility, and accessibility of such illegal content or information, such as “demotion, demonetization, disabling of access to, or removal thereof, or that affect the ability of the recipients of the service to provide that information, such as the termination or suspension of a recipient’s account.”

The Digital Services Act has thus significantly widened the range of actions that can be taken with regard to user content, compared to the rules laid down in Directive 2000/31/EC on electronic commerce. In addition to the existing ability to remove or block access to content, new options have emerged, such as demotion of the content.

  2. Types of user content under the Digital Services Act.

In terms of the obligations of intermediary service providers to take action with regard to content posted by users, the DSA mentions three types of content:

(a) illegal content (Article 3(h)),

(b) content that is incompatible with the terms and conditions of services (Article 3(t)),

(c) harmful content (recital 82).

  3. The concept of illegal content.

The term illegal content is defined in Article 3(h) of the DSA as "any information which, on its own or by reference to an activity, including the sale of products or the provision of services, does not comply with Union law or the law of a Member State concerned, whatever the specific subject matter or nature of that law".

In the light of the above definition, illegal content is both information that it is illegal to disseminate (such as content that incites terrorism) and information that relates to illegal activities (such as images of sexual exploitation of children).

The Digital Services Act does not itself specify which types of content are illegal. This is determined by separate European Union legislation or by the legislation of individual Member States (national law). In particular, European Union law outlaws four types of content that may also be disseminated on the Internet, namely content a) inciting terrorism, b) depicting the sexual exploitation of children, c) inciting racism and xenophobia, and d) infringing intellectual property rights. In turn, content illegal under national law includes content that is illegal under both public law (e.g. criminal law) and private law (e.g. civil law).

  4. Moderation of illegal content.

Where content is deemed illegal under EU or national law, the intermediary service provider is required to take moderation measures, which means removing that content; a failure to do so could result in liability for disseminating that content through its services, independently of the liability of the users who posted it.

  5. The concept of content incompatible with the terms and conditions of service.

Content that breaches the terms and conditions is content that is not formally illegal, but which may not be disseminated under the intermediary service provider’s rules, for example decency rules (such as content containing nudity).

  6. Moderation of content incompatible with the terms and conditions of service.

The legal basis for the moderation of content incompatible with the Terms and Conditions of Service (T&C) is the contractual relationship between the provider and the recipient of the services (Article 3(u)).

According to the solution adopted under the DSA, intermediary service providers are, in principle, free to determine which content they consider harmful and will consequently moderate. This creates a risk that they will take decisions arbitrarily, thereby breaching the freedom of expression of Internet users. In this respect, Article 14 of the DSA provides two types of instruments to safeguard the interests of users. First, users must be informed, in the terms and conditions of the service, about the procedures, means, and tools used for content moderation. This information should not only be clear and unambiguous but should also be publicly available and easily accessible (Article 14(1)). Second, online intermediaries are required to exercise due diligence, objectivity, and proportionality when moderating content, taking into account the fundamental rights and interests of the parties concerned (Article 14(4)).

  7. The concept of harmful content.

The Digital Services Act does not define the term harmful content, and this was a deliberate decision by the European legislator, as highlighted in the explanatory memorandum to the DSA: "There is broad agreement among stakeholders that ‘harmful’ content (although not, or at least not necessarily illegal) should not be defined in the Digital Services Act and should not be subject to removal obligations, as this is a sensitive area with serious implications for the protection of freedom of expression." (Explanatory Memorandum to the draft Digital Services Act, COM(2020) 825 final, 2020/0361 (COD), p. 11).

Harmful content is content that is not formally illegal but may be considered unethical and socially undesirable for other reasons (e.g. disinformation, or content promoting violence, racism, or xenophobia). In most cases, harmful content will also be content "incompatible with the terms and conditions"; however, since it is the intermediary service providers who determine those terms and conditions, it is legally possible that socially harmful content will not be prohibited under a given provider's terms and conditions. Providers of intermediary services have no legal obligation to prohibit the distribution of harmful content in their terms and conditions (see also point 8 below).

  8. Moderation of harmful content.

The Digital Services Act does not provide for a general obligation for intermediary service providers to moderate harmful content.

Exceptionally, obligations to moderate harmful content have been imposed on so-called very large online platforms (VLOPs) and very large online search engines (VLOSEs) (Article 33). Under Article 34 of the DSA, they are required to manage systemic risks, including by carrying out risk assessments at least once a year and, where necessary, taking appropriate measures to mitigate those risks (Article 35). To this end, they must also undergo independent audits (Article 37). It follows from the above rules that very large online platforms and very large online search engines must put in place a risk management system that addresses the risk of the dissemination of harmful content via their services, such as disinformation, election meddling, or cyberbullying. When carrying out such a risk assessment, very large online platforms should consider how their content moderation systems, recommender systems, and advertisement selection and display systems affect the dissemination of illegal and harmful content (Article 34(2)).

  9. Content moderation – user rights.

The Digital Services Act has significantly strengthened the position of users (recipients of services) whose content has been moderated.

Hosting service providers, including providers of online platforms, are required to provide a clear and specific statement of reasons for any decision to restrict the availability of content considered illegal or incompatible with their terms and conditions (Article 17(1)). In addition, providers of online platforms are required to give users access to an effective internal complaint-handling system that allows them to lodge complaints, electronically and free of charge, against decisions taken by the platform provider (Article 20(1)), and to inform recipients of the option of out-of-court dispute settlement (Article 21(1)).

  10. Content moderation based on the DSA – summary.

The main distinction made in the DSA is between illegal content and content incompatible with the terms and conditions of service. Most of the legal mechanisms established in the DSA relate to the moderation of illegal content or content contrary to the intermediary service provider's terms and conditions (e.g. Articles 14 and 17 et seq.).

For very large online platforms and very large online search engines there is an exception, which is the obligation to moderate harmful content, regardless of whether it is illegal or contrary to the terms and conditions (Articles 34-35).

The scope of activities covered by the concept of moderation of illegal content, content incompatible with the terms and conditions of service, and harmful content has expanded significantly compared to the previously applicable rules.


Author: Xawery Konarski
