Using Terms and Conditions to apply Fundamental Rights to Content Moderation
Is Article 12 DSA a Paper Tiger?
As the European Court of Human Rights (ECtHR) has emphasised, online platforms, such as Facebook, Twitter and YouTube, provide an “unprecedented” means for exercising freedom of expression online. International human rights bodies have recognised the “enormous power” platforms wield over participation in the online “democratic space”. However, it is increasingly clear that the systems operated by platforms, where (automated) content moderation decisions are taken based on a platform’s terms of service, are “fundamentally broken”. Content moderation systems have been said to “undermine freedom of expression”, especially where important public interest speech ends up being suppressed, such as speech by minority and marginalised groups, black activist groups, environmental activist groups, and other activists. Indeed, the UN Special Rapporteur on freedom of expression has criticised these content moderation systems for their overly vague rules of operation, inconsistent enforcement, and overdependence on automation, which can lead to over-blocking and pre-publication censorship. This criticism is combined with, and amplified by, the notion that Big Tech exercises too much power over our online public sphere. Therefore, in order to better protect free expression online, the UN Special Rapporteur and free speech organisations have argued that platforms “should incorporate directly” principles of fundamental rights law into their terms and conditions (T&Cs).
In EU law, platforms presently have no obligation to incorporate fundamental rights into their T&Cs. An important provision in the EU’s proposed Digital Services Act (DSA) may change this. Art. 12 DSA lays down new rules on how platforms can enforce their T&Cs, including that platforms must have “due regard” to the “fundamental rights” of users under the EU Charter of Fundamental Rights (Charter). The EU Council and Parliament are considering the proposal in parallel, and several far-reaching amendments have been advanced in Parliament. Civil society is tracking these developments closely, and there has been severe criticism of the meagre protection of fundamental rights in the DSA. In this post, we examine Art. 12 DSA, including some of the proposed amendments. We ask whether this provision requires online platforms to apply EU fundamental rights law and to what extent it may curb the power of Big Tech over online speech. We conclude that, as it stands and until courts intervene, the provision is too vague and ambiguous to effectively support the application of fundamental rights. But there is room for improvement during the legislative process, and to prevent Art. 12 DSA from becoming a paper tiger.
The systematic context and scope of Article 12 DSA
The DSA proposal is divided into five chapters. Chapter II sets out the regime for the liability of intermediary service providers, updating and adding to the rules set out in Arts. 12 to 15 e-Commerce Directive (see here).
Chapter III deals with due diligence obligations that are independent of the liability regime assessment of the previous chapter. These new rules, a novelty in relation to the e-Commerce Directive, distinguish between specific categories of providers. They set out asymmetric obligations that apply in a tiered way to all providers of intermediary services (Arts. 10 to 13 DSA), hosting providers (Arts. 14-15 DSA), online platforms (Arts. 16-24 DSA) and very large online platforms or “VLOPs” (Arts. 25-33 DSA). Providers of intermediary services are subject to the fewest obligations and VLOPs – covering Big Tech platforms – are subject to the most obligations. All providers are subject to Art. 12 DSA.
Art. 12 DSA is titled “Terms and conditions”, a term that is defined in Art. 2(q) DSA as “all terms and conditions or specifications, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.” The provision aims to increase the transparency of these T&Cs and bring their enforcement in direct relation to fundamental rights.
Crucially, unlike Chapter II, Art. 12 DSA applies not only to illegal content but also to harmful content, as defined in the T&Cs of an intermediary. As such, since it applies to all providers, Art. 12 DSA extends the obligations of Chapter III beyond illegal content. Interestingly, the European Parliament’s Committee on Legal Affairs (JURI) has proposed to limit the application of fundamental rights in Art. 12 DSA to harmful content only (see amendments 39 and 40). Either way, the result is that the DSA will expand the scope of content moderation decisions subject to regulation as compared to the e-Commerce Directive. Still, as we show, it remains unclear how these T&Cs relate to fundamental rights.
Art. 12 DSA’s aims of transparency and enforcement are dealt with in two distinct paragraphs. Whereas paragraph (1) includes information obligations, paragraph (2) deals with application and enforcement and, arguably, brings providers’ T&Cs within the scope of EU fundamental rights.
Article 12(1) DSA: Information Obligation
Art. 12(1) DSA sets out an information obligation for providers of intermediary services regarding certain content moderation practices outlined in their T&Cs. It aims to ensure that the T&Cs are transparent and clear as to how, when and on what basis user-generated content can be restricted. The obligation appears to target acts of content moderation by providers that impose “any restriction” on users. But it is unclear whether content moderation actions that do not stricto sensu restrict what content users can post, such as ranking, recommending or demonetising content, fall within the scope of Art. 12 DSA.
The second sentence of paragraph (1) explicitly refers to “content moderation”, a concept defined in Art. 2(p) DSA as covering activities undertaken by providers to detect, identify and address user-generated content that is either (i) “illegal content” (Art. 2(g) DSA) or (ii) incompatible with their T&Cs. Interestingly, the JURI Committee proposes to limit the scope of Art. 12(1) DSA to illegal content (amendment 38), whereas the European Parliament’s Committee on Internal Market and Consumer Protection (IMCO) aims to expand this provision by mandating providers to also inform users of any “significant change” made to the T&Cs (amendment 84).
Further, the provision explicitly mentions “algorithmic decision-making”, raising the important question of what providing information on “any policies, procedures, measures and tools” might look like (on this see e.g. here, here, and here). However, the exact scope of the paragraph remains unclear: the phrase “any restrictions” in the first sentence appears wider than the definition of content moderation in Art. 2(p) DSA, thereby broadening the provision’s scope.
In its last sentence, Art. 12(1) DSA sets out how this information should be conveyed. Echoing Arts. 7(2), 12(1) and 14(2) GDPR, the T&Cs should be “clear”. However, where the GDPR refers to “clear and plain” language, Art. 12(1) DSA goes one step further by requiring “unambiguous” information, which appears to set a higher threshold.
Finally, Art. 29(1) DSA sets out a somewhat similar (although less detailed) information obligation for VLOPs regarding recommender systems (for a discussion see here).
Article 12(2) DSA: Applying fundamental rights in content moderation?
From a fundamental rights perspective, the exciting part of Art. 12 DSA is paragraph (2), which regulates the application and enforcement of T&Cs:
“Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.”
The scope is the same as paragraph (1): it only applies to the enforcement of T&Cs that restrict user-generated content. The core obligation requires providers to weigh the “rights and legitimate interests of all parties involved” in a “diligent, objective and proportionate” way when applying their T&Cs. Several legislative amendments expand on this obligation with further requirements for application, such as that it must be timely, non-discriminatory, fair, transparent, coherent, predictable and non-arbitrary (see e.g. IMCO 85 and LIBE 59).
As with paragraph (1), the extent of this obligation is unclear. In particular, the provision obligates intermediaries to have due regard to the “applicable” fundamental rights without clarifying what fundamental rights are already applicable in the horizontal relationship between intermediary and user. This matters, since the extent to which users can directly or even indirectly appeal to their fundamental rights vis-à-vis an intermediary in its content moderation decisions is a controversial issue.
In our view, Art. 12(2) DSA can be read in two ways. First, it can be understood as referring only to fundamental rights that are already applicable in the horizontal relation between intermediaries and users. If so, the provision leaves undetermined the extent to which these rights are applicable and only obligates intermediaries to have “due regard” where any such rights apply. A second and broader interpretation is that Art. 12(2) DSA aims to declare fundamental rights directly applicable in the horizontal relation between intermediaries and users. This would certainly include the right to freedom of expression in Art. 11 Charter (e.g., for users posting content) and the right to non-discrimination in Art. 21 Charter (e.g., for users targeted by content) as well as, potentially, via Art. 52(3) Charter, the extensive case law of the ECtHR.
An obligation in line with the second interpretation would be remarkable, as it would target private actors and presumably apply with equal intensity to all intermediaries. Regrettably, the DSA offers little to no guidance on how to actualise this obligation in practice.
For example, even if what is meant by “restrictions” were properly defined, the scope of “diligent, objective and proportionate” behaviour is fuzzy. Still, promoting “diligent behaviour by providers of intermediary services” seems to be a core aim of the DSA (Recital 3). The requirement of diligence pops up in various other places in the DSA – in Arts. 14, 17, 19 and 20 DSA – primarily in the context of complaint handling by hosting providers. Similarly, the cloudy obligation of enforcing the T&Cs with “due regard” for fundamental rights gives no concrete insight into the extent to which these rights should be considered in individual (including algorithmic) decision-making processes by service providers.
The upshot is that users might not be able to rely on Art. 12 DSA before a court as a means to effectively protect their fundamental rights against a provider. Concretely: can an individual user appeal directly to fundamental rights based on Art. 12(2) DSA in a complaint procedure under Art. 17(3) DSA? The LIBE Committee partially circumvents this problem by proposing a new paragraph 12(2)a that provides that “legal information” can only be excluded or limited from the providers’ services when “objectively justified and on clearly defined grounds” (LIBE 60).
Finally, it is unclear how broadly the scope of “all parties involved” should be understood. It explicitly includes the users affected by the restriction being applied and enforced. For online platforms, it will also presumably include trusted flaggers and other notifiers covered by Arts. 19 and 20 DSA. Beyond that, it is difficult to identify other relevant parties at this stage.
Conclusion: avoiding paper tigers
On the surface, Art. 12 DSA looks like a substantial expansion of intermediaries’ responsibilities and a key provision to rein in platforms’ private power over online speech. It holds particular promise to constrain Big Tech’s algorithmic content moderation practices. But a deeper analysis leaves more questions than answers.
Art. 12(1) DSA imposes an information obligation regarding restrictions imposed on users of intermediary services, an obligation that extends to algorithmic decision-making. Art. 12(2) DSA introduces an apparently broad obligation for providers to act in a diligent, objective and proportionate manner when applying and enforcing such restrictions, explicitly linked to respect for fundamental rights. Furthermore, the provision expands the scope of the obligations beyond illegal content, applying also to content which intermediaries consider harmful or undesirable in their T&Cs. These horizontal obligations for all providers of intermediary services are welcome additions to EU law.
However, Art. 12(2) DSA, in particular, is too vague on what its crucial obligation entails and on the extent to which intermediaries are required to apply fundamental rights in content moderation. The amendments under discussion in the European Parliament are unlikely to offer the necessary clarity in this regard. As a result, if the legislative text remains unchanged or is not significantly improved, the application and enforcement dimension of Art. 12 DSA will likely only be effective if and when courts are called to interpret it. Until then, the risk is that Art. 12 DSA remains a paper tiger, ineffectual in regulating the private power of Big Tech vis-à-vis online speech.
To avoid this outcome, the EU legislator should first take a normative stand in the DSA and clarify whether the express purpose of Art. 12 DSA is to oblige providers to apply fundamental rights law in content moderation decisions. Platforms may already be going some way in this direction, as exemplified in Facebook’s Oversight Board decisions that apply freedom of expression principles under the International Covenant on Civil and Political Rights. Similarly, some national courts are applying fundamental rights to decisions taken by platforms to remove content (see a Dutch example here) due to their immense power over public debate online. Second, the legislative process should be used to incorporate more concrete links to Art. 12 throughout the DSA, so as to substantiate the meaning and effect of the provision. In particular, if the main concern is to constrain the private power of Big Tech, legislative intervention should focus on linking Art. 12 DSA to the due diligence obligations of VLOPs.