31 August 2021

The European Constitutional Road to Address Platform Power

In the last twenty years, European Union policy in the field of digital technologies has shifted from a liberal perspective to a constitutional strategy aimed at protecting fundamental rights and democratic values, driven by European digital constitutionalism. This paradigm shift was primarily triggered by the intolerance of the European constitutional system towards the consolidation of platform powers establishing standards and procedures that compete with the rule of law. Looking at online content, the deplatforming of Donald Trump and Facebook's decision to block news in Australia are just two paradigmatic examples of governance by platforms, extending not only over online speech but also over fundamental rights and democratic values.

Evidently, the rise of the algorithmic society has led to a paradigmatic change: public power is no longer the only threat to constitutional principles. The functions exercised by online platforms raise questions about how to safeguard fundamental rights and democratic values from the autonomous discretion of a private sector that is not bound by constitutional law. This encourages reflection on how constitutional law could evolve to face the challenges arising at the intersection of public authority and private ordering.

The Digital Services Act can be considered an expression of the Union's constitutional path to addressing platform power. It is one piece of the broader puzzle of measures to shape Europe's digital future; the GDPR and the proposals for the Digital Markets Act and the Artificial Intelligence Act are other examples of this framework. Against the consolidation of new areas of (private) power, we argue that European constitutional law provides instruments to address this situation. The horizontal effect of fundamental rights and the introduction of substantive and procedural safeguards are two primary instruments for protecting European constitutional values in the algorithmic society. The Digital Services Act, in particular, horizontally translates constitutional values into private relationships, thus representing an example of the European approach to limiting platform power.

Framing Platform Power from a Constitutional Perspective

The rise and consolidation of platform power is not just a coincidence driven by market dynamics. It is primarily the result of the liberal approach taken by constitutional democracies on both sides of the Atlantic towards digital technologies at the end of the last century. At that time, it was not possible to foresee this development. Nevertheless, immunizing or exempting these actors, Big Tech's predecessors, from liability for third-party content has contributed to the transformation of economic freedoms into something that resembles the exercise of powers vested in public authorities. In other words, the freedom to conduct business has since gained a new dimension, namely that of private power, which, it goes without saying, brings significant challenges for the role and tools of constitutional law. Instruments of private law or competition law are, in fact, no longer sufficient to capture the functioning of these actors.

Private actors are now vested with forms of power that are no longer of a merely economic nature. A broad range of decision-making activities is increasingly delegated to algorithms, which can advise and, in some cases, take decisions based on the data they process, thus mediating how individuals exercise their rights and freedoms. The case of content moderation shows how platforms take autonomous decisions in designing the rules of moderation and enforcing those standards, while balancing rights and freedoms in a way that mirrors constitutional review. These are examples of the exercise of quasi-public powers, which de facto establish an alternative model for defining the boundaries of online speech on a global scale.

The global pandemic has further highlighted the constitutional role of online platforms in the algorithmic society. On the one hand, private platforms have provided (information) services which even the State failed to deliver promptly; on the other hand, they have contributed to the spread of disinformation, inter alia by deciding to rely solely on automated moderation after sending human moderators home. In other words, their central role during the pandemic, for good and for bad, has led platforms to be thought of as public utilities or essential parts of the social infrastructure, even more so than before.

Despite this relevance, online platforms are private actors to whom constitutional law does not generally or directly apply, which limits the horizontal extension of constitutional obligations. Constitutional theory has traditionally framed power as vested in public authorities, which by default hold the monopoly on violence under the social contract. The distribution of power in the algorithmic society calls this premise into question.

Therefore, the consolidation of the algorithmic society requires dealing not only with the troubling legal uncertainty surrounding digital technologies, or with the abuse of powers by public authorities, but also with the consolidation of private powers that define standards of protection and procedures, aided by automated decision-making systems.

Searching for (Constitutional) Remedies

Constitutional law provides at least two remedies to mitigate the consolidation of unaccountable powers: the first concerns the horizontal application of fundamental rights vis-à-vis private parties; the second comes from the new phase of European digital constitutionalism, which looks to a constellation of substantive and procedural rights to increase the transparency and accountability of platform power.

The doctrine of horizontal effect extends constitutional obligations to private actors. Unlike the liberal spirit of the vertical approach, this theory rejects a rigid separation between public and private actors in constitutional law. While the doctrine is subject to narrower constraints in the US, as shown by the recent decision in Manhattan Community Access Corp. v. Halleck, in Europe there is more room to extend constitutional obligations to private actors when their freedoms reflect the exercise of public powers. In particular, cases in Italy and Germany have shown that platforms cannot take purely discretionary decisions on deplatforming political parties and figures; instead, they must take into account constitutional safeguards, which limit their ability to censor speech. Likewise, another German court decision addressing hate speech showed the limits that apply to content moderation. This framework underlines how courts are horizontally stretching constitutional values to limit platform power while enlarging the boundaries of what, in the US, would be called the public forum doctrine.

However, broader reliance on the horizontal effect doctrine has drawbacks. Applying the doctrine extensively could undermine legal certainty: virtually every private conflict can be represented as a clash between different fundamental rights, so constitutional obligations could in effect be extended to every private relationship. Further, since fundamental rights can only be applied horizontally ex post by courts, through the balancing of the rights in question, this process could increase uncertainty as well as judicial activism, with evident consequences for the separation of powers and the rule of law. Nevertheless, horizontal extension can be a strategic move for courts to underline abuses of freedoms or the performance of functions mirroring those of public authorities.

Given these drawbacks, it is also worth looking beyond the debate on the horizontal and vertical effects of fundamental rights in the digital age. An alternative instrument might be a digital habeas corpus of substantive and procedural rights, derived from the positive obligation of States to ensure the protection of human rights, which in the European context stems primarily from the framework of the Council of Europe. This obligation requires public actors to intervene in order to protect rights and freedoms from interference. While substantive rights concern the status of individuals as subjects of a kind of sovereign power that is no longer exclusively vested in public authorities, procedural rights stem from the expectation that individuals should be able to claim and enforce their rights before bodies other than traditional jurisdictional bodies, employing methods different from judicial discretion, such as technological and horizontal due process. Another option could focus on whether human dignity, which characterises European constitutionalism, can be enforced as a 'counter-limit' that, regardless of any horizontal or vertical effect, is likely to create sufficient constraints even for private actors, as the Omega judgment of the Court of Justice seems to demonstrate.

If, on the one hand, this new digital pactum subjectionis requires us to rethink how rights and freedoms are recognised and protected, it is, on the other hand, also necessary to understand how their enforcement can be made effective, how they can actually be put into practice. In other words, the claim for a new catalogue of substantive rights must be coupled with procedural guarantees that allow individuals to rely on a new system of rights and remedies limiting platform power. It is therefore necessary to consider the procedural counterweight to the creation of new rights, focusing on the fairness of the process by which individuals can enforce them.

The Digital Services Act Expressing European Digital Constitutionalism

Within this framework, the adoption of the Digital Services Act will play a critical role in providing a supranational and horizontal regime to mitigate the challenges raised by the power of online platforms in content moderation. This legal package promises a comprehensive approach to increasing transparency and accountability in content moderation. Its adoption can be considered a milestone of the European constitutional strategy in a field still governed by a regulatory framework dating back to 2000, established by the e-Commerce Directive. The Digital Services Act will also contribute to fostering the rule of law by counteracting the fragmentation resulting, for instance, from the introduction of different guarantees and remedies at supranational and national level by the Copyright Directive or the amendments to the AVMS Directive.

Even if the Digital Services Act proposal maintains the liability exemption rules for online intermediaries, it will introduce some (constitutional) adjustments to increase the level of transparency and accountability of online platforms. By addressing transparency gaps and providing for novel redress systems, the Commission aims to protect users from unwarranted interferences that could harm their constitutional rights to freedom of expression and protection from discrimination. At the same time, the goal is seemingly also that of guaranteeing the 'passive' dimension of freedom of information, that is, the right to receive pluralistic and unpolluted information, by making sure that individuals are more aware of the functioning of, and the risks connected to, recommender systems.

A variety of Digital Services Act provisions precisely limit the discretion of platforms in governing their services by introducing substantive and procedural safeguards. For instance, the Digital Services Act proceduralises the notice-and-takedown process (Article 14) and requires platforms to provide a statement of reasons when removing content (Article 15). It is also worth underlining how the Digital Services Act introduces additional obligations for "very large online platforms" (VLOPs), specifically with respect to content curation and the need to foster transparency around that activity. In particular, these platforms are required to conduct, at least once a year, a risk assessment of any significant systemic risks stemming from the functioning and use made of their services in the Union (Article 26), and to put in place reasonable, proportionate and effective mitigation measures (Article 27). Likewise, pursuant to Article 29, VLOPs will be required to set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used by their recommender systems. These obligations are just some examples of limits on platform discretion, pushing these actors to be more transparent and accountable in their content moderation processes, as inspired by the new phase of European digital constitutionalism.

Therefore, the Digital Services Act can be taken as an example of the resilience of the European constitutional model in reacting to the threats of platform power. This new phase should not be seen merely as a turn towards regulatory intervention or an imperialist extension of European constitutional values. It is, rather, a reaction of European digital constitutionalism to the challenges posed to fundamental rights and democratic values in the algorithmic society. In effect, this framework underlines how constitutional law can play a critical role in limiting platform power while promoting the protection of fundamental rights and freedoms.


3 Comments

  1. Michael Stevenson, Tue 31 Aug 2021 at 18:18

    An interesting article. However, it is a pity that the terms are used so ambiguously. What does “constitutional” mean, what is meant by “constitutionalism”? Is “constitutionalism” involved simply because rights are brought into play? Or does the term always come into play when something is particularly important politically?
    Or is “constitutionalism” simply the “icing on the cake” to spice things up a bit?

  2. Martin Gak, Thu 2 Sep 2021 at 13:00

    This seems simultaneously convoluted and quaint. The vertical-horizontal question seems entirely misplaced. It is simply the case that enforcement requires verticality. The idea of horizontality at the beginning of the paper seems to start as some obscure adumbration of supererogatory acts from the platforms and then devolves into a conceptually incoherent idea of the relation between jurisdictional authorities and platforms.

    I am at an entire loss about what this passage might mean: “Despite this relevance, online platforms are private actors, to whom constitutional law does not generally nor directly apply, thus limiting the horizontal extension of constitutional obligations.” I am not sure if there is a school of law that claims that constitutional provisions (constitutive regulae) does not apply (directly or otherwise) to private actors.

    To a large degree this seems to make two claims that, while possible to juggle in very large rings where the leisure for malabarism can be accommodated, leave no such space given the political urgency of the violation of campaign financing through digital means, the organization of possible crimes against humanity (Burma), the governance and curtailing of fundamental European values like the public presentation of nudity, the witting or unwitting aiding and abetting of criminal activity (ISIS and NPD on Facebook), etc., and require not a legal but a politically tough-headed solution.

    The private platforms have occupied a large swath of the European public space and have become the bodies of governance of a space that from 1945 on we have agreed to determine by democratic and republican ideas. The usurpation of democratic rules with TOSs is not merely to be corrected with regulatory instruments. One would imagine that January 6th 2020 settled that argument.

    Evidently, not.

  3. Judit Bayer, Sun 21 Nov 2021 at 15:13

    This is the best analysis of platforms' power that I have read recently. I enjoyed it a lot, in particular its approach to the quasi-public powers and its framing and description of the horizontal human rights issue. (Anyone who does not understand that is probably not familiar with its background.) But I do not see the brief description of some of the DSA's aspects as well embedded in this concept. It is all too obvious that the DSA simply does not respond to the conceptual challenges that this piece so clearly framed.
