Trump’s Indefinite Ban
Shifting the Facebook Oversight Board away from the First Amendment Doctrine
After months of waiting, the decision has come. The now famous Facebook Oversight Board (FOB) has upheld Facebook’s January decision to ban former President Donald Trump from accessing and posting content through his Facebook and Instagram accounts.
While this decision still raises broader questions about the legitimacy of the Board and the private enforcement of human rights online, it more importantly underlines a trend in how the FOB applies protections of free speech. The FOB’s increasing reliance on the principles of proportionality and transparency is a paradigmatic example of its ever-growing distance from the First Amendment dogma characterising US constitutionalism and of its proximity to the European (digital) constitutional approach.
The Call for Transparency and Proportionality
Trump’s use of the platform was found “to incite violent insurrection against a democratically elected government”. In the specific case that led to the FOB decision, Facebook first removed two pieces of content (a one-minute video and a post) published by Donald Trump during the Capitol Hill attack and then suspended the relevant accounts by applying a 24-hour block. The day after, however, Facebook decided to extend the ban “indefinitely and for at least the next two weeks until the peaceful transition of power is complete”. On January 21, Facebook referred the case to the Oversight Board, also in light of the five previous violations of its Community Standards Trump had committed. What the FOB has stressed most is the absence of clear and transparent rules on how such ‘digital bans’ can be applied, reviewed or revoked.
The main problem highlighted by the FOB decision lies in the lack of transparency resulting from the absence of clear rules, most notably for account-level sanctions against influential users. This is a key point in the view of the FOB, also considering the claim advanced by some political figures that private censorship is driven by political bias: “the lack of transparency regarding these decision-making processes appears to contribute to perceptions that the company may be unduly influenced by political or commercial considerations”. Here, the FOB sheds some light on the close connection between content moderation and democracy.
Social media platforms’ reactions to Trump’s posts on the January 6 Capitol Hill siege triggered a broad debate on the relationship between state power and online platforms. Similar issues are being discussed in Europe. In Italy, for instance, between 2019 and 2020 the Court of Rome delivered some interesting but inconsistent decisions on Facebook’s power to disable access to the pages of two far-right political movements (see here for one of them). The private enforcement of freedom of speech on social media platforms is key for both European and US constitutionalism, also in view of ongoing reforms that may increase transparency and accountability through the adoption of the Digital Services Act and the revision of Section 230 of the Communications Decency Act.
“What if?” – Applying US standards
The decision could be seen as an emancipation of the Board from US standards. Indeed, its modus operandi for determining legitimate restrictions on freedom of speech and striking a fair balance between the interests at stake appears to be driven by a proportionality-based assessment far from the First Amendment dogma. If this holds true, there are good reasons to step back from the merits of the FOB decision and ask how a US court would have handled such a matter in light of First Amendment jurisprudence.
To answer this question, it is worth recalling Brandenburg v. Ohio, delivered by the US Supreme Court in 1969, concerning the incitement of lawless action through inflammatory speech. This landmark decision revisited the previously established “clear and present danger” test. The Court held that speech may be restricted consistently with the First Amendment only when two requirements are met: a) the speech is directed at inciting or producing imminent lawless action; and b) the speech is likely to incite or produce such action.
The FOB concedes that Facebook was right to rely on such First Amendment jurisprudence when removing Trump’s posts. In the FOB’s opinion, posts by influential users pose a high probability of imminent harm. Furthermore, the persistent narrative of electoral fraud, accompanied by repeated calls to action, made it possible to establish a serious risk of violence. The key point to address, however, is not the legitimacy of removing the posts published on January 6, but rather that of the indefinite suspension of Trump’s Facebook and Instagram accounts. The initial block was supposed to last (at least) two weeks, until the completion of the transition of power. Such a temporary measure would have connected the restriction on Trump’s accounts to the material risk of harm that their use could cause during the democratic transition. “Silencing Trump”, in other terms, could make sense in light of the unprecedented and special circumstances of the case.
The indefinite extension of such a measure, however, seems incompatible with the initial rationale behind Facebook’s decision. Could the support and praise expressed by Trump in his January 6 posts create an indefinite risk of imminent lawless action, even after Biden took office on January 20? Based on the assessment in this case, the answer would be negative. At first glance, the FOB seems close to the US doctrine of clear and present danger. Nonetheless, how the assessment is conducted is what makes the FOB’s approach increasingly distant from the US First Amendment standard.
Moving towards European standards?
Against the backdrop of the US constitutional scenario, the FOB’s approach is departing from the US paradigm. Despite the content creator’s demand to “defer to American law in this appeal”, the FOB applied the international human rights law standards for restricting the right to freedom of expression – the principles of legality, legitimate aim, and necessity and proportionality. The principle of proportionality was particularly relevant to the Board’s advice that Facebook provide a more granular response rather than a permanent ban.
The factors considered in the assessment underline the relevance of the principle of proportionality. Relying also on the Rabat Plan of Action, the Board highlighted how the context of high political tension, the status and reach of the speaker as a political figure, their intent, the characteristics of the inciting content, and the imminence of harm would lead to assessing Facebook’s decision as proportionate, even had it consisted of a permanent ban.
Elevating proportionality not only leads the Board into the dimension of human rights law, but also brings it closer to the European standard of judicial review. Looking at the European Convention on Human Rights or the Charter of Fundamental Rights of the European Union, European constitutionalism does not tolerate disproportionate interferences with constitutional rights, while refusing to recognise the axiological prevalence of any single right. Indeed, European courts are active players in balancing fundamental rights through the principle of proportionality. The influence of European values was also visible in the position of the minority of the Board, which would have found Trump to have violated even more Community Standards, given the importance of dignity, which, in European constitutionalism, constitutes one of the overarching values of the entire system.
Besides, the Board’s call for more transparency and clarity is also in line with the European strategy to increase the accountability of online platforms. The draft Digital Services Act can indeed be considered a milestone in this process, and potentially a global standard limiting the exercise of private powers in content moderation. The proposal aims to establish more specific obligations for intermediaries reflecting the nature of the service provided: it is no coincidence that very large platforms (such as Facebook) would be subject to stricter obligations, including transparency requirements, in light of the particular risks they pose in the dissemination of illegal content and societal harms. This approach would remedy the lack of clarity about which the Board complained in relation to the standards for moderating public figures’ speech.
The FOB in the Fog of Private Powers
In this case, it appears that neither US nor European constitutional law would protect Trump’s speech. Rather, the key point concerns how online platforms can handle similar matters and what they can do in the future, should similar issues arise. It is a procedural but also a deeply substantive problem that mirrors the critical nature of the relationship between private and public power when it comes to the enforcement of freedom of speech.
Within this framework, transparency and proportionality could be defined as the guiding principles of the Board. Still, the case reveals that Facebook is not entirely keen on these values: it did not share information which could have changed the balance of the proportionality assessment. Relying on a procedural argument, the platform declined to answer questions put by the FOB, claiming that they were not required for decision-making under the charter and fell outside the scope of the FOB’s scrutiny. Nonetheless, this information would have enriched the context and contributed to striking a fairer balance in this case. Indeed, how Facebook’s news feed and other features impacted the visibility of Trump’s content, or whether suspending or deleting an account affects the ability of advertisers to target the accounts of followers, is hardly neutral information for understanding Facebook’s logic in deciding to block the accounts in question.
Facebook’s resistance to the FOB’s development is part of the game. The question is whether Facebook created the Board to ensure oversight and increase its accountability, or to institutionalise a process of human rights review that is still driven by private standards. This case has shown an increasing proximity to European standards, thus marking the principles of transparency and proportionality as beacons in this new age of private enforcement of human rights in the information society.
Either Facebook is just a private company like any other, and then blocking someone’s content is merely a matter of private litigation between company F and its client T; or Facebook has become a natural monopoly, and its acts should meet the same standards as if it were the government itself.