Ireland cannot do it alone
All Member States should contribute to the supervision of very large online platforms under the Digital Services Act
Due diligence obligations for online platforms are probably the most relevant aspect of the European Commission’s proposal for a Digital Services Act (DSA). These rules will be enforced through administrative sanctions. If, for example, Facebook fails to implement user-friendly flagging mechanisms to report illegal content (Art. 14(1) DSA), the competent public regulator might impose a fine on Facebook.
Such an enforcement regime relies heavily on the capability and willingness of the competent regulators to fulfil their oversight role. It therefore matters greatly which regulator has jurisdiction. The DSA relies on a strengthened country of origin principle: only the Member State where a platform is established will be competent in the first place. The Commission will act as an additional regulator of last instance.
This new oversight structure might prove counterproductive. Our experience with the country of origin principle has been unsatisfactory at best (free flow of services, but no free flow of protection and enforcement). And since the DSA does not envision the Commission as a strong super-regulator, its new role will not offset the enforcement barriers that come with a strict country of origin principle. The oversight structure of the DSA therefore requires recalibration. Europe needs to start this discussion now, as Member States have begun to take positions on the DSA proposal and the Commission is pushing to move forward with its legislative mega-project.
In the following, I analyse the shortcomings of a strict country of origin approach and suggest an alternative oversight model that would allow all Member States, especially those willing and able to commit resources, to contribute to the oversight of very large platforms.
Status quo: Ireland in charge, target countries relying on exemptions
Under the existing legal framework, the country of origin is responsible for ensuring that a provider complies with its laws (Art. 3(1) E-Commerce-Directive). Other Member States are not allowed to apply their own legal regimes, for example intermediary liability rules for hate speech, because doing so would “restrict the freedom to provide information society services from another Member State” (Art. 3(2) E-Commerce-Directive).
Overall, the country of origin principle creates a one-stop-shop solution: online platforms only need to understand and follow the laws (and orders) of the Member State where they are established. In areas of law with little harmonization and where cross-border enforcement is far from effective (e.g., intermediary liability for hate speech), this results in a very industry-friendly regime (free flow of services, but no free flow of protection and enforcement).
Art. 3(4) of the E-Commerce-Directive creates a seemingly very narrow backdoor, enabling other Member States affected by an online service (target countries) to take measures in specific cases, provided they follow a detailed procedure. National intermediary liability rules like the German NetzDG and the French Loi Avia, which apply to platforms established in other Member States, all rely on this exemption. It is heavily disputed whether the exemption can justify such laws (for an in-depth discussion of this topic see D. Holznagel, CRi 2020, 103-109).
The future framework – exclusive oversight by Ireland and a Commission “leading from behind”
Art. 40(1) of the draft DSA will override Art. 3(4) of the E-Commerce-Directive. This means that for future enforcement of the DSA’s due diligence obligations, only the country of origin will have jurisdiction, without exception. For very large platforms, the European Commission will be a regulator of second instance. Other Member States cannot start proceedings on their own but must refer matters to the country of origin or the Commission.
Under such an oversight structure, the role of the Commission would be crucial to counterbalance the not-so-unrealistic scenario that the country of origin (for most Big Tech companies, that is Ireland) fails to fulfil its oversight role for the whole Union and all its citizens. However, the DSA will make it very hard for the Commission to fulfil this role. In principle, the DSA attributes the task of oversight to the Member States (DSA, recital 72). The Commission must work through a lengthy procedure before it can finally issue a sanction. This procedure will offer the platforms multiple lines of defense, enabling them to water down any charges against them and to adjust their behavior before they have to fear a final sanction. The Commission is thus more likely to slowly lead from behind; it will not be in a position to substitute for a passive country of origin, not least because it is budgeting only 50 full-time employees for its new oversight role. One can guess that, for political reasons, the Commission is hesitant to become a “super-regulator” for online platforms.
We should include, not exclude, 26 Member States in oversight over the “Giants”
With a Commission unwilling and unable to take up the role of a super-regulator, excluding 26 of 27 Member States from contributing to oversight and relying exclusively on the country of origin in the first place is a bad idea.
Having faith in Ireland (or any other country of origin) alone is not good enough
So far, and with just a few exceptions, Europe has failed to build up meaningful oversight over online platforms in the field that the DSA will cover (for example, measures against the spread of illegal hate speech). Ireland, where Facebook, YouTube, Twitter and the like are established, has been either unwilling or unable to conduct even minimal oversight of all the Dublin-based mega-platforms. One can draw parallels to the (lack of) enforcement of the GDPR, where observers describe a regulatory standstill. However, we can skip a potential blame game. When it comes to compliance with the DSA, we are up against gigantic platforms with unprecedented impact on our citizens, societies, democracies and economies. Policing such huge platforms’ compliance requires large resources and might easily overwhelm any single national regulator. Therefore, we should be happy to get as many national authorities into the ring as we can, if they are willing to spend the necessary resources (if you are going against Goliaths, take all the Davids you can get).
Very large platforms should not be allowed to cherry-pick their regulator
Under the current draft DSA, very large platforms are effectively free to cherry-pick their regulator by announcing establishment in the respective country. It is noteworthy that all major platforms are – in their own words – run by Irish subsidiaries of prominent parent companies; these include Facebook, YouTube, Twitter, TikTok and Instagram. It is also telling that shortly after the revision of the AVMSD, which effectively strengthens the E-Commerce-Directive’s country of origin principle for video-sharing platform providers, YouTube quietly changed its imprint and terms of service in January 2019. According to these documents, its services were no longer provided by the U.S. Google LLC but by Google Ireland Limited. For providers without an establishment in the European Union, the draft DSA will make cherry-picking even easier: by choosing the seat of the legal representative, per Art. 40(2) DSA. The country of origin principle therefore leads to a race to the bottom, as is the case under the E-Commerce-Directive, where companies are attracted to countries with little appetite for enforcement. It can also trigger a cat-and-mouse game: when a very large platform suddenly switches jurisdiction to a smaller Member State, it might take years until that country can boost its regulator with the resources required for the immense task suddenly imposed on it.
Local expertise is crucial
Even harmonized due diligence obligations will still have to take national laws into account, because relevant obligations in the DSA implicitly require an evaluation of whether platforms are handling illegal content adequately. Yet what amounts to illegal content or activity is widely non-harmonized among Member States (e.g. Holocaust denial), and the DSA does not attempt harmonization. Local authorities will better understand the applicable national norms, the languages involved, and the cultural and factual backgrounds.
Include all 27 Member States as “agents of a common cause” under Commission coordination
So what is the alternative? How can we (1.) get as many regulatory resources into oversight as possible and (2.) prevent a single Member State from becoming a bottleneck of enforcement, when (3.) the European Commission will not be in the position of an ambitious super-regulator and (4.) oversight for the whole Union requires expertise in various local languages, cultural norms and contexts?
I suggest that all Member States should be able to contribute to oversight over very large platforms, acting as “agents” of the Commission. The Commission would coordinate their efforts, keep track of resources and progress made, and assign cases to authorities that have the capacity to handle them.
As a starting point, I suggest that all Member States have jurisdiction over very large platforms. This proposal is not new: the DSA already uses this concept for providers with no establishment in the Union, per Art. 40(3). Any sanctions must be proportionate to the relevance of the platform in that Member State, thus respecting the principle of ne bis in idem. This, too, already applies to providers with no establishment in the Union (Art. 40(3)).
In the model that I suggest, the European Commission could intervene at any time and either (1.) decide to proceed with the case on its own or (2.) appoint a single Member State to proceed for the whole Union in cooperation with other Member States. That way, when several Member States start investigating a very large platform’s compliance, the Commission would be free to (1.) let these Member States proceed with parallel cases (which would respect ne bis in idem, see above), (2.) consolidate the proceedings, or (3.) take the case itself. The Commission would remain the regulator of second instance, as now foreseen in the draft DSA.
Alternatively, the future European Board for Digital Services could be put in a position to assign jurisdiction over certain platforms to different Member States by majority decision.
Of course, this proposal might draw criticism. For example, one might argue that jurisdiction of 27 Member States will create legal fragmentation. But this would be short-sighted: the compliance rules themselves are harmonized (Chapter III of the DSA), and even the current draft DSA requires 27 regulators to interpret these rules for the platforms established in their territory. In any scenario, the Commission would always be the regulator of last instance and the European Court of Justice would deliver the final interpretation, guiding all Member States. Finally, the fear of fragmentation is exaggerated in the first place. When emphasizing fragmentation (see, e.g., recitals 2 and 4 of the DSA), the DSA is not spotlighting Europe’s initial and substantial problem that requires legislative action (the inactivity of Ireland), but the reactions meant to solve it (national measures like the NetzDG or the Loi Avia). Keep in mind also that the Commission is somewhat cursed to highlight the fragmentation issue in order to justify the European Union’s competence for the DSA (based on Art. 114 of the Treaty).
Another argument might be raised against the suggested model: it does not seem very Union-friendly, as Member States will start to dispute which country should be in charge and how oversight should be handled. However, as the very large platforms affect all Member States, this is inescapable. Even under the draft DSA, Member States will have to argue that the country of origin is too passive in order to get the Commission to intervene (Art. 45(1) and 45(5)). And honestly, national authorities trying to demonstrate that they can do better at regulating the giant platforms is a scenario we should not be afraid of. On the contrary, it should be very much welcomed.