On 2 June 2020 the European Commission launched an official consultation seeking views on its proposal for a new Digital Services Act (the “DSA”) to update the existing E-Commerce Directive – see our recent blog post. The EU has controversially promised that the DSA will contain new illegal content liability rules for digital platforms, which will need to coexist with the stringent content monitoring required by EU legislation covering more specific areas of harm (such as the DSM Directive in relation to copyright – discussed here).

Under the existing regime, “information society services” (“ISSs”) benefit from a “safe harbour” whereby they are not directly liable for illegal content provided they comply with certain conditions. ISSs also cannot be required to proactively monitor for illegal content – a safeguard frequently tied to the right to freedom of expression.

However, the E-Commerce Directive was adopted in 2000; for context, none of Facebook, YouTube or Instagram had been founded at that time, and Amazon had yet to turn its first profit. Platforms have since grown exponentially, and content-related scandals have fuelled demands that they “do better” at policing content. Inconsistent CJEU case law and divergent member state approaches have also exposed uncertainties which need to be resolved.

Recent debate

European parliamentary committees have recently been vying to shape the approach the DSA should take, with an overall preference for retaining the existing safe harbour regime while ensuring platforms are held more effectively to account:

  • The Committee on Legal Affairs’ legislative report (22 April 2020) focuses on ways the DSA can increase regulatory oversight of large platforms. It recommends establishing clear content moderation procedures and a “notice and action” framework, with any final decision regarding legality of content being made by a judicial rather than a private body.
  • The Committee on the Internal Market and Consumer Protection’s legislative report (24 April 2020) favours retaining the existing liability framework, whilst also proposing a legally binding takedown mechanism with recourse to an out-of-court dispute settlement and clarification regarding “active” and “passive” hosting.
  • The Committee on Civil Liberties, Justice and Home Affairs’ own-initiative report on fundamental rights issues posed by the DSA (27 April 2020) argues for the creation of a new EU regulator. Whilst not a content moderator per se, it could impose sanctions based on a platform’s transparency and how much it “amplifies” illegal content; users should also be able to enforce their own rights easily online.

Industry bodies have also been lobbying for the DSA to protect their members’ interests, with the typical divide emerging between tech platforms and traditional publishing organisations. For example, the EU Tech Alliance, which represents European scale-up tech companies, published a paper (4 May 2020) arguing (perhaps unsurprisingly) that the current regime should be maintained and that liability should fall primarily on the individuals posting content, not on the platforms hosting it. It suggests clearly distinguishing between illegal content (which should fall within the liability regime) and lawful but potentially harmful content (which should not).

What’s next?

The European Commission’s consultation is open until 8 September 2020 and will be critical in informing its proposals for the DSA, to be published at the end of the year. Keep an eye on The Lens for further updates.

The UK, meanwhile, has expressed a conflicted attitude to content monitoring: its Online Harms White Paper takes a robust approach, but it has also announced that it will not implement the DSM Directive, which indicates an intention to avoid stringent copyright monitoring. The UK Government has previously said that it will retain the provisions of the E-Commerce Directive as far as possible post-Brexit, and it will be interesting to see whether its attitude changes in response to the DSA.