THE LENS
Digital developments in focus

House of Lords: we need an Online Harms Bill to save our democracy

Lockdown has created a perfect storm of debate around online content. We are more reliant than ever on our social media drip-feed, which presents us with conspiracy theories and “fact-checks” political figures. Corporations have been pulling out of Facebook advertising under the banner of #stophateforprofit. In the EU, political gears have also been shifting as the European Commission launched a consultation on a new Digital Services Act which will overhaul existing rules on online liability (see our blog posts here and here).

By contrast, progress in the UK regarding online harms has been somewhat more leisurely. The UK Government published an Online Harms White Paper in 2019, and responses to the subsequent consultation flagged concerns about its generous timetable (the DCMS has indicated that legislation might not come until 2024). Respondents also argued that the White Paper did not address the urgent action required in respect of electoral interference and political advertising, given that the election battlefield has shifted online. Against this backdrop, the House of Lords Select Committee on Democracy and Digital Technologies (the “HoL Committee”) was established in June 2019 specifically to investigate how digital technology can be used to support, rather than undermine, democracy.

After a year-long investigation, the HoL Committee published its report on “Digital Technology and the Resurrection of Trust” on 29 June 2020 (the “Report”). The Report reiterates the urgency of introducing online harms regulation in the UK. The Chair, Lord Puttnam, described the government’s current 2024 deadline as “unacceptable” and called for a Draft Online Harms Bill to be published “immediately”. As long as voters cannot trust the information they receive and remain vulnerable to manipulation, the core values of our democracy are at risk.

Key findings

The Report makes 45 recommendations for the UK Government. Key findings include:

  • Process is not enough – in some cases platforms need to be directly responsible for the content they host, rather than simply ensuring they have adequate systems and processes in place.
  • Harmful, but still legal, content must be regulated – the Report focuses on “misinformation” (inaccurate information spread by honest mistake) and “disinformation” (inaccurate information spread with a purposeful intent to mislead). The Report is clear that both should be regulated, even where the content is not illegal.
  • Financial penalties – Ofcom should be able to take action against digital platforms that do not uphold their duty of care to protect users from harmful content, including by issuing fines of up to 4% of global turnover for serial offenders (equivalent to GDPR-level fines).
  • Different liability for different content – the greatest sanctions should be reserved for cases where platforms rank, recommend or target content to users, and should apply only once the content has reached a certain level of virality or is produced by users with large audiences.
  • Transparency – an Ofcom code of practice should require online platforms to conduct regular internal and external audits of their algorithms and their effect on users at risk of discrimination.
  • Restrictions on political advertising – another code of practice should ban inaccurate political advertising and allow for its takedown.
  • An independent ombudsman – a content moderation body should be set up in the UK to respond to appeals from users.

What next?

Although the HoL Committee is unable to enforce its findings, the UK Government is required to respond to the Report within 60 days of its publication (so by the end of August). This will increase the pressure on the UK Government to focus on pushing through the draft legislation, as well as stimulate wider public debate.

The Report signals another step towards a future of increased regulation and in turn an increase in the “monitoring” activities of online platforms (as is now required in relation to copyright works under the EU’s new Copyright Directive – see our update here). It may also elicit concerns around free speech and the risk of political censorship if the rules go too far. On the other hand, it could be argued that our speech is not currently “free”; it is just that platforms and algorithms influence what we see, rather than governmental bodies. The Report’s nuanced approach towards liability levels for different categories of content may be a welcome compromise.

As noted above, the Report comes at a time when reform of online content liability is being hotly debated elsewhere, including in the EU. The extent to which the UK Government will reflect the EU’s upcoming Digital Services Act (which will update the existing E-Commerce Directive) in national legislation remains uncertain, and the Report may sway the new UK legislation towards taking an even harder line than the Digital Services Act on general monitoring and platform liability in respect of political misinformation and disinformation.

“Technology is not a force of nature. Online platforms are not inherently ungovernable. They can and should be bound by the same restraints that we apply to the rest of society.”


Tags

online harms, content, digital services act, online platforms