The UK government has published its long-awaited draft Online Safety Bill, two years after publication of the Online Harms White Paper. But can the bill deliver on the government’s ambitious manifesto commitment to make the UK the “safest place in the world to be online”?
What does it cover?
The bill builds on the approach outlined in the Online Harms White Paper:
- It covers not only illegal content but also certain content that is legal but harmful, including fake investment opportunities and romance scams (both added since publication of the white paper). There is a particular focus on content that is harmful to children.
- It would apply to providers of user-to-user services and all search services where the service has links with the UK.
- It sets out categories of services subject to varying degrees of regulation, with an overlay of general duties that apply across all categories. For example, providers of category one services (which will include the largest social media sites) will need to set out in their terms and conditions how they will address online harms. The Secretary of State will make regulations setting out threshold conditions for each category, and Ofcom will maintain a “register of categories” listing the services that meet the threshold conditions for each category.
Enforcement: high fines
Ofcom will have the power to issue large fines for breaches of the new rules - up to the higher of £18m or 10% of global annual turnover. It will also have the power to block access to non-compliant sites in the UK.
The bill also includes a new criminal offence for senior managers of companies that do not comply with Ofcom’s requests for information, although this will not automatically come into effect. The government has stated that this could be introduced at a later date if “tech firms don’t step up their efforts to improve safety.”
Could the bill curtail freedom of expression?
Given the looming threat of large penalties, there is a fear that the regime could excessively curtail free speech if service providers over-censor (removing large swathes of content) rather than risk breaching the law.
The government has built protections into the bill that aim to combat this threat. All companies in scope will need to put in place safeguards for freedom of expression when fulfilling their duties, and service users will need to have access to effective routes of appeal for content that has been removed. Users will also be able to appeal to Ofcom. In addition, category one service providers will have to conduct and publish assessments of their impact on freedom of expression and demonstrate that they have taken steps to mitigate any adverse effects. They will also have a statutory duty to protect UK users’ access to journalistic content, and additional obligations relating to “democratically important” content. The government has somewhat confidently stated that these measures “remove the risk that online companies adopt restrictive measures or over-remove content in their efforts to meet their new online safety duties.”
Lack of clarity
One wonders how such risk can so definitively have been removed when the bill contains various duties and formulations that remain, somewhat necessarily, vague. For example, content that is harmful to children is defined as content which the Secretary of State has designated in regulations as either “primary priority” content or “priority” content which is harmful to children, or which meets conditions set out in the bill. These conditions include where a provider has reasonable grounds to believe that the nature of the content risks directly or indirectly having a significant adverse physical or psychological impact on a child of ordinary sensibilities. The definition of content that is harmful to adults is similar. But who decides what constitutes a child or an adult “of ordinary sensibilities” and what would have an impact on them? The Lords Communications and Digital Committee has written a letter asking the government a number of questions about this “unclear” language, including whether the terminology used in the bill relating to psychological impact has any clinical basis.
A wrestle with an octopus
As Ruth Davidson, former MSP, put it, this is an octopus with which Oliver Dowden MP, Secretary of State for Digital, Culture, Media and Sport, knows he is wrestling. On one side, the government faces criticism that the bill goes too far and will curtail free speech. On the other is the argument that harmful content online has gone unchecked for too long; before the bill was published, the government faced criticism that a new regime was needed urgently to save democracy. And there are still concerns that not all online harms fall within its scope. Fraud committed via advertising, email or cloned websites does not fall within the regime, and a letter penned by the Lords Communications and Digital Committee also expressly noted a concern that user-to-user pornographic services are not covered.
The lobbying attached to this bill does not seem to be over, and we may still see further changes. The government has stated that the draft bill will be subject to pre-legislative scrutiny by a joint committee of MPs during the current parliamentary session, with a final version of the legislation introduced before parliament later this year.