The UK’s new Online Safety Bill has finally been agreed, over two years after it was first introduced. The OSB, which the government boldly claims ‘will make the UK the safest place in the world to be online’, cleared its final parliamentary hurdle on 19th September and is awaiting Royal Assent.
The OSB has proved controversial, and has been subject to intense debate both within Parliament and in the media. Civil liberties groups have previously decried it as a ‘censor’s charter’, while the NSPCC by contrast claims it marks a “new era for children’s safety online”. Such scrutiny has led to both piecemeal additions to the Bill and a dilution of its scope, most notably the removal of a requirement on tech companies to take down ‘legal but harmful’ content for adults. So where does that leave us?
Who is caught by the new regime?
The two key categories of services that fall within the ambit of the OSB are: (i) user-to-user services, being internet services which contain the functionality for a user to encounter content uploaded to, shared on, or generated on, the service by other users; and (ii) search services, which are internet services allowing users to search multiple websites or databases. Those services must also have links with the UK to be caught.
Certain services are carved out of the OSB’s scope. This includes e-mail services, internal business services, and services whose only user-to-user interaction is via ‘below-the-line’ content, such as comment sections. Even so, the government’s impact assessment estimates 25,000 organisations remain in-scope in the UK, with Ofcom putting the total figure at over 100,000 when factoring in overseas companies.
It is also worth bearing in mind that even for those not within the OSB’s scope, other UK regulators have shown an interest in the harm caused by online services. For example, the ICO and CMA drew attention to harmful website design practices in August of this year.
What are the new rules?
The OSB introduces a number of new duties. Examples of the duties placed on service providers include:
- Illegal content: using proportionate measures and processes to prevent users encountering ‘priority’ illegal content (such as terrorism content) and to minimise the time such content is present, and taking down any illegal content swiftly when alerted to it.
- Risk assessments: conducting, and keeping up-to-date, an ‘illegal content risk assessment’, including conducting such an assessment before making a significant change to a service.
- Content reporting and complaints: allowing users to easily report illegal content, and operating an easy-to-use and transparent complaints procedure that allows users to complain about, among other things, their content being taken down.
- Free speech and privacy: explicitly taking account of the importance of freedom of expression, and privacy laws, when implementing online safety measures.
'Category 1' organisations (which will be designated later, but which are expected to include the biggest social media players) will have some additional obligations, and there are also further duties where services are ‘likely to be accessed by children’.
The latter include obligations such as conducting specific ‘children’s risk assessments’ and using proportionate measures to prevent children from encountering ‘primary priority’ harmful content (such as suicide, self-harm, or eating disorder content). This is a key focus of the legislation, and service providers may be mandated to use ‘highly effective’ age verification or estimation techniques to manage children’s access to content.
The Bill will also require certain providers to use reasonable measures to prevent individuals from encountering fraudulent adverts, and introduces (or bolsters) criminal offences relating to cyber-flashing, revenge porn, encouragement of self-harm, and threatening communications.
One particularly controversial aspect of the OSB is the power it grants Ofcom to require in-scope providers to use or source technology to scan private messages for child abuse content, which prompted some tech companies to threaten a withdrawal from the UK market. Companies including WhatsApp and Signal argued that such technology could not exist without dismantling encryption and eroding privacy. The government has since sought to clarify that it has no intention of “weaken[ing] the encryption technology used by platforms”, explaining that any such technology must meet standards set out in the Bill before Ofcom can demand its use. While some have viewed this as a concession that the power will not in practice be exercised, the legislation remains unchanged.
Will the new law have teeth?
Ofcom will have powers to issue large fines for those that do not comply with the new regulatory regime. Fines will be up to the greater of £18 million or 10% of global annual revenue.
The Bill also includes the possibility of personal criminal liability for ‘officers’ (i.e. directors, managers) for offences - such as a failure to comply with children’s safety duties - committed ‘with the consent or connivance’ of that officer, or where it is ‘attributable to any neglect’ of that officer.
What happens next?
Ofcom expects its powers to commence two months after the OSB becomes law.
Ofcom will then begin carrying out a series of phased consultations relating to draft Codes of Practice and guidance. Phase one will focus on illegal harms, and will be published shortly after Ofcom’s powers commence; phase two will focus on child protection duties; and phase three will focus on the particular requirements that fall upon Category 1 and Category 2 services. Ofcom expects to submit its advice to the government on categorising services six months from Royal Assent.
Ofcom's expectation is that the illegal content safety duties will come into force, and its first Codes of Practice will be issued, approximately one year after its first consultation commences, with companies expected to conduct their risk assessments in the months preceding issuance.
We will be producing a longer-form article on the Online Safety Bill and have also produced recent content on the Digital Services Act, the EU’s legislation for creating a ‘safer digital space’.