Online harms
Central to the Government’s digital strategy is its intention ‘to make the UK the safest place in the world to be online and the best place to start and grow a digital business’.
Part of walking that tightrope, between online safety and business innovation, is the Government’s commitment to protect internet users from online harms. In its Online Harms White Paper, published in April 2019, the Government describes a number of harms ranging from live-streamed terrorist attacks and self-harm imagery to disinformation and trolling. The harms flowing from the Facebook Live broadcast of the Christchurch massacre are obvious and abhorrent. But less visible harms also shape many users’ experiences online: a 2017 NHS report showed that 20% of children aged 11-19 experienced cyberbullying that year.
In its White Paper the Government proposed a regulatory framework to tackle these harms and to place more responsibility on companies for the safety of their users online. Following three months of consultation, the Government has now published its initial response.
Key themes in the Government response
A key theme of the consultation was the need to provide clear definitions of the harms in scope. There was concern that terms in the White Paper such as ‘coercive’ or ‘intimidating’ left too much room for debate and uncertainty. There was also a concern amongst respondents that a wide-reaching obligation on companies to search out and remove harmful, but legal, content on their sites would be an unacceptable fetter on freedom of speech and expression.
The Government responded that the new regulation will set differentiated expectations for illegal content on the one hand, and content which is legal but potentially harmful on the other.
Importantly, the regulation will not require removal of legal content: companies will be able to decide what type of legal activity is acceptable on their services, provided they are fully transparent with users about potentially harmful content available on their site. In-scope companies will need to ensure illegal content is removed expeditiously and that the risk of it appearing is minimised by effective processes.
The framework will therefore focus on platforms’ wider systems for dealing with online harms. In-scope companies will need clear and accessible processes which demonstrate how they monitor harmful content.
A further criticism of the White Paper was the lack of clarity around which businesses would be in scope of the regulation. The Government has responded that it expects fewer than 5% of UK businesses to be caught by the regulation: only companies that facilitate the sharing of user-generated content (for example through comments, forums or video-sharing). A mere ‘social media presence’ will not put a business in scope, nor is the regulation designed to catch B2B services.
Whilst there remains a lack of detail for businesses at this stage, the Government promises regulatory guidance to help businesses understand whether the services or functionality they provide online would fall under the regulation.
The Government has also said it will ensure the requirements are not disproportionately burdensome for small businesses. There was a particular concern that the proposal would be so far-reaching, and compliance so resource-intensive, that only the corporate giants (Facebook, Google etc) would be able to comply, effectively pushing smaller contenders out of the market.
The Government is minded to appoint Ofcom as the regulator to oversee the new Online Harms regulation. Ofcom may well be best placed, given its high-profile experience and its relationships with many major players online; it has also already published a paper on harms arising online. It will be interesting to see how its role develops alongside other regulators, such as the ICO and the CMA, which have expressed keen interest in different aspects of this area.
Next steps
The Government has committed to continue engaging with stakeholders ahead of publishing its full policy response. Businesses are encouraged to participate in dialogue with the Government and Ofcom, to help shape the development of the new regulation and to ensure it is implemented proportionately, balancing existing rights and responsibilities in respect of privacy and freedom of expression.
Businesses operating comment, chat or sharing functionality should begin considering whether their online offering is likely to put them within scope of the new regulation, and what processes they already have in place for monitoring content on those systems.