Digital developments in focus

UK Government Scraps Controversial Proposals to Tackle Legal but Harmful Content Online

The UK government has confirmed that it is abandoning controversial provisions in its Online Safety Bill to make tech firms take down legal but harmful material accessed by adults. Protections for children remain unchanged, and other new protections for adults have been introduced.


From the time that the Online Safety Bill was first published, in May 2021, provisions in it requiring tech firms to tackle all legal but harmful content caused controversy (see our earlier blog on this). By March 2022, when the bill was introduced in Parliament, this had been limited to firms only being required to tackle certain defined types of legal but harmful content.

However, even this watered-down version of the measures was criticised for its potential to hamper freedom of speech online. There was concern that the bill would still incentivise platforms to over-censor, especially given the risk of significant penalties (up to 10% of revenues). TechUK, for example, pointed out that people who shared personal stories online about surviving self-harm could face having their content removed.

The latest changes

The changes that have now been announced mean that whilst tech firms will still be required to protect children and remove content that is either illegal or prohibited by their terms of service, they will no longer be required to address specific types of legal but harmful content. However, some of the content that could have been deemed harmful (for example, content relating to self-harm and harassment) is being tackled in another way. Other changes the government has now proposed include criminalising the encouragement of self-harm and the sharing of people's intimate images without their consent. Content falling into these categories will therefore be caught by the requirements to tackle illegal content.

Tech firms will also now have to give adult users more choice about what content they see and engage with. The government has been keen to stress that the changes being made also include the addition of new duties to make platforms more transparent and accountable to their users. This will include changes to force tech firms to publish more information about what risks their platforms pose to children, and Ofcom will have the power to require platforms to publish details of any enforcement action it takes against them. Another amendment is that the criminal offence of controlling or coercive behaviour has been added to the list of priority offences in the bill. This means platforms will have to take proactive steps to deal with such content, such as putting in measures to allow users to manage who can interact with them or their content.

The current Culture Secretary, Michelle Donelan, has said the latest changes mean the bill is now “focused where it needs to be: on protecting children and stamping out illegality online.” Interestingly, Donelan also suggested that the change removes the “threat that tech firms or future governments could use the laws as a licence to censor legitimate views.” This echoes a sentiment expressed in a letter sent in July this year from nine senior Conservatives to then Culture Secretary Nadine Dorries, demanding that the provisions aimed at regulating legal but harmful content be removed from the bill. The letter alluded to the risk that a future government, including one run by the Conservative Party’s “political opponents”, the Labour Party, could use the bill to restrict free speech.

What protections remain in the bill?

The Department for Digital, Culture, Media and Sport has suggested that the changes mean users will benefit from a ‘triple shield’ of protection online, as social media firms will be required to:

  • remove illegal content,
  • take down material in breach of their own terms of service, and
  • provide adults with greater choice over the content they see and engage with.

Responses to the changes and next steps

Whilst those who criticised the inclusion of measures to tackle lawful but harmful content from the outset will be glad to see them scrapped, the move has been criticised by the Labour Party and by campaigners and charities. Shadow Culture Secretary Lucy Powell has called the changes a “major weakening” of the bill. The head of the Samaritans, Julie Bentley, has also voiced concerns, stating that “increasing the controls that people have is no replacement for holding sites to account through the law”.

The bill returned to Parliament on 5 December and the House agreed to recommit certain Clauses and Schedules to a Public Bill Committee. The Public Bill Committee is expected to sit on Tuesday 13 and Thursday 15 December, and so any parties interested in submitting written evidence on the relevant clauses and schedules are advised to do so before 13 December.

The bill is not expected to reach the House of Lords until January 2023. This does not leave the government with much time: because the bill was carried over from a previous parliamentary session, it is subject to a 17 March 2023 deadline. A limited extension may be possible, but even so, the government is under time pressure to get the bill through Parliament and into law.

Today’s announcement refocuses the Online Safety Bill on its original aims: the pressing need to protect children and tackle criminal activity online while preserving free speech, ensuring tech firms are accountable to their users, and empowering adults to make more informed choices about the platforms they use.


onlinesafety, onlineharms, onlinesafetybill, regulating digital