Last week, Facebook published its Community Standards Enforcement Report, showing high rates of automatic detection for sexual, violent and terrorist content and detailing the extent to which Facebook has taken action on that and certain other types of content. Whilst reports like this show that social media companies have started to take calls for content moderation seriously, they may fail to stem the tide of government regulation.
The UK government is currently consulting on a new regulatory framework for companies that allow users to share or discover user-generated content or interact with each other online.
The framework will include the imposition of a new statutory duty of care, obliging such companies to take action against illegal or otherwise unacceptable content. Compliance will be overseen by an independent regulator which will be able to request annual transparency reports from companies within the scope of the regime.
The consultation seeks views on the types of complaints procedures that companies should make available to their users, the differentiation between public and private content, and the support that should be given to start-ups and small companies to enable them to comply with the new regulatory framework.
Regarding enforcement action, the consultation requests views on whether the independent regulator should be empowered to disrupt business activities, undertake ISP blocking, implement a regime for senior management liability, and/or require a company based outside the UK/EEA to appoint a nominated representative in the UK/EEA.
The consultation is open for responses until 1 July 2019.