
House of Lords publishes report on the Government’s draft Online Safety Bill

Published on 25 November 2021

The question

What are the key issues highlighted by the House of Lords in its report on the Government’s draft Online Safety Bill?

The key takeaway

The House of Lords welcomes the changes proposed by the Government’s draft Online Safety Bill (the Bill), but notes that the draft legislation is flawed in a number of respects: in particular, it may result in the over-removal of content, thereby curtailing online users’ freedom of expression.

The background

The Bill was published in May 2021. It would establish a new regulatory framework to tackle harmful online content, including fines and other sanctions for non-compliance.

The House of Lords Communications and Digital Committee has recently published a report on freedom of expression in the digital age (the Report), which includes its thoughts and comments on the Bill.

The development

The Report agrees with the Government’s approach in a number of respects. It supports the proposals in the Bill requiring online platforms to remove illegal content, and it endorses the Government’s intention to protect children and vulnerable adults, although it notes that the Bill’s proposals in this regard do not go far enough. The Report considers that the police should be given additional resources to enforce the law on harassment, death threats, incitement and the spreading of hate, amongst other offences, and that online platforms should contribute to those additional resources.

The Report also highlights a number of perceived flaws in the Bill. In its current form, the Bill requires online platforms to state within their terms of use the types of content that are legal but which they nonetheless consider harmful. This “legal but harmful” content would then be removed at the discretion of the online platform. The Report considers that the Government’s approach to “legal but harmful” content is wrong: it calls for existing laws to be properly enforced and for content that is sufficiently harmful to be criminalised, rather than leaving platform operators to rule on whether particular user-generated content should be removed. The Report cites the recent racial abuse aimed at the England football team as a prime example of behaviour that should be criminalised.

The Report also calls for the Bill to go further by empowering platform users in order to promote civility online. It proposes a duty on platforms (rather than their users) to make responsible design choices, providing users with a neutral means of communicating with one another and allowing them to control what content they are shown through easily accessible settings.

The Report proposes that “legal but objectionable” content (falling short of “legal but harmful”) should be dealt with through platform design (providing a neutral, unadulterated view of content), digital education and competition regulation. This empowerment of users, the Report notes, would better protect freedom of expression.

Why is this important?

The overarching issue for regulators is how to protect vulnerable online users whilst allowing all users to express themselves freely. There is clearly a balance to be struck, but the Report suggests that the focus should be on protecting those most vulnerable and on criminalising certain behaviour so as to act as a deterrent.

Any practical tips?

The House of Lords Committee is wary of online content being over-removed in a bid to keep vulnerable users safe and to reduce harmful online content. Its approach is based on properly enforcing existing legislation and regulating the design and management of online platforms. To better enforce existing laws, the Committee considers that platforms should contribute to the resources of the police.
If the approach recommended by the Committee is adopted, online platforms will need to review the design of their platforms and how information and communications are displayed. Platforms should also be prepared for increased enforcement action by the police in respect of the content they display.