
Online Harms White Paper: consultation response

Published on 02 June 2020

What was the Government's initial response to the consultation on the Online Harms White Paper (White Paper)?

The key takeaway

Ofcom will be the new online harms regulator, chosen for its existing regulatory experience and institutional robustness. Its responsibilities will include ensuring that online companies have the systems and processes in place to fulfil their duty of care and keep users of their platforms safe.

The background

The White Paper, published in April 2019, set out the Government's intention to improve protections for users online by imposing a duty of care on online services to moderate a wide spectrum of harmful content and activity on their services, including child sexual abuse material, terrorist content, hate crimes and harassment. A consultation on the White Paper ran from 8 April 2019 to 1 July 2019 and received over 2,400 responses from technology companies (including the major tech giants), think tanks, rights groups, governmental organisations and individuals. On 21 February 2020, the UK Home Office and the Department for Digital, Culture, Media & Sport published the Government's initial response to that consultation (the Response).

The guidance

Scope of regulation

The White Paper introduced a new duty of care that would apply to any online service that either (1) facilitates the hosting, sharing or discovery of user-generated content; or (2) facilitates online interactions between users. Business-to-business services would fall outside the scope of the regulation. In other words, the duty of care will apply only to companies that facilitate the sharing of user-generated content, for example through comments or video sharing. According to the Government, only a very small proportion of UK businesses (fewer than 5%) will fall within the scope of the duty of care.

Scope of the duty of care 

The White Paper also introduced a new duty of care requiring companies to have appropriate systems and processes in place to respond to concerns over harmful content and improve the safety of their users. These include effective complaints mechanisms and transparent decision-making over actions taken in response to reports of harm. The Government indicated that it will take a different approach to content and activity that is illegal (such as hate crimes) as opposed to harmful but legal content (such as cyberbullying). While the duty of care will require companies to remove illegal content from their services expeditiously, they will have no equivalent obligation to remove legal content. Instead, companies will have to state publicly what content and behaviours are unacceptable on their service (for instance in their terms of service) and have systems in place to enforce these statements consistently and transparently.

Freedom of expression

The Government has explained that it recognises the importance of freedom of expression online. Companies will be required, where relevant, to state what content and behaviour they deem acceptable on their sites and to enforce this consistently. A higher level of protection is required for children, and in-scope services will need to ensure that illegal content is removed expeditiously.

The regulator 

Ofcom will be the independent regulator, given its proven track record of experience, expertise and credibility, and it will be equipped with the powers, resources and expertise it needs to carry out its new role effectively. Ofcom's existing focus on the communications sector means it already has relationships with many of the major players in the online arena. The Response does not define the sanctioning powers that will be available to Ofcom, but suggests that these may include the power to issue fines, impose liability on senior managers, require companies to improve their systems and, in certain circumstances, impose measures such as ISP blocking.

Age verification and transparency requirements

In-scope service providers will need to implement appropriate age verification technologies to prevent children from being exposed to inappropriate content. They will also need to adopt certain transparency measures, depending on the type of service and the risk factors involved. For example, the regulator will be able to require companies to submit annual reports setting out the types of harmful content found on their services and the effectiveness of their enforcement procedures.

Why is this important?

Companies within scope will need to put appropriate processes and mechanisms in place if they are not there already. Terms and conditions will also need to be amended to comply with the duty of care, and codes of practice will need to be clear and accessible to all (including children). Ensuring compliance will be important, as Ofcom is likely to have the power to impose fines, disrupt business activities, block services and impose liability on individual members of senior management of non-compliant organisations.

Any practical tips?

While many platforms are already ramping up their efforts to combat harmful content, the impact of the new duty of care needs to be taken very seriously, not least because the Government stated in its response that "online harms is a key legislative priority". To underline this, the Government also said that it will begin working with law enforcement and industry bodies on interim codes of practice to tackle terrorism and child exploitation while Ofcom steps into its new role.