Ofcom's 'Roadmap to Regulation' underway with its consultation on illegal harms duties under the Online Safety Act

12 January 2024. Published by Jessica Kingsbury, Associate

In November, Ofcom, as the new online safety regulator, published the first of four major consultations under the Online Safety Act ("OSA"), setting out its proposals for how "user-to-user" ("U2U") services (essentially any online website or app that allows users to interact with each other) and online search services (i.e. Google, Bing and similar) should approach their illegal content duties under the new legislation.  The consultation provides guidance in a number of areas including governance, content moderation, reporting and complaints mechanisms, terms of service, supporting child users, and user empowerment.

The consultation is the first step in the process which will culminate in Ofcom publishing a final code of practice for illegal harms duties, likely by the end of 2024 (the code must ultimately be approved by Parliament before coming into force).  

The significance of the codes of practice in the context of duties arising under the legislation is addressed in s.49 of the OSA, which provides that a service provider "is to be treated as complying with a relevant duty if the provider takes or uses the measures described in a code of practice which are recommended for the purpose of compliance with the duty in question".  That said, adopting a different approach to that recommended by Ofcom will not necessarily mean that a provider fails in its duties, provided it can demonstrate compliance through other means.

Recap - What are the illegal content duties?

All in-scope services must comply with the OSA's illegal content duties which include:

  • To take proportionate measures to prevent individuals encountering "priority" illegal content and to mitigate and manage the risk of harm posed by illegal content on the service (see below for the difference between "priority" illegal content and illegal content);
  • To minimise the length of time for which any "priority" illegal content is present on the service and to swiftly take down any illegal content the provider is notified of;
  • To carry out, and keep up to date, an illegal content risk assessment identifying (among other things) the risk of users encountering illegal content on the service, and the level of risk of harm presented by illegal content on the service; and
  • To specify in its terms of service the steps taken to protect users from illegal content and to apply these terms consistently, including specifying any proactive technology used by a service to tackle illegal content and ensuring that these provisions are clear and accessible to users.

The duties to remove, mitigate and manage harm posed by illegal content apply across all areas of a service – including requiring service providers to consider the way their service is designed, operated, and used.

What is "illegal content" under the OSA?

"Illegal content" is defined in the OSA as "content that amounts to a relevant offence".  Notably a "relevant offence" only includes criminal offences under existing law – and these are sub-categorised as "priority offences", "non-priority offences" and "inchoate offences".  Illegal content does not include anything posted online that could result in civil liability (e.g., defamatory or privacy-infringing content).

Priority offences 

"Priority offences" are broadly the most serious offences and are identified in Schedules 5, 6 and 7 of the OSA.  There are over 130 identified in the legislation, including child exploitation and abuse ("CSEA") offences, fraud and terrorism offences, and harassment.  These offences either already exist offline or have been created by the new legislation, and the OSA recognises they can be committed online to the extent (a) the content consists of words, images, speech or sound which amounts to a priority offence, or (b) the possession, viewing or accessing of the content, or its publication or dissemination, amounts to a priority offence.  Content comprising a "priority offence" amounts to "priority" illegal content, in respect of which the most burdensome duties apply (see "Recap" above)

"Non-priority" offence

The Act recognises that there could be other "non-priority offences" committed online under existing criminal statute.  In essence, an offence is a "non-priority offence" amounting to "illegal" content under the Act provided the victim or intended victim of the offence is an individual and the offence is created by statute.  There are some limited exceptions (e.g., offences concerning the infringement of IP rights, the safety or quality of goods, and some consumer protection offences).   

Inchoate offences

Finally, the OSA recognises that illegal content duties extend to "inchoate offences" i.e., where the content consists of someone assisting, encouraging, attempting, or conspiring to commit a priority or non-priority offence, even if they themselves do not commit the offence (for example, content which encourages terrorism). 

Section 192 of the OSA provides that service providers should judge content to be illegal where there are "reasonable grounds to infer" that the content amounts to a relevant offence, based on "relevant information that is reasonably available".  The threshold for making this reasonable inference is therefore lower than the standard applied in the criminal justice system.  In order to make these assessments, the consultation suggests that service providers should follow the process set out in Ofcom's proposed Illegal Content Judgements Guidance ("ICJG").  Alternatively, Ofcom proposes that service providers could draft their terms and conditions so as to prohibit all content which would be illegal in the UK and make content moderation decisions in accordance with these prohibitions.

Illegal content duties under the OSA – Ofcom's perspective

The illegal content duties are categorised by Ofcom as:  

  1. The 'risk assessment duties', which require service providers to assess the risk of harm arising from illegal content on their service and, for U2U services, from the use of the service for illegal activity.  
  2. The 'illegal content safety duties' which require service providers to take proportionate steps to manage and mitigate those potential harms. 

Risk assessment duties

As above, both U2U services and search services must prepare a risk assessment setting out the risk of illegal content being present or disseminated on the service.  In respect of search services, this means the risk of illegal content being encountered via search results.  Additional obligations are placed on U2U services to assess the risk of their service being used to commit a priority offence or in a way which facilitates such commission.

Unlike for the priority offences, service providers are not required to assess each possible kind of non-priority illegal content separately, and U2U services do not have to assess the risk that their service will be used to commit or facilitate non-priority offences.  However, services will still be required to assess the risk of harm from relevant non-priority offences appearing on the service, meaning that if there is reason to believe other (non-priority) types of illegal harm are likely to occur, these should be considered in the risk assessment.

As part of its consultation (and as required under s.98 of the OSA) Ofcom has also undertaken its own sector-wide risk assessment to identify and assess the risk of harm posed to individuals in the UK by U2U and search services.  To carry out that assessment, Ofcom considered the five characteristics relevant to identifying risks of harm specified in the OSA (a service's functionalities, user base, business model, governance, and systems and processes).  Ofcom has identified a further three characteristics as potentially giving rise to risk: service type, the existence of recommender systems, and commercial profiles.  Ofcom will publish Risk Profiles which consider these characteristics and their bearing on the risk of certain kinds of illegal harm.  Services are required to evaluate similar characteristics specified in the OSA and take account of Ofcom's Risk Profiles when carrying out their own risk assessments.

Illegal content safety duties

U2U services

In respect of both priority and non-priority offences, U2U services must: 

  • Have proportionate systems and processes to swiftly take down content amounting to an offence when the service becomes aware of it; and 
  • Use proportionate measures relating to the design or operation of the service to effectively mitigate and manage the risks of harm identified in their risk assessment. 

In respect of priority offences only, U2U services must also use proportionate measures relating to the design or operation of the service to:

  • Prevent individuals from encountering such content and minimise the length of time it is present on the service; and
  • Effectively mitigate and manage the risk of a service being used for the commission or facilitation of an offence.

In addition to these duties, U2U services must also include provisions in their terms of service specifying how individuals are to be protected from illegal content, and they must apply these provisions consistently.

Search services

The safety duties in relation to priority and non-priority illegal content with which search services must comply are broadly the same. They must:

  • Use proportionate systems and processes to effectively manage the risks of harm to individuals from such content; and
  • Operate the service using proportionate systems and processes to minimise the risk of individuals encountering such content.

Search services must also "include provisions in a publicly available statement specifying how individuals are to be protected from search content that is illegal content" and must apply these provisions consistently.

Who must comply?

Ofcom has said that "the number of online services subject to regulation could total more than 100,000 and range from some of the largest tech companies in the world to very small services".  The services in scope include social media platforms, search engines, dating apps, messaging services, gaming sites, and adult services, to name a few.  The OSA is international in reach, covering all services that have links with the UK, i.e., if they have a significant number of UK users, if the UK is one of their target markets, or if there are reasonable grounds to believe that the service could cause significant harm to UK users.

The measures recommended by Ofcom are not blanket proposals for every in-scope service.  Rather, the measures recommended for a particular service will depend on the size of the service and the level of risk the service presents.  Ofcom considers "large services" to be those with an average user base greater than 7 million per month in the UK.  Ofcom intends to assess risk level by reference to three broad categories: low risk, specific risk, and multi-risk.  Multi-risk indicates that a service faces a number of significant risks for illegal harms (i.e., is medium-to-high risk for at least two kinds of illegal harms), and different harm-specific measures are recommended for the different risks a service identifies.  
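
To make the categorisation concrete, the sketch below shows how these thresholds might be expressed in code.  It is a minimal illustration in Python: the function names and the "low"/"medium"/"high" labels are hypothetical structures of our own, not Ofcom's, and only the 7 million user threshold and the "at least two kinds of harm" test come from the consultation.

```python
def is_large(avg_monthly_uk_users: int) -> bool:
    """'Large' under Ofcom's proposal: over 7 million average monthly UK users."""
    return avg_monthly_uk_users > 7_000_000


def risk_category(assessed_risks: dict[str, str]) -> str:
    """Categorise a service as low risk, specific risk, or multi-risk.

    `assessed_risks` maps each kind of illegal harm (e.g. "grooming",
    "fraud") to an assessed level: "low", "medium" or "high".  A service
    that is medium-to-high risk for two or more kinds of harm is multi-risk.
    """
    significant = [h for h, level in assessed_risks.items() if level in ("medium", "high")]
    if len(significant) >= 2:
        return "multi-risk"
    if len(significant) == 1:
        return "specific risk"
    return "low risk"


# Example: 8m UK users, medium risk for grooming and high risk for fraud
print(is_large(8_000_000))                                     # True
print(risk_category({"grooming": "medium", "fraud": "high"}))  # multi-risk
```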

Ofcom's recommendations

Ofcom has proposed a number of measures which it recommends services adopt in order to ensure compliance.  Ofcom's "consultation at a glance" summarises these proposals with reference to the size and risk level of the service.  The proposals are extremely broad in scope, so we focus here on a small number of key recommendations which are likely to be of concern to in-scope services:

Governance and accountability

Ofcom proposes that all in-scope services (regardless of their size or risk level) should name a person who is to be accountable to the most senior governance body of the business for compliance with illegal content, reporting and complaints duties.

Ofcom additionally proposes that in-scope multi-risk and large services should:

  • Provide written statements of responsibilities for senior members of staff who take decisions relating to the management of online safety risks;
  • Track evidence of new kinds of illegal content on their services, including any unusual increases in particular kinds of illegal content, in order to report this evidence through the relevant governance channels; and
  • Implement a code of conduct for employees and provide training on the service's approach to compliance for relevant individuals.

For larger services, Ofcom has suggested that the most senior governance body should carry out an annual review of the service's risk management activities, including assessing how emerging risks are being managed, as well as implementing an internal monitoring and assurance function.

What is clear from these proposals is that Ofcom will be placing the onus on senior management to ensure robust governance processes are implemented and maintained.  The regulator acknowledges the significant cost that may be involved in implementing some of the suggested measures, but says that many recommendations are targeted only at the largest or highest-risk services and suggests this approach is proportionate to the scale of each service and the level of risk posed.  It nevertheless envisages that all services, regardless of size or risk level, will be required to name a person, presumably a senior manager, for accountability purposes.

Content moderation

Where services become aware of illegal content, U2U services must have systems and processes in place to swiftly take it down, and search services must have systems and processes in place to deindex and downrank it.  To comply with these duties, Ofcom recommends that all large and multi-risk services set and record internal content policies, measure their performance against recorded targets, adequately resource their moderation functions and ensure employees involved in content moderation receive appropriate training.

Again, Ofcom says its draft codes take a proportionate approach to content moderation, based on the type of content present on the service and the risks posed by it.  For example, Ofcom recommends that the largest and highest-risk services should use automated technology to detect and remove the most harmful content (including CSEA material).  It suggests a technique known as "hash matching" to detect image-based CSEA material, and automated URL detection to detect URLs which have been previously identified as hosting CSEA material or which include a domain identified as dedicated to such material.  To tackle fraud, Ofcom says large and medium-to-high risk U2U services should implement standard keyword detection technology to identify content that is likely to amount to a fraud offence (for example, content which offers to supply individuals' stolen personal or financial credentials).
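
By way of illustration only, the sketch below shows the general shape of these two techniques in Python.  It uses exact cryptographic hashes and a hand-rolled keyword list purely for simplicity; real deployments use perceptual hash matching (which tolerates resizing and re-encoding) against hash lists supplied by expert bodies, and the hash value and keywords here are invented placeholders, not real data.

```python
import hashlib

# Placeholder hash list; in practice this would be a perceptual-hash list
# sourced from an expert third party and kept up to date.
KNOWN_HARMFUL_HASHES: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # invented
}

# Invented keywords for illustration; a real fraud lexicon would be
# curated and maintained by specialists.
FRAUD_KEYWORDS: tuple[str, ...] = ("stolen card details", "bank login for sale")


def matches_known_hash(image_bytes: bytes) -> bool:
    """Hash an uploaded image and check it against the known-harmful list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HARMFUL_HASHES


def likely_fraud(post_text: str) -> bool:
    """Flag a post for human review if it contains fraud-associated keywords."""
    lowered = post_text.lower()
    return any(keyword in lowered for keyword in FRAUD_KEYWORDS)
```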

For all general search services, Ofcom recommends deindexing URLs which have been identified as hosting CSEA material or as being part of a website entirely or predominantly dedicated to CSEA material.  To do this, Ofcom says services should source an appropriate list of CSEA material URLs from third parties with expertise in this area, and such lists should be regularly monitored and updated.
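
A minimal sketch of how such a deindexing filter might operate is set out below, assuming the service has already sourced a third-party list of blocked URLs and dedicated domains.  The list contents and refresh mechanism are omitted, and all values shown are invented.

```python
from urllib.parse import urlsplit

# Invented placeholder entries; a real list would come from an expert
# third party and be regularly monitored and updated.
BLOCKED_URLS: set[str] = {"https://example.invalid/page"}
BLOCKED_DOMAINS: set[str] = {"dedicated-abuse.invalid"}


def is_deindexed(url: str) -> bool:
    """True if the URL, or its whole domain, appears on the blocklist."""
    host = urlsplit(url).hostname or ""
    return url in BLOCKED_URLS or host in BLOCKED_DOMAINS


def filter_search_results(urls: list[str]) -> list[str]:
    """Drop deindexed URLs before the results page is served."""
    return [u for u in urls if not is_deindexed(u)]
```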

Default settings for children (U2U services)

Ofcom recommends numerous measures to prevent under 18s encountering illegal content on U2U services, with a specific focus on grooming for the purpose of sexual abuse. 

Ofcom does not consider that self-declaration is an adequate form of age assurance and says that, while this should continue to be used for the time being, it intends to develop proposals for the deployment of specific age assurance technology on U2U services which will require more stringent standards of age verification.

The default settings recommended by Ofcom for U2U services include the following (a sketch of how these defaults might be expressed in code follows the list):

  • U18s should not appear in network expansion prompts, and such prompts should not be presented to U18s;
  • U18s should not appear in other users' connection lists, and a U18's own connection list should not be visible;
  • It should not be possible for U18s to be sent direct messages by users outside their connection list or, where the service does not have formal connection features, there should be mechanisms to stop U18s receiving unsolicited messages; and
  • The location information for U18s should not be visible to anyone.
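
Expressed as configuration, these defaults might look something like the following Python sketch.  The field names are hypothetical; the point is simply that restrictive values apply by default wherever the account holder is under 18.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Under18Defaults:
    """Hypothetical default settings for accounts held by under-18s."""
    appear_in_network_expansion_prompts: bool = False  # not suggested to other users
    receive_network_expansion_prompts: bool = False    # no such prompts shown to the child
    visible_in_others_connection_lists: bool = False   # hidden from other users' lists
    own_connection_list_visible: bool = False          # child's own list hidden
    accept_dms_from_non_connections: bool = False      # unsolicited messages blocked
    location_visible_to_others: bool = False           # location hidden from everyone


def default_settings(age: int) -> Under18Defaults | None:
    """Apply restrictive defaults to under-18 accounts; adult accounts
    fall through to the service's ordinary defaults (not shown)."""
    return Under18Defaults() if age < 18 else None
```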

Ofcom says these measures are intended to tackle strategies often deployed by perpetrators to groom minors, including sending scattergun "friend" requests to large volumes of children and sending unsolicited direct messages to children they are not connected with.  Again, though Ofcom acknowledges that there will be costs involved in implementing these measures (to the extent they are not already in place), it considers it proportionate for services which are high risk for grooming to incur these costs irrespective of the size of the service.  This suggests that Ofcom will likely take a robust approach in response to any apparent failure to implement appropriate measures to protect children from illegal content online.

How to prepare

The draft codes of practice published by Ofcom are lengthy and detailed.  For the first time, they provide an in-depth insight into the measures Ofcom expects in-scope services to implement in order to ensure compliance with the illegal content duties under the new legislation.  Companies should digest these thoroughly, including (a) considering whether they are likely to be considered a large and/or medium-to-high risk service (and therefore will be expected to comply with more robust requirements), (b) considering the draft risk assessment guidance, and (c) considering the draft Illegal Content Judgements Guidance, the final versions of which will have a significant impact on the processes and systems in-scope services will be expected to have in place in future.  

Services can also play a role in shaping Ofcom's approach and its final guidance through responding to the consultation.  Responses to the consultation must be submitted by 5pm on Friday 23 February 2024.  

Ofcom aims to finalise the illegal harms code of practice by Q4 2024 before submitting it to the Secretary of State for approval.  At that point, services will be required to complete their first illegal harms risk assessment.  Services must comply with the illegal content safety duties from the point the final code is approved by Parliament in order to avoid potential enforcement action by Ofcom.

As such, in-scope services should also start preparing to carry out their first illegal harms risk assessment, as well as reviewing their complaints procedures, terms of service, policies, and content moderation measures.  In particular, consideration should be given to the various proposals and recommendations made by Ofcom to assess the extent to which these have already been implemented, and if not, whether such measures could be introduced.  If an in-scope service considers Ofcom's recommendations cannot or should not be introduced, they should consider and record their justification for this.  Particular focus should be given to identifying and addressing the most significant risks of serious and ongoing harm on the service, in particular to children, given Ofcom's indication that this will be its immediate priority.

Looking ahead

Phase 2 of Ofcom's roadmap to regulation concerns child safety duties and pornography.  The regulator's first consultation under this phase was published in December 2023 and sets out draft guidance for services that host pornographic content.  Further consultations relating to child safety duties will follow in Spring 2024, with draft guidance on protecting women and girls to be published by Spring 2025.

Phase 3 will address the additional duties for categorised services, including publishing transparency reports and deploying user empowerment measures.  Ofcom will publish advice to the Secretary of State regarding categorisation and draft guidance on the approach to transparency reporting in Spring 2024.  The register of categorised services is expected by the end of 2024.  Further codes and guidance, including regarding fraudulent advertising and transparency notices, are expected between mid-2025 and the end of 2025.

If you have any questions about the legislation or consultation, please contact: Rupert.Cowper-Coles@rpc.co.uk and Nadia.Tymkiw@rpc.co.uk.