New ICO guidance on content moderation and data compliance

Published on 17 April 2024

The question

How can organisations using content moderation technologies and processes best comply with data protection laws?

The key takeaway

The Information Commissioner’s Office (ICO) has published new guidance on how organisations can comply with data protection laws, specifically the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018, when it comes to deploying or providing content moderation services.

The background

Content moderation is defined in the guidance as:

  • the analysis of user-generated content to assess whether it meets certain standards; and
  • any action a service takes as a result of this analysis, for example removing the content or banning a user from accessing the service.

The ICO’s guidance is directed at individuals and organisations using or considering content moderation, as well as those offering related products or services. It is relevant to both data controllers and processors, and is particularly pertinent to compliance with the Online Safety Act 2023 (OSA). The primary audience for the guidance comprises trust and safety professionals, together with those in data protection compliance roles such as data protection officers, general counsel, privacy lawyers and risk managers.

The development

The focus of the guidance is specifically on moderating user-generated content within user-to-user services, aligning with the definitions provided by the OSA. Section 3(1) of the OSA defines a user-to-user service as:

an internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.

Section 55(3) of the OSA defines user-generated content as content that is:

  • “(i) generated directly on the service by a user of the service, or (ii) uploaded to or shared on the service by a user of the service; and
  • that may be encountered by another user, or other users, of the service by means of the service.”

The guidance helpfully sets out what organisations “must”, “should” and “could” do to comply with data protection laws when moderating user-generated content within user-to-user services. Each category is explained below, together with examples:

  • “must” specifies mandatory requirements that organisations are obliged to implement to comply with the legislation. Examples of what organisations must do include: (i) carrying out a data protection impact assessment before processing personal data where the processing is likely to result in a high risk to individuals; (ii) taking particular care where children’s data is being processed; (iii) identifying a lawful basis before using personal information in a content moderation system; (iv) informing people about how their information may be processed; and (v) putting in place technical and organisational measures that ensure a level of security appropriate to the risks of using personal information;
  • “should” refers to recommended best practice which organisations should strive to implement to enhance their content moderation and data protection efforts. If an organisation takes a different approach from the one the ICO recommends, it may need to demonstrate that its approach nonetheless complies with the law. Examples of what organisations should do include: (i) having an easily accessible appeals process for content moderation decisions; (ii) informing users about the types of content prohibited on the service, why this is the case and how such content may be actioned; and (iii) providing data protection training for moderators so that their decisions reflect current standards and practices;
  • “could” refers to optional measures which organisations could adopt. While not mandatory, the ICO strongly encourages their implementation to enhance the effectiveness of content moderation and data protection practices across business operations. Examples of what organisations could do include: (i) incorporating information about automated decision-making in content moderation into their privacy policy or terms of service; (ii) periodically auditing moderator decisions to check that personal information is used consistently and fairly; and (iii) implementing access controls so that human moderators can only view the personal information relevant to the moderation decision in hand (see the illustrative sketch after this list).
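
By way of illustration only (the ICO guidance itself contains no code), a minimal sketch of the access-control idea in point (iii) above might look like the following. All names and fields are hypothetical, and in practice such controls would usually be enforced at the data-access layer rather than in application code:

  from dataclasses import dataclass

  # Hypothetical sketch: field-level access control so that a human moderator
  # sees only the personal information needed to assess a reported post.

  @dataclass
  class Author:
      user_id: str
      email: str          # personal data not needed for the content decision
      date_of_birth: str  # personal data not needed for the content decision

  @dataclass
  class ModerationCase:
      case_id: str
      reported_content: str
      report_reason: str
      author: Author

  # Only these fields are exposed in the human moderation queue.
  MODERATOR_VISIBLE_FIELDS = {"case_id", "reported_content", "report_reason"}

  def moderator_view(case: ModerationCase) -> dict:
      """Return a redacted view of the case for the moderation queue."""
      full_record = {
          "case_id": case.case_id,
          "reported_content": case.reported_content,
          "report_reason": case.report_reason,
          "author_email": case.author.email,                   # filtered out below
          "author_date_of_birth": case.author.date_of_birth,   # filtered out below
      }
      return {k: v for k, v in full_record.items() if k in MODERATOR_VISIBLE_FIELDS}

The design point is data minimisation: the moderation interface exposes only the fields needed to make the decision, keeping other personal information out of the moderator’s view by default.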

Why is this important?

This guidance reflects the ICO’s ongoing commitment to facilitating organisations’ compliance in the context of online safety technologies. It also helps to establish and support regulatory coherence between data protection and online safety. It is important that organisations carefully consider the guidance when engaging with content moderation technologies and processes to ensure effective compliance with applicable data protection laws within a fast-changing regulatory landscape.

Any practical tips?

Although the ICO notes that the guidance is not a comprehensive guide to compliance, it serves as a useful starting point for organisations using or considering content moderation, as well as for those offering related products or services. Clearly, organisations should be especially careful to comply with the mandatory “must” requirements identified in the ICO’s guidance. Equally, following best practice is always sensible on sensitive data-related matters such as online harms, particularly when it comes to automated decision-making, which is attracting significant interest in the fast-developing world of AI.

Spring 2024