
ICO: Age Appropriate Design Code for information society services

Published on 04 July 2019

What steps does the Information Commissioner’s Office (ICO) require to ensure adequate protection of children online?

The background

The ICO drafted the Age Appropriate Design Code (the Code) to provide standards and guidance on what is expected of information society services (ISS) that process personal data and are likely to be accessed by children under 18. This is in line with the ICO’s obligations under section 123 of the Data Protection Act 2018, which required the preparation of a code of practice addressing these issues. The Code is due to be finalised by the end of this year and must be approved by Parliament before final publication.

The scope

The Code will apply to ISS, defined as “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services”.

This definition is wide enough to include apps, programs, search engines, social media platforms, online messaging services, online marketplaces, streaming services, online games, news and educational websites, and any websites offering other goods or services to users over the internet. Note that the ICO has confirmed that “remuneration” in the definition covers services funded by advertising, so services provided free of charge to end users are still caught.

Most online service providers are captured by the definition and will need to review how they process personal data to check compliance with the Code, even if their services are not aimed at children. Essentially, the Code applies wherever children under 18 are likely to use the service.

The standards

There are 16 standards outlined in the Code. To be compliant, all of the standards must be implemented. Some brief examples of the standards and practical guidance provided are as follows:

  • age-appropriate application: consideration must be given to the age range of the audience as well as the needs of children at different ages and stages of their development. The standards of the Code will apply to all users unless there is a robust age-verification mechanism in place to distinguish children from adults. Self-declaration of age or age range on its own will not amount to a robust age-verification mechanism
  • profiling: profiling options must be off by default unless, having considered the child’s best interests, a compelling reason can be demonstrated otherwise. Profiling may be permitted where appropriate measures are in place to protect children from any harmful effects, such as privacy settings with separate options for each type of profiling that must be actively switched on by the child (the sketch after this list gives a flavour of such defaults)
  • nudge techniques: nudge techniques cannot be used to lead or encourage children to provide unnecessary personal data, turn off privacy protections, or extend their use of the service. Many different nudge techniques are used by ISS. Under the new standards, the only permitted uses of nudges are those promoting high-privacy options, wellbeing-enhancing behaviours, or parental controls and involvement.
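
To give a flavour of how the “off by default” profiling standard might translate into a service’s settings model, the short TypeScript sketch below shows per-type profiling options that all start switched off and can only be enabled by an explicit, specific choice. The names and structure (ChildPrivacySettings, enableProfiling, the three profiling types) are purely illustrative assumptions for this article, not anything prescribed by the Code or by any particular platform.

```typescript
// Hypothetical sketch only: names and types are illustrative, not from the Code.

type ProfilingKind = "personalisedContent" | "personalisedAds" | "behaviouralAnalytics";

interface ChildPrivacySettings {
  geolocation: boolean;
  profiling: Record<ProfilingKind, boolean>;
}

// Default state for a child account: every profiling option starts switched off.
const defaultChildSettings: ChildPrivacySettings = {
  geolocation: false,
  profiling: {
    personalisedContent: false,
    personalisedAds: false,
    behaviouralAnalytics: false,
  },
};

// A single profiling option is enabled only by an explicit, specific choice;
// nothing here turns on other options or changes defaults as a side effect.
function enableProfiling(
  settings: ChildPrivacySettings,
  kind: ProfilingKind
): ChildPrivacySettings {
  return { ...settings, profiling: { ...settings.profiling, [kind]: true } };
}
```

The design point is simply that each type of profiling has its own switch, the default for every switch is off, and enabling one type does not quietly enable the others.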

Why is this important?

Aside from the clear importance of children’s personal data and privacy, the Code will require major changes by ISS to their website design and operations to ensure compliance. The consequences of regulatory action include assessment notices, warnings and, of course, GDPR-level fines. The ICO is likely to take more severe action where there is harm or potential harm to children than it would for breaches involving other types of personal data.

Any practical tips?

The definition of ISS catches a huge swathe of online businesses, all of which should start conducting internal reviews to assess the impact of the Code. Any failure to comply with the Code will make it extremely difficult to show compliance with the GDPR and the Privacy and Electronic Communications Regulations. When it comes to breaches concerning children in particular, that could prove extremely costly.