
UK ICO and CMA release joint position paper on harmful design in digital markets

Published on 24 October 2023

The question

What are the impacts of the ICO and CMA joint position paper on “Harmful Design in Digital Markets” and what action should companies take in light of its guidance?

The key takeaway

The ICO and CMA have provided clear guidance on what they jointly consider to be harmful design of digital products and services, in particular harmful nudges and sludge, confirmshaming, biased framing, bundled consent and predefined default settings. Empowering user choice and control, and testing and trialling design choices, are now a must for all businesses in the digital sphere, especially those likely to appeal to children.

The background

On 9 August 2023, the UK Information Commissioner’s Office (ICO) and the Competition and Markets Authority (CMA) published their joint position paper on harmful design in online markets. The paper focuses on the ways in which information and choices are presented to users (referred to as Online Choice Architecture or OCA) and the effect this has on users’ choice and control over their personal information. The paper provides five examples of potentially harmful design practices which risk infringing data protection, consumer and competition laws. It also offers guidance on good practices that companies are expected to adopt in the design of their OCA.

This publication follows both the CMA’s 2022 Discussion Paper “OCA: How Digital Design Can Harm Competition and Consumers” and the 2021 Joint Statement by the CMA and ICO “Competition and Data Protection in Digital Markets” and is the latest example of the ICO continuing the implementation of its strategy as set out in the ICO25 plan.

The development

The CMA and ICO highlighted in their 2021 Joint Statement that user choice and control over personal data are fundamental to data protection. OCA plays a major role in shaping users’ decision making online. Poorly designed or misused OCA can undermine user data protection by causing users to share more information than they would otherwise volunteer and depriving them of meaningful control over their personal information.

The paper highlights the five potentially damaging OCA design practices which firms must avoid in order to ensure compliance with data protection, consumer and competition laws:

1. Harmful nudges and sludge

This is where a company makes it easy for users to make inadvertent or ill-considered decisions (“harmful nudge”) or creates the same effect through excessive or unjustified friction which makes it difficult for users to get what they want or do as they wish (“harmful sludge”). For example, using a cookie pop-up which contains an option to accept all non-essential cookies, but which does not contain an equivalent option to reject them. The use of harmful nudges and sludge may infringe both Article 5(1)(a) of the GDPR and Regulation 6 of PECR. As a minimum, users must be able to refuse non-essential cookies with the same ease as they can accept them. Where an “accept all” option is offered, there must be an equivalent “reject all” option, and both options must be presented with equal prominence.
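By way of illustration, the equal-prominence expectation can be modelled in a first-layer cookie banner. This is a minimal sketch under our own assumptions (the types, labels and style names are not taken from the paper): both options are rendered with the same visual style, and a reject option is always present wherever an accept option is.

```typescript
// Hypothetical banner model: each button carries a visual style, and the
// banner is only considered compliant if "Reject all" exists and is rendered
// with the same prominence (style) as "Accept all".
type BannerButton = { label: string; style: "primary" | "secondary" };

function buildCookieBanner(): BannerButton[] {
  // Both choices share one style so neither is visually privileged:
  // refusing non-essential cookies is as easy as accepting them.
  return [
    { label: "Accept all", style: "primary" },
    { label: "Reject all", style: "primary" },
  ];
}

function hasEqualProminence(buttons: BannerButton[]): boolean {
  const accept = buttons.find((b) => b.label === "Accept all");
  const reject = buttons.find((b) => b.label === "Reject all");
  return accept !== undefined && reject !== undefined && accept.style === reject.style;
}
```

A banner that offered only a "primary" accept button and a "secondary" (or absent) reject button would fail this check, mirroring the paper's cookie pop-up example.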

2. Confirmshaming

Applying pressure or shaming users into doing something by making them feel guilty or embarrassed if they do not, eg through the use of language which suggests that there is a good or a bad choice. The example provided is a pop-up which asks users to provide an email address and phone number in exchange for a discount, with a reject button which states “Nahh, I hate savings”. The use of confirmshaming is likely to infringe Article 5(1)(a) of the GDPR.

3. Biased framing

Presenting choices in a way that emphasises either the supposed benefits of a particular option in order to make it more appealing to the user (“positive framing”) or the negative consequences of an option in order to dissuade the user from selecting it (“negative framing”). This can strongly influence users’ decision making and impede users’ ability to assess information independently and accurately. Biased framing may infringe Article 5(1)(a) and Article 7 of the GDPR. Additionally, if the practice is misleading to users, it may breach Regulations 3 and 5 to 7 of the Consumer Protection from Unfair Trading Regulations 2008.

4. Bundled consent

Requesting that users consent to their personal information being used for several separate purposes or processing activities via a single consent option. This can make it harder for users to control what their data is used for. The effect of this is that consent is unlikely to be specific or informed and as such risks infringing Article 5(1)(a) GDPR.
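To make the contrast concrete, unbundled consent can be sketched as one opt-in per processing purpose rather than a single tick-box covering everything. This is an illustrative model only; the purpose names and types are our own assumptions, not taken from the paper.

```typescript
// Hypothetical per-purpose consent record. Each purpose needs its own
// explicit opt-in; an absent or false entry means no consent for that
// purpose, and consent to one purpose never carries over to another.
type Purpose = "analytics" | "personalised_ads" | "email_marketing";

type ConsentRecord = Partial<Record<Purpose, boolean>>;

function hasConsent(record: ConsentRecord, purpose: Purpose): boolean {
  // Only an explicit, separate opt-in counts as consent.
  return record[purpose] === true;
}

// A user who ticked only the analytics box:
const record: ConsentRecord = { analytics: true };
```

Here `hasConsent(record, "analytics")` is true while `hasConsent(record, "email_marketing")` is false, reflecting that consent must be specific to each purpose rather than bundled.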

5. Default settings

Where a predefined choice of settings is provided to users, which they must then actively take steps to amend. Users can be deterred from amending default settings by the difficulty of altering them. The example provided refers to social network post settings which are set to public by default (ie the post is viewable by everyone with an account on the platform), so that the user must take steps to amend the settings to make their content more private. Most users are unlikely to do this, which increases the risk of their personal data being available more widely and used without their knowledge or understanding. Where a company’s settings are intrusive by default, it will be difficult for the company to justify such an approach. This potentially risks infringing Article 5(1)(a) and 5(1)(c) of the GDPR and Regulation 6 of PECR. If users are likely to be children, settings should be set to “high privacy” by default in line with the ICO’s Age Appropriate Design Code (unless the business can demonstrate a compelling reason for a different default setting, taking into account the best interests of the child).
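The privacy-protective alternative is to start every account from the most protective position and let users relax settings deliberately, rather than shipping intrusive defaults they must work to undo. The sketch below is illustrative only; the setting names are our own assumptions, not drawn from the paper or the Code.

```typescript
// Hypothetical settings shape for a social platform account.
interface PrivacySettings {
  postVisibility: "private" | "friends" | "public";
  geolocation: boolean;
  profilingForAds: boolean;
}

// High-privacy baseline: posts are not public, location sharing and ad
// profiling are off until the user actively turns them on. For child users
// the ICO's Age Appropriate Design Code expects "high privacy" by default.
function highPrivacyDefaults(): PrivacySettings {
  return {
    postVisibility: "private",
    geolocation: false,
    profilingForAds: false,
  };
}
```

Under this approach a public post is the result of a deliberate user choice, not the consequence of an unexamined default.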

Why is this important?

The positions outlined in the paper are based on existing guidance and publications by the ICO and CMA and do not supersede or reopen existing legal guidance. The paper emphasises that companies are expected to improve their design of OCA in light of the guidance provided. If companies fail to meet expectations, the ICO makes it clear that it will take formal enforcement action where necessary to protect people’s information and privacy rights, particularly where the risks or harms affect people at risk of vulnerability (including children). Additionally, the CMA is currently investigating both the Emma Group and the Wowcher group in relation to the use of wider OCA practices.

Any practical tips?

Well-designed OCA can help users to make informed choices which are aligned with their goals, preferences and best interests with regard to the use of their personal information. To achieve this, the ICO and CMA expect companies to have regard to the following factors when designing their OCA:

  • users should be placed at the heart of design choices: OCA and default settings should be built around the interests and preferences of users
  • design should empower user choice and control: design should help users make effective and informed choices regarding their personal data and put them in control of how their data is collected and used
  • design choices should be tested and trialled: testing should be carried out to ensure design choices are evidence based
  • compliance with data protection, consumer and competition law: companies should consider whether OCA practices could be unfair to users or anti-competitive.

Where products or services are likely to be accessed by children, companies should also ensure that they adhere to the standards provided in the ICO’s Age Appropriate Design Code and follow the ICO’s Children’s Code Design Guide.

The ICO and CMA welcome further participation and feedback from interested stakeholders. A joint ICO and CMA workshop on good practices for the design of privacy choices online is scheduled to take place in Autumn 2023.
