
CMA continues consultation on potential harms caused by algorithms

Published on 02 August 2021

The question

What competition and consumer harms is the Competition and Markets Authority (CMA) finding in the operation of algorithms, and how is it seeking to address them?

The key takeaway

Following the conclusion of the consultation, the CMA is now looking into ways in which it can mitigate and remedy the harms it outlined in its earlier research paper on algorithms. 

The background

The CMA has been investigating algorithms and how they can reduce competition and harm consumers for some time, having published a research paper on the topic in January 2021. The paper sought to analyse algorithms and their impact from a competition and consumer perspective. The harms identified include: 

  • personalisation (which is hard for consumers or others to detect and can be used to target vulnerable consumers)
  • exclusion or reduction of competition through algorithms (eg a firm preferencing its own services over those of others), and
  • failure to prevent harm when overseeing platforms’ use of algorithms. 

Alongside the research paper, the CMA launched a consultation calling for evidence, and it published the responses it received in June 2021. 

The development

Most of the 35 respondents agreed with the CMA’s assessment of the potential harms, but noted that there are several nuances to the harms identified; that some harms were missing (including the use of consumer data); and that any future investigation into the harms will need legal analysis, empirical evidence and a proportionate approach. 

The CMA will publish its next steps and potential future intervention soon, once all the evidence has been reviewed. However, as highlighted in the research paper, potential future steps being considered by the CMA include:

  • ordering firms to disclose information about their algorithmic systems to consumers
  • requiring a firm to disclose more detailed information to approved researchers, auditors and regulators, and to cooperate with testing and inspections. Cooperation may involve providing secure access to actual user data, access to documentation and internal communications on the design and maintenance of the algorithmic system, and access to developers and users for interviews
  • imposing ongoing monitoring requirements and requiring firms to submit compliance reports, providing ongoing and continuous reporting data or API access to key systems to auditors and regulators
  • requiring firms to conduct and publish algorithmic risk assessments of prospective algorithmic systems and changes, and/or impact evaluations of their existing systems, and 
  • ordering firms to make certain changes to the design and operation of key algorithmic systems and requiring them to appoint a monitoring trustee to ensure compliance and that the necessary changes are made.

The CMA has also floated the possibility of further investigations before any of the above steps are taken, so that it can better understand how algorithms are used in various markets and what intervention might therefore be appropriate in each context. 

Why is this important?

Algorithms are near ubiquitous in today’s technologies and services, and for many businesses they are integral to making their services as valuable as they are (Google Search being an obvious example). The CMA’s consultation signals a clear intent to regulate the algorithm space and to require greater transparency in how algorithms work, which raises challenges for those who want to keep them proprietary and confidential. It is clearly in the interests of all affected parties to follow developments in this space and to feed into any future investigations, so that their position is better understood and any regulation is shaped in a way that protects consumers while still allowing algorithms to be used in the most useful way possible. 

Any practical tips?

  • If your business runs on or utilises algorithms, follow the CMA’s investigations closely
  • Consider engaging in early dialogue with the CMA to help ensure future compliance and to limit any potential exposure of proprietary technologies and/or confidential information, and
  • Consider if any immediate steps need to be taken with the design and operation of your algorithmic systems in order to get ahead of the regulatory requirements (and likely investigations) which will inevitably follow in this space.