
Data dispatch - April 2024

Published on 17 April 2024

Welcome to the fourth edition of Data Dispatch from the Data Advisory team at RPC. Our aim is to provide you on a regular basis with an easy-to-digest summary of key developments in data protection law.

The format makes it easy for you to get a flavour of each item from a short summary, from which you can click "read full article" for more detail.

Please do feel free to forward on the publication to your colleagues or, better still, recommend that they subscribe to receive the publication directly.

If there are any issues on which you'd like more information (or if you have any questions or feedback), please do let us know or get in touch with your usual contact at RPC.

Key developments

Government publishes response to AI White Paper consultation

The Government has published its response to its consultation on the AI White Paper, which sets out the Government's "pro-innovation" approach to AI regulation.

The Government initially published its White Paper entitled: A pro-innovation approach to AI regulation (the White Paper) in March 2023. The White Paper outlines the government's sector-based approach to regulation, where existing regulators are empowered to create sector-specific rules based on five key principles that revolve around the safety, transparency and fairness of AI. The public consultation on the White Paper ran from 29 March to 21 June 2023 and the Government has subsequently responded.

The response is wide-ranging – however the key points are:

  • A package of over £100 million will be invested in new innovations and used to support existing sectoral regulators (including the ICO) to build their technical expertise.
  • The government has asked a number of regulators (including the ICO) to publish an update setting out their strategic approach to regulation by 30 April 2024.
  • New guidance has been published to support regulators to implement the five principles effectively.
  • The Digital Regulation Cooperation Forum (comprising Ofcom, the CMA, the FCA and the ICO) will manage the AI and Digital Hub, a pilot scheme for an advisory service to support innovation.
  • The Government will establish a new central function to bring coherence to the sectoral regulatory regime.
  • Businesses developing highly capable general purpose AI systems will be subject to targeted binding requirements.
  • The Government recognises that legislative action will be required at some point but will "take our time to get this right".

In the coming months, the Government expects to formally establish the activities set out in its response to support regulator capabilities and coordination, including a new steering committee to support regulator coordination.



ICO's consultation series on generative AI and data protection

The Information Commissioner's Office (ICO) has launched a consultation series which aims to gather feedback from stakeholders about how data protection law should be applied to the development and use of generative AI technology in the UK, with the aim of establishing more clarity in this area.

The ICO's consultation series kicked off on 15 January 2024 with the publication of the ICO's first "chapter" on 'the lawful basis for web scraping to train generative AI models'. This was followed by chapters on the application of the purpose limitation and accuracy principles; the consultation on the latter closes on 10 May 2024. All three chapters can be found here.

This first chapter covers the ICO's initial thoughts on the collection and use of training data as part of the generative AI lifecycle and in particular on:

  • legitimate interests as a possible lawful basis for collecting training data via web scraping and for training generative AI models on web-scraped data;
  • how developers may demonstrate that (i) the purposes of their processing of data are legitimate, (ii) the processing of data is necessary for those purposes, and (iii) the rights of any affected data subjects do not override the interests being pursued by the controller; and
  • in relation to (iii) above, assessing the risks to individuals in the use and deployment of AI.

The second chapter focuses on how the UK GDPR principle of purpose limitation should apply to generative AI. In particular, it notes that:

  • the different stages in the AI lifecycle may process different personal data for distinct purposes. It is important for organisations to understand what the purpose of processing is at each stage so that they can assess each such purpose separately and understand their data protection law obligations. Each purpose must be "specific and clear"; general, broad statements of purpose are unlikely to be sufficient; and
  • developers must ensure that any reuse of training data to train a new or different model is compatible with the original purpose of collection and what data subjects would reasonably expect. If not, a new purpose may need to be determined. Where there is no direct relationship between developer and data subject, privacy notices and public messages, as well as technical measures such as anonymisation and other privacy-enhancing measures may serve to reduce the privacy risks to data subjects of such processing.

We will discuss the third chapter, on the accuracy principle, in a future edition of Data Dispatch.

By providing their feedback on the propositions put forward by the ICO, organisations may gain a deeper insight into, and help shape, the direction the ICO will take in its interpretation of UK data protection laws as they apply to emerging uses of generative AI technology. 


Enforcement action

ICO fines HelloFresh £140,000 after spam marketing campaign

The ICO has fined HelloFresh £140,000 for sending millions of spam marketing messages across a seven-month campaign period.

HelloFresh is a meal delivery service which operates on a subscription basis. The marketing emails and texts that were the subject of this investigation were sent on the basis of an opt-in statement which customers had (at some point) agreed to. However, this opt-in statement made no reference to the sending of marketing via text and was also bundled with an age confirmation statement, which was likely to unfairly incentivise customers to agree to the statement. HelloFresh was also sending marketing messages to previous customers for up to 24 months after those individuals had cancelled their subscriptions.

The Information Commissioner's Office (ICO) investigated and found that HelloFresh had contravened regulation 22 of the Privacy and Electronic Communications Regulations (PECR) by sending a total of 80,893,013 direct marketing messages to subscribers. HelloFresh had failed to obtain specific and informed consent from its customers to send such marketing messages. The ICO also found that it was not within the reasonable expectations of former customers that they would receive marketing messages up to 24 months after ending their subscriptions. The ICO considered HelloFresh's contravention of PECR to be serious and issued a £140,000 fine.

This serves as an important reminder for companies to obtain specific and informed consent from a customer before sending marketing messages. This includes informing the customer of what kind of marketing they are consenting to receive (whether via text and/or email) and for how long they are consenting to receive marketing material (even if this extends beyond the end of their subscription). It is also important not to 'bundle' consent with other statements, such as the age confirmation statement in this case.


ICO action taken regarding HelloFresh

HelloFresh Monetary penalty notice


Tagadamedia fined €75,000 by the French Data Protection Authority for misleading competition forms and failure to have a compliant record of processing activities

The French Commission Nationale de l'Informatique et des Libertés (CNIL) has fined Tagadamedia €75,000 (which equated to approximately 1.6% of Tagadamedia's turnover) after it found that Tagadamedia had breached data protection law by not obtaining the valid consent of web users to process their personal data and not having a compliant record of processing activities.

Tagadamedia is a company which collects the personal data of web users (who it refers to as 'prospects' and who numbered approximately 6 million) when they participate in competitions, surveys and product testing. This data is then sold on to advertising partners.

As part of its proactive investigation into commercial prospecting, which the CNIL undertook over the course of 2022, it found that Tagadamedia had breached Article 6 GDPR by processing the personal data of prospects without a lawful basis. This was because the consent forms which Tagadamedia used to collect personal data did not allow the company to obtain freely-given, informed and unambiguous consent from prospects.

The CNIL found that the online consent forms used by Tagadamedia did not obtain valid consent because of the manner in which they were presented. For instance, the CNIL found that the prominence given to the button by which web users consented to the sharing of their personal data for advertising purposes, relative to the button to refuse such use, meant that this was not a valid consent request. Additionally, the wording on the button ("JE VALIDE", i.e. "I validate") was unclear and potentially misleading, and the use of reduced text size when explaining the effect of clicking the button encouraged web users not to read those sections and to agree to the transmission of their personal data to Tagadamedia's commercial prospecting partners. The sharing of personal data was also held to be unfair, in breach of Article 5(1)(a) GDPR.

The CNIL also found that Tagadamedia had violated the requirement under Article 30 GDPR to maintain a record of processing activities (ROPA). This was because Tagadamedia's joint ROPA with a partner company did not identify which of Tagadamedia and the other company was the controller.

Tagadamedia was fined €75,000 for breaches of GDPR. Additionally, the CNIL instructed Tagadamedia to implement a GDPR-compliant consent form within a month or risk a daily penalty of €1,000 for continued non-compliance.

The CNIL's decision highlights the continued importance of empowering user choice and control over how their personal data will be processed. This issue was also recently addressed in the UK by the ICO's and CMA's joint position paper on 'Harmful Design in Digital Markets' which explains how companies should design digital products and services to enable such choice and control.

For more information on ICO's and CMA's joint position paper on 'Harmful Design in Digital Markets', please see our article in the Autumn 2023 edition of Snapshots here.



CNIL fines Amazon France Logistique €32 million for employee surveillance violations – by Benjamin JACOB and Emma JOLLY (PDGB, France)

Amazon's French subsidiary, Amazon France Logistique, which specialises in warehouse logistics in France, was fined €32 million on 23 January 2024 by the French data protection authority (CNIL). The fine was imposed for various breaches of the General Data Protection Regulation (GDPR) in relation to the company's monitoring of its employees.

Exercising its investigatory powers, the CNIL took action following various press articles and numerous employee complaints highlighting the company's practices within its warehouses.

Each employee is equipped with a device allowing real-time documentation of every task, including items handled, picking locations and storage positions. This documentation generates personal data enabling the company to conduct close and intrusive surveillance of its employees by quantifying their quality of work, productivity and periods of inactivity, in particular by calculating their break times.

The CNIL found that the continuous monitoring of employees' productivity exceeded the genuine needs of operational management and excessively infringed upon their rights, notably their rights to privacy and to working conditions that respect their health, dignity and safety.

The CNIL also identified transparency and security breaches linked to the CCTV processing. Employees and external visitors were not properly informed of the CCTV systems, and several security shortcomings were identified that made it difficult to trace access to images and to identify each person who had carried out actions on the software.

The disproportionate nature of the monitoring and employee evaluation measures was considered by the CNIL to violate both national and European data protection law, and to infringe employees' right to privacy. In this context, the CNIL imposed a €32 million fine on Amazon France Logistique and ordered publication of its decision.

This penalty is not the first one against the Amazon Group, as the CNIL had previously fined Amazon Europe Core €35 million in 2020 for non-compliance with cookie legislation.


Need to know

EDPB adopts report on the role of the DPO

The EDPB has adopted a report on the functioning of Data Protection Officers (DPOs) following an investigation into compliance with the GDPR provisions relating to DPOs.

The European Data Protection Board has adopted a report which includes recommendations to support DPOs in their role and assist with compliance with Articles 37 to 39 GDPR. The report follows a year-long investigation conducted by 25 DPAs across the EEA as part of the Coordinated Enforcement Framework (CEF).

DPOs reported that they were:

  • not always adequately designated, particularly in instances where there is an obligation to appoint a DPO under the GDPR, bearing in mind the need to ensure no conflict of interest with their day-to-day role;
  • provided with insufficient resources and training, with key concerns being a lack of human resourcing such as the appointment of deputy DPOs and the fact that a majority of DPOs received 24 hours or less of training per year; and
  • not given enough power to handle their tasks, not having been given sufficiently defined duties or been sufficiently integrated into the decision-making process on data protection issues, including a lack of ability to report to top-level management.

To address these concerns the EDPB set out a list of recommendations for supervisory authorities, organisations and DPOs to take into account. For businesses, some of the key recommendations include:

  • promoting the role of the DPO internally, and providing sufficient guidance, resources and training materials. This may include engaging with the DPO to review resource requirements and tracking the outcomes of these discussions;
  • ensuring that the business instils a degree of separation between the organisation's and the DPO's obligations, so as to avoid conflicts of interest;
  • regularly reviewing the DPO's involvement within the organisation, making sure there are clear lines of communication between the DPO and senior management and clearly defining their duties in an engagement letter; and
  • staying abreast of initiatives and guidance released by their respective supervisory authorities, including implementing codes of best practice where available.

Prior to the recent adoption of the report, ten EU supervisory authorities had taken action for non-compliance with DPO requirements. The most significant of these fines was issued by the Berlin DPA against a German e-commerce company for contravention of the conflict of interest provisions in Article 38(6). The company was fined €525,000 because its DPO was also acting as managing director, able to take effective decisions over the data processing carried out and therefore, in effect, able to monitor himself. The company had also been warned about this issue once before, in 2021.

Businesses in the UK should be aware that, whilst under the UK GDPR the role of the DPO is set to be replaced by that of the Senior Responsible Individual, those that must also comply with the EU GDPR and appoint a DPO should take this guidance into account when managing that role.


IAPP article



The CNIL launches a public consultation on a draft guide to transfer impact assessments

On 8 January 2024, the French data protection regulator, Commission Nationale de l'Informatique et des Libertés (CNIL), announced a public consultation to gather feedback on a draft methodology and checklist for carrying out a transfer impact assessment (TIA) (available here).

The CNIL's draft guide is organised into six steps:

  1. Know your transfer – this step requires data exporters to determine certain information about the data transfer and document it in the provided table format so that they can understand the nature of the transfer and its sensitivities.
  2. Identify the transfer tool used – this step involves data exporters documenting the transfer tool to be used and therefore assessing whether they are required to conduct a TIA for the purposes of the particular transfer (i.e. a TIA will be required if relying on an Article 46 GDPR transfer mechanism to make the transfer but not if an adequacy decision or a derogation under Article 49 GDPR applies).
  3. Assess the laws and practices in the destination country and the effectiveness of the transfer tool – this step requires data exporters to assess whether the laws and practices of the third country could adversely impact the appropriate safeguards which are being put in place to protect the personal data being transferred, or which could prevent the data exporter from fulfilling its obligations under data protection law.
  4. Identify and adopt supplementary measures – this step requires data exporters to identify the existing security measures (technical, contractual and organisational) in place, and consider if any supplemental measures are required, to protect the data transferred.
  5. Implement the supplementary measures and take necessary procedural steps – this step asks data exporters to list the steps (in the form of an action plan) which they need to take in order to implement any supplementary measures they have identified at step 4 above.
  6. Review – this step reminds data exporters to document the frequency with which they will re-assess the appropriate safeguards they are relying on to transfer personal data and (if applicable) the supplemental measures which they have put in place.

The public consultation on the CNIL's draft guide closed on 12 February 2024. Although not yet finalised, the guide may provide some assistance to organisations in how to structure and document their TIAs.

Along with the Information Commissioner's Office (ICO)'s recent update to its guidance in relation to transfers to the US (covered in the January edition of Data Dispatch here), the CNIL draft guide indicates a recognition by regulators of the practical difficulties that organisations face when carrying out these risk analysis exercises.



Reminder - DPDI bill update

The Data Protection and Digital Information Bill has been given until 12 December 2024 to complete its passage through the parliamentary process, although it is expected to be agreed sooner considering the general election expected later this year. The Bill is currently undergoing further scrutiny in the House of Lords Committee Stage, where members are discussing potential amendments.



Reminder – UK SCCs deadline

From 21 March 2024, the old UK SCCs (the old EU SCCs adapted to the UK context) ceased to be compliant with the data transfer rules in the UK. Contracts involving the transfer of personal data outside the UK must be made compliant by replacing the old UK SCCs with either the UK International Data Transfer Agreement (IDTA) or the UK addendum to the new EU SCCs.



Reminder - Enforcement of journalism code

On 6 July 2023, the ICO published a Data Protection and Journalism Code of Practice, providing practical guidance for media organisations and journalists on using personal data for journalism whilst respecting privacy laws and freedom of expression. The Code came into force on 22 February 2024.