Data Dispatch - December 2023

Published on 12 December 2023

Welcome to the second edition of Data Dispatch from the Data Advisory team at RPC. Our aim is to provide you on a monthly basis with an easy-to-digest summary of key developments in data protection law.

If there are any issues on which you'd like more information (or if you have any questions or feedback), please do let us know or get in touch with your usual contact at RPC.

Key developments

The UK's AI Summit and other AI developments from around the world

On 1 and 2 November 2023, representatives from 28 countries, together with leaders from tech companies, academia and civil society, gathered for the first major global AI Summit at Bletchley Park.

The key outcome was the signing of the Bletchley Declaration on AI Safety, described as fulfilling "key summit objectives in establishing shared agreement and responsibility on the risks, opportunities and a forward process for international collaboration on frontier AI safety and research". Leading AI companies (including OpenAI, Google DeepMind, Anthropic, Microsoft and Meta) also agreed to allow governments to test their latest models before they are released to the public.

On 30 October 2023, US President Joe Biden signed an executive order which, amongst other actions, requires AI developers to share safety test results with the government and directs the National Institute of Standards and Technology to develop new standards on AI safety. The US government has described the order as containing "the most significant actions ever taken by any government to advance the field of AI safety".

On the same day, more AI news arrived in the shape of a new global code of conduct: the Group of Seven (G7) announced the International Code of Conduct for Organizations Developing Advanced AI Systems, guidance which aims to promote safe, secure and trustworthy AI.

In the UK, the government confirmed in its response to a report from the Science, Innovation and Technology Committee that it does not propose to regulate AI in the short term. It also confirmed that it will provide more detail on its approach in its forthcoming response to the AI Regulation White Paper (published in March 2023).

Meanwhile, the EU's AI Act is in a final period of intense negotiation between the Commission, Council and Parliament. EU stakeholders are mindful of the need to finalise the text in the first few weeks of 2024 to avoid the legislation being derailed by the European elections.

Most recently, at techUK's Digital Ethics Summit on 6 December 2023, John Edwards, the Information Commissioner, warned that "there are no excuses for not ensuring that people's personal information is protected if [companies] are using AI systems, products, or services." He noted that advice is available through the ICO's Innovation Advice Service and Sandbox.

It's been a busy month in the world of AI regulation!

(Bletchley Declaration)
(US Executive Order)
(G7 Code of Conduct)
(ICO Comments at Digital Ethics summit)
(UK Government Response)


The Information Commissioner's Office (ICO) urges organisations to ensure proper monitoring practices in the workplace

The ICO's updated guidance is aimed at employers across both the public and private sectors and provides direction on how organisations may monitor their employees in a fair and lawful manner.

The updated guidance states that, with the rise of flexible and remote working, organisations are increasingly implementing monitoring methods such as message and keystroke monitoring, webcam footage or audio recordings, and software which tracks employees' activities.

While the ICO highlights that data protection law does not prevent such practices, it has urged all organisations to consider their legal obligations and their employees' privacy before implementing such methods.

Additionally, the ICO states that where an organisation is looking to monitor its employees, it must:

  • inform employees about the nature, extent, and reasons for the monitoring in a manner which is easy to understand; 
  • have a lawful basis and clearly defined purpose for such processing, using the least intrusive means to achieve it and retaining only the information relevant to it;
  • carry out a Data Protection Impact Assessment for any monitoring which is likely to result in a high risk to the rights of employees; and
  • make the personal information collected available to employees should they request it.

(ICO Guidance)

Enforcement action

FCA fines Equifax

The Financial Conduct Authority (FCA) has fined Equifax £11m for failing to monitor and manage the security of customer data in the UK. Around 13.8 million people in the UK had their personal data accessed in the 2017 cyberattack, making this one of the largest breaches in cybersecurity history. The hackers accessed names, addresses, credit card details and more.

The FCA said the breach was "entirely preventable". It also said that there were known weaknesses in the firm's data security and that Equifax failed to take appropriate action to protect customer data. Equifax did not realise that UK data had been compromised until six weeks after its US parent company discovered the hack, and it then gave an inaccurate impression in its public statements of the number of customers affected.

The Joint Executive Director of Enforcement and Market Oversight at the FCA said: "Financial firms hold data on customers that is highly attractive to criminals. They have a duty to keep it safe and Equifax failed to do so". Regulated firms must implement appropriate security arrangements to protect their customers' data and, in the event of a breach, notify affected data subjects in an accurate and fair way.

(FCA Press Release)

Snap's 'My AI' Chatbot under scrutiny by the ICO

The ICO has issued a preliminary enforcement notice against Snap Inc and Snap Group Limited (Snap), alleging that they failed to adequately evaluate the privacy risks associated with Snap's AI-powered chatbot 'My AI'. The ICO's initial inquiry suggests that the risk assessment carried out by Snap before launching 'My AI' did not sufficiently consider the data protection risks, especially in relation to the handling of personal data of children aged 13 to 17.

These findings from the ICO are provisional, and Snap will have an opportunity to present their case. The initial notice outlines the potential actions Snap may have to take and, if a final enforcement notice is issued by the ICO, Snap may be prohibited from offering 'My AI' as a feature to UK users of Snap's services.

This preliminary notice emphasises the need for appropriate risk assessments before launching innovative products, especially where the processing is high risk, such as generative AI involving children's data.

(ICO News)

Tribunal Reverses ICO Fine in Clearview AI Case

Clearview AI uses web crawlers to scrape images of human faces from the internet, storing them in a database for its facial recognition software. Although Clearview AI is based in Delaware and has no presence in the UK or EU, the ICO argued that the database likely contains images of UK residents, and in 2022 it issued an enforcement notice and a £7.5 million fine alleging misuse of biometric data.

The First-tier Tribunal's decision hinged on the material scope of the UK GDPR. It determined that, since Clearview's clients were exclusively foreign criminal law enforcement bodies, the acts of those foreign governments fell outside the purview of the UK GDPR. Consequently, the Tribunal concluded that the ICO lacked jurisdiction to issue the fine.

However, the Tribunal did provide commentary on the territorial scope provisions in Article 3 of the UK GDPR and on Clearview AI's data processing activities, which related to database creation and user image matching. The Tribunal agreed with the ICO's submission that these processing activities were linked to the monitoring of data subjects' behaviour in the UK by Clearview's law enforcement clients, meaning that Clearview's database fell within the territorial scope of Article 3 of the UK GDPR.

This decision is particularly useful for overseas service providers when considering the material and territorial scope of the UK GDPR, including the boundaries of the "monitoring behaviour" provision (Art. 3(2)(b) of UK GDPR).

The ICO has sought permission to appeal the Tribunal's decision.

(First-Tier Tribunal Decision)
(ICO seeks leave to appeal)

Need to know

The ICO publishes its draft 'Data Protection Fining Guidance' for public consultation

The Guidance explains that the ICO may only exercise its power to impose fines under Article 58(2)(i) and Article 83 UK GDPR by giving a penalty notice to a controller or processor in accordance with section 155 of the DPA 2018.

The Guidance then provides that the ICO may choose to impose a fine where a controller or processor has not complied with the provisions of UK GDPR or the DPA in relation to: (i) the principles of processing, (ii) the rights conferred on data subjects, (iii) the obligations placed on controllers and processors, or (iv) the principles for transfers of personal data outside the UK.

Further, the Guidance states that the ICO will calculate the appropriate fine by:

  • assessing the seriousness of the infringement(s);
  • accounting for turnover (where the controller or processor is part of an undertaking);
  • calculating the starting point for the fine by having regard to the above points;
  • adjusting in consideration of any aggravating or mitigating factors; and
  • assessing whether imposing the fine would be effective, proportionate, and dissuasive.

Additionally, the Guidance states that the maximum fine which the ICO may issue will depend on (a) whether the infringement attracts the standard or higher maximum (which depends on which provision of UK GDPR has been infringed) and (b) whether the controller or processor forms part of an ‘undertaking’ (e.g., the controller is a subsidiary of a parent company). This affects the maximum fine which the ICO can impose as follows:

  • Standard maximum fine: £8.7 million; or, where the controller or processor is part of an undertaking, £8.7 million or 2% of worldwide turnover in the preceding financial year, whichever is higher.
  • Higher maximum fine: £17.5 million; or, where the controller or processor is part of an undertaking, £17.5 million or 4% of worldwide turnover in the preceding financial year, whichever is higher.

Finally, the Guidance provides that where the ICO finds that the ‘same or linked processing operations’ infringe more than one provision of UK GDPR, the overall fine imposed will not exceed the maximum amount applicable to the most serious of the individual infringements.
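
By way of illustration (using hypothetical figures of our own rather than anything in the Guidance): if a controller forming part of an undertaking with worldwide turnover of £1 billion in the preceding financial year committed an infringement attracting the higher maximum, the cap would be the greater of £17.5 million and 4% of £1 billion (£40 million), i.e. £40 million. For a controller which is not part of an undertaking, the cap would remain £17.5 million irrespective of turnover.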

(ICO Draft Guidance)

New AG Opinion on Compensation Under GDPR

An Advocate General of the Court of Justice of the European Union (CJEU) delivered his opinion in two joined cases addressing whether the theft of a data subject's personal data by an unknown perpetrator qualifies as identity theft and whether it merits compensation for non-material damage under Article 82(1) of the GDPR. The cases involved two individuals whose personal data was stolen from Scalable Capital's trading application but who had not experienced any malicious use of their data.

The Advocate General clarified that the theft of personal data, even where it does not amount to identity theft, can give rise to compensation for non-material damage where there is proof of a GDPR infringement, actual harm suffered, and a causal link between the two. He added that mere possession of data identifying a person does not in itself constitute identity theft, and that the concept of "non-material damage" is not to be defined by reference to national law but is an autonomous concept of EU law to be applied uniformly across member states. The right to compensation must still be evaluated on a case-by-case basis, taking all relevant factors into account.

The Advocate General followed the CJEU's decision earlier this year in Oesterreichische Post in confirming that, whilst material or non-material damage must be established in order to claim compensation, there is no minimum threshold of seriousness to be met.

The case will now be considered by the CJEU.

(AG Opinion)