The biggest data protection, GDPR and AI stories of 2024

Over the last year, regulatory bodies across the globe have continued to intensify their scrutiny of AI technologies and data privacy. Italy’s Garante has once again imposed a significant fine on OpenAI, this time €15 million, citing issues with ChatGPT’s data collection practices. The fine is part of an ongoing wave of regulatory action, including the European Data Protection Board’s (EDPB) opinion on AI and GDPR and a raft of new privacy laws enacted across the United States. The UK has also brought back a relatively controversial set of changes to data protection, while GDPR fines large and small show no sign of slowing down.

OpenAI falls foul of Italy’s Garante once again

Italy’s data protection authority, the Garante, has continued its single-minded crusade against all things ChatGPT-related with a €15m fine against OpenAI, the maker of ChatGPT. VinciWorks has previously reported on the Garante’s scepticism about AI compliance with GDPR, and the Italian authority has closed out 2024 with an official fine.

The fine centres on OpenAI’s use of personal data to train ChatGPT without properly informing users, and on its failure to implement robust age verification to protect children from inappropriate content. While OpenAI cooperated with the investigation, the decision also includes a mandate to launch a six-month public awareness campaign in Italy about its data collection practices. Significantly, the campaign must explain the nature of the data collected, covering both user and non-user information used to train its models, and the rights individuals can exercise to object to, rectify, or delete that data.

The fine comes after Italian authorities briefly banned ChatGPT in March 2023, just a few months after the AI service went public in November 2022. The ban was lifted after OpenAI implemented the required changes. 

The European Data Protection Board gives its opinion on AI and GDPR

On 17 December 2024, the European Data Protection Board (EDPB) issued a significant opinion addressing the interplay between AI model development and the EU’s General Data Protection Regulation (GDPR). Responding to a request from Ireland’s Data Protection Commission, the opinion clarifies how personal data can be lawfully processed in AI contexts, focusing on anonymity, legitimate interest, and the use of unlawfully processed datasets.

The EDPB underscored the stringent criteria for deeming AI models anonymous. To qualify, models must ensure that:

  • Output data cannot identify individuals whose personal data trained the model.
  • Personal data from training datasets cannot be extracted through queries.

Documentation, including Data Protection Impact Assessments (DPIAs) and technical measures, plays a pivotal role in proving anonymity. Supervisory authorities will evaluate these measures on a case-by-case basis.
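
To make the second criterion concrete, developers sometimes run extraction or memorisation probes against a model before declaring it anonymous. The sketch below is a deliberately naive illustration of that idea, not the EDPB’s prescribed methodology: it assumes the Hugging Face transformers library, uses "gpt2" purely as a stand-in model, and treats a synthetic record as if it had appeared in the training data.

  from transformers import AutoModelForCausalLM, AutoTokenizer

  MODEL_NAME = "gpt2"  # stand-in for illustration; in practice you would probe the model under assessment
  tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
  model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

  # Hypothetical record assumed, for the sake of the example, to have been in the training data
  record_prefix = "Customer complaint filed by Jane Example, email "
  record_secret = "jane.example@mail.example"

  # Prompt the model with the start of the record and inspect what it produces
  inputs = tokenizer(record_prefix, return_tensors="pt")
  output_ids = model.generate(**inputs, max_new_tokens=20, do_sample=False)
  completion = tokenizer.decode(output_ids[0], skip_special_tokens=True)

  if record_secret in completion:
      print("Personal data reproduced verbatim: the model fails this extraction probe.")
  else:
      print("No verbatim reproduction on this probe (a single negative result proves little).")

Real assessments are far more involved, but the principle is the same: if targeted queries can pull identifiable training data back out of a model, it cannot be treated as anonymous.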

The EDPB confirmed that legitimate interest can justify personal data processing for AI development, such as conversational agents or cybersecurity tools. However, organisations must meet a three-step test:

  • Demonstrate a valid legitimate interest.
  • Prove that processing is necessary for this purpose.
  • Show that individuals’ rights are not overridden.

Context, transparency, and mitigating measures, like offering opt-outs or respecting website terms against data scraping, are crucial in balancing rights and interests.

AI models developed using unlawfully processed data may face restrictions unless the data has been effectively anonymised. The EDPB stresses the responsibility of AI deployers to verify the lawfulness of data used, even if sourced from third-party developers.

This opinion reinforces GDPR’s high standards for data protection while acknowledging the complexity of AI innovation. It emphasises accountability, requiring robust documentation and careful assessments throughout the AI lifecycle.

The guidance also signals potential challenges for organisations relying on public data scraping or using third-party AI models. It highlights the importance of proactive compliance measures, including thorough testing, transparency, and mitigating risks to individuals’ rights.

The rise and risks of biometric authentication in the age of AI

The rise of AI and deepfakes has prompted companies, from banks to healthcare providers, to adopt advanced biometric tools to combat impersonation and enhance system security—often at the cost of privacy. Tools like “liveness detection,” designed to confirm a person’s identity in real-time, have gained traction, spurred by remote onboarding and hiring trends accelerated by the pandemic. Regulators, such as New York’s Department of Financial Services, have urged organisations to implement technologies capable of resisting AI-generated deepfakes, including texture analysis and motion-based verification.

However, as these tools grow more sophisticated, so do concerns about privacy compliance. Laws like Illinois’ Biometric Information Privacy Act (BIPA) and GDPR restrictions in Europe highlight the liability risks of collecting and storing sensitive biometric data. The UK’s ICO is pushing back against companies that routinely use biometric data. Serco, one of the UK’s largest employers, was told to stop using fingerprint scanners and facial recognition software for staff clocking on and off, in a warning that could force many other employers to change their practices.

The ICO has also warned that Google’s decision to allow advertisers to track customers’ digital fingerprints was “irresponsible” and threatened to intervene. The ICO said in a 19 December 2024 blog post: “The ICO’s view is that fingerprinting is not a fair means of tracking users online because it is likely to reduce people’s choice and control over how their information is collected.”

Deepfake threats range from tricking IT help desks into credential resets to impersonating executives for financial fraud. Despite the risks, liveness detection remains important for validating identities, but robust authentication tools must align with data protection requirements.

GDPR fines large and small continue apace

Some significant GDPR enforcement actions in 2024 highlight a continued crackdown on data protection violations by European regulators. The Netherlands’ data protection authority (AP) issued a €30.5 million fine to Clearview AI for retaining images of Dutch citizens in its database, violating GDPR rules. This fine surpasses those previously imposed on Clearview by regulators in France, Italy, Greece, and the UK. Furthermore, the AP warned of an additional €5.1 million penalty for ongoing non-compliance, potentially bringing the total to €35.6 million. Clearview argues it is not subject to GDPR, claiming it lacks operations or customers within the EU.

Another major fine targeted Uber, which was fined €290 million by Dutch authorities for transferring the personal data of 172 drivers, including criminal records and location data, to the US without adequate safeguards. The breach spanned two years, and the fine, although significant, falls well below the GDPR’s maximum penalty threshold of 4% of global annual turnover. Uber’s 2023 revenue was approximately €34.5 billion, making this one of the largest fines imposed on a tech company under GDPR since its inception.
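
To put the figure in perspective, a rough back-of-the-envelope calculation using the revenue figure quoted above shows how far below the cap the penalty sits:

  • Theoretical GDPR maximum: 4% × €34.5 billion ≈ €1.38 billion
  • Actual fine as a share of turnover: €290 million ÷ €34.5 billion ≈ 0.84%

In other words, even one of the largest fines on record amounts to roughly a fifth of the ceiling the regulation allows.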

Smaller GDPR enforcement actions also reflect the broad scope of compliance requirements. In Spain, Uniqlo faced a €270K fine, reduced from €450K after corrective measures, for a payroll data breach affecting 447 employees. In Belgium, a telecommunications company was fined €100K for failing to respond to a customer’s request for information and for poor communication regarding contract changes. In Denmark, the Municipality of Vejen was fined nearly €27K after unencrypted laptops containing sensitive data about students and teachers were stolen from a school. 

Germany opens the door to GDPR fines for hurt feelings

The German Federal Court of Justice (BGH) has clarified the scope of financial compensation for GDPR breaches, ruling that individuals can claim damages for non-material harm, such as emotional distress, when they lose control over their personal data. This judgment stems from a 2021 Facebook data breach where third parties accessed over 500 million user profiles. The court confirmed that loss of control alone—without evidence of misuse or tangible harm—constitutes damage under Article 82 GDPR. This aligns with European Court of Justice (ECJ) rulings and strengthens the ability of individuals to seek compensation for privacy violations.

The BGH’s interpretation signals a significant shift in GDPR enforcement, lowering the threshold for damage claims and potentially encouraging more individuals to pursue legal action. While plaintiffs must provide substantial evidence of personal impact, the court permits standardised submissions if they establish a clear connection to the breach. Additionally, the court acknowledged the possibility of preemptive claims for future risks, setting a precedent for addressing long-term data exposure concerns.

This ruling challenges Germany’s previously restrictive approach to GDPR claims, where courts often dismissed non-material damages as insignificant. By aligning with ECJ standards, the decision underscores that privacy violations alone are sufficient for compensation. However, differing interpretations across European courts could lead to fragmentation in case law and many more cases as advocacy groups test the limits of the law.

The UK resurrects its shift away from GDPR

When the UK went to the polls on 4 July 2024, the previous government’s almost-ready Data Protection and Digital Information Bill (DPDI) failed to pass and was dropped. DPDI had faced significant criticism for potentially throwing the UK’s adequacy decision from the European Commission into question. But the new Labour government has brought back many of those changes in its Data (Use and Access) Bill (DUAB), due to be passed into law in 2025.

It introduces a new lawful basis for processing data under “recognised legitimate interests,” allowing public bodies to request data from private organisations for specified purposes, including direct marketing and network security. DUAB also relaxes restrictions on AI and automated decision-making for most personal data but maintains stricter rules for sensitive data. Scientific research benefits from expanded definitions, enabling commercial projects to evolve without renewed consent, provided they meet ethical standards.

DUAB introduces several new provisions, such as allowing the government to designate certain data types, like “neurodata,” as special category data with stricter processing limits. It also empowers the Treasury to establish “smart data” schemes, enabling customer data sharing with third parties. Enhanced enforcement capabilities under the Privacy and Electronic Communications Regulations (PECR) bring GDPR-level fines for e-marketing and cookie violations. The bill also adjusts timelines for data subject rights requests, excluding periods for identity verification and scope clarification.

However, DUAB has also dropped some of the more controversial aspects of the previous DPDI. Notably, DUAB avoids weakening the definition of “personal data” and retains current rules on Data Protection Impact Assessments (DPIAs), subject access requests, and consultation with the Information Commissioner’s Office (ICO). At the same time, it transitions the ICO into a corporate entity called the Information Commission and introduces new complaint-handling processes that require individuals to approach data controllers first.

State-by-state: the United States’ new generation of data privacy laws

In 2024, 10 new state-level consumer privacy laws were enacted or took effect in the United States, expanding the already complex landscape of privacy regulation. These laws generally protect state residents acting in personal capacities, excluding employees or business contacts, with California continuing to lead as the only state extending protections to employees and business-to-business data. Most states maintain the established framework of distinguishing “controllers” (entities deciding how data is used) from “processors” (entities handling data on behalf of controllers). The definitions of personal data and exemptions for de-identified or publicly available data remain broadly consistent, though nuanced variations exist, such as Oregon’s explicit inclusion of household-linked device data.

Several states introduced new consumer rights and expanded definitions of sensitive personal information (SPI). For example, Oregon and Maryland broadened the scope of biometric data, while California and Colorado added neural and biological data to their SPI definitions. New Jersey included financial data in SPI, joining California as an outlier. Maryland’s approach stands out for its stringent restrictions, prohibiting the collection, processing, or sharing of SPI unless strictly necessary for requested services, contrasting with the opt-in consent model adopted by other states. Additionally, heightened protections for minors have been introduced, with New Jersey and Delaware raising the minimum opt-in age for targeted advertising and sales to 17 and 18, respectively, while Maryland outright bans such activities for minors under 18.

States are also adopting operational requirements to strengthen data governance. For example, Minnesota now requires businesses to maintain data inventories, echoing international frameworks like the GDPR, while Maryland introduces stringent data minimisation obligations, limiting personal data collection to what is necessary for consumer-requested products or services. Enforcement mechanisms remain consistent across most states, with attorneys general holding the exclusive right to enforce violations, as opposed to separate supervisory authorities as in the EU.

What businesses should worry about in GDPR and data protection going into 2025

As we head into 2025, businesses should be aware of several key data protection considerations to ensure compliance and mitigate risks. Here are a few important things to keep in mind:

Stricter enforcement of GDPR and other privacy laws
With the growing focus on data protection, regulatory bodies in the EU, UK, and the US are tightening enforcement of data privacy laws like GDPR and state-level privacy regulations. Businesses should anticipate increased scrutiny and be prepared for potential fines for non-compliance. This includes ensuring that personal data is processed lawfully, securely, and transparently, with proper documentation and clear user consent practices.

Expanded definition of personal data
Privacy laws are expanding the scope of “personal data” to include more types of sensitive information, such as biometric data, neurodata, and even certain behavioural data. Companies should update their data inventory to reflect these changes and assess how they collect, store, and process such data. In particular, businesses dealing with health, biometric, and financial data should be aware of the additional compliance requirements.

Increased focus on AI and data collection
AI technologies, particularly those used for machine learning, data scraping, and biometric authentication, are drawing more attention from regulators. Businesses should ensure that they have strong data protection measures in place for any AI-driven processes, particularly where personal data is used for training models. Transparency about how data is used, including the ability for users to opt out of processing or delete their data, will be increasingly important.

Data minimisation and purpose limitation
Data minimisation – the practice of collecting only the data that is necessary for a specific purpose – is becoming a more prominent focus for regulators. Businesses should reassess their data collection practices and ensure they are not gathering excessive or irrelevant information. Furthermore, they should clearly define and communicate the purposes for which data is collected and ensure it is not used for other unintended purposes.

User rights and transparency
As data protection laws evolve, the rights of individuals over their personal data (such as the right to access, rectify, or delete data) will continue to be a focal point. Businesses should ensure they have systems in place to respond to these requests efficiently and transparently. Users should also be informed about their rights, particularly in relation to AI systems that may process their data.
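
As a purely illustrative sketch of what such a system might look like at its simplest, the snippet below routes access, rectification, and erasure requests against a single in-memory record store. Every name in it (the store, the request type, the handler function) is a hypothetical placeholder rather than a reference to any real product, and a production implementation would also need identity verification, audit logging, and propagation to backups and processors.

  from dataclasses import dataclass
  from typing import Optional

  # In-memory stand-in for a real customer record store (hypothetical data)
  CUSTOMER_RECORDS = {
      "user-123": {"email": "old@example.com", "marketing_opt_in": True},
  }

  @dataclass
  class RightsRequest:
      subject_id: str
      kind: str                       # "access", "rectify" or "erase"
      changes: Optional[dict] = None  # only used for rectification

  def handle_request(req: RightsRequest) -> Optional[dict]:
      record = CUSTOMER_RECORDS.get(req.subject_id)
      if record is None:
          return None                           # nothing held about this person
      if req.kind == "access":
          return dict(record)                   # hand back a copy of everything held
      if req.kind == "rectify" and req.changes:
          record.update(req.changes)            # apply the requested corrections
          return dict(record)
      if req.kind == "erase":
          CUSTOMER_RECORDS.pop(req.subject_id)  # delete the record entirely
          return {}
      raise ValueError(f"Unsupported request type: {req.kind}")

  # Example: respond to an access request
  print(handle_request(RightsRequest("user-123", "access")))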

Cross-border data transfers
With the continued rise in global data privacy regulations, businesses that transfer data across borders will need to comply with a variety of laws governing international data transfers. The EU’s Schrems II ruling, which invalidated the Privacy Shield framework for transatlantic data transfers, has added complexity to this issue. Businesses should be aware of the data transfer mechanisms they use, such as Standard Contractual Clauses (SCCs), and any changes to these mechanisms that could affect their operations.

Biometric and deepfake risks
As biometric authentication systems and deepfake technology become more common, businesses must balance security with privacy. The use of facial recognition or fingerprint data requires heightened protection under laws like the GDPR and the Biometric Information Privacy Act (BIPA). Moreover, the rise of deepfake technology raises new risks for fraud and identity theft, meaning businesses must invest in tools to detect and mitigate such threats.

Vendor management and third-party risks
As companies increasingly rely on third-party vendors and cloud services to store and process data, they must ensure that these partners comply with data protection laws. Data breaches caused by third-party vendors can lead to significant liabilities. It’s crucial to conduct regular audits and ensure that contracts with third parties include strong data protection clauses and mechanisms for addressing breaches.

By keeping these issues in mind, businesses can better prepare for a data protection landscape that will likely continue to evolve rapidly in 2025 and beyond. Ensuring robust compliance practices will not only help avoid penalties but also build trust with customers and stakeholders.

Join our free webinar: AI and GDPR Compliance in 2025 – What you need to know for the year ahead

Join our free, 1-hour webinar on Tuesday 14 January at midday UK time. Gain actionable insights into how to stay compliant, protect sensitive data, and build trust with customers in an increasingly complex regulatory environment.

 

How are you managing your GDPR compliance requirements?

GDPR added a significant compliance burden on DPOs and data processors. Data breaches must be reported to the authorities within 72 hours, each new data processing activity needs to be documented and Data Protection Impact Assessments (DPIA) must be carried out for processing that is likely to result in a high risk to individuals. Penalties for breaching GDPR can reach into the tens of millions of Euros.
