Is your use of facial recognition breaking the law? Lessons from Ireland’s €550k fine

The Irish Data Protection Commission (DPC) has fined the Department of Social Protection (DSP) €550,000 for its use of facial recognition and biometric matching software without a valid legal basis. The fine followed a detailed investigation into the department’s use of the technology during the registration process for Public Services Cards, where images of applicants were cross-checked against an internal database to detect duplicate identities.

What happened?


The DSP uses facial recognition technology during SAFE 2 registration, the process by which individuals verify their identity to receive a Public Services Card (PSC). Applicants submit a photo, which is compared against a database of existing images to prevent duplicate registrations. With over 3.2 million cards issued and mandatory use for certain welfare services, this system affects a significant portion of the Irish population.
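To make the mechanics concrete: duplicate detection of this kind is typically a one-to-many biometric match, in which each new photo is converted into a numeric facial template and compared against every stored template, with matches above a similarity threshold flagged for review. The sketch below is a simplified illustration only, not a description of the DSP’s actual system; the embedding model, threshold, and data structures are all assumptions.

```python
import numpy as np

# Illustrative one-to-many biometric matching. A real system would use a
# trained face-embedding model to produce the template vectors; here the
# templates are assumed to exist already. The threshold is a placeholder.

SIMILARITY_THRESHOLD = 0.8  # assumed value; real systems calibrate this


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two facial template vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def find_possible_duplicates(new_template: np.ndarray,
                             database: dict[str, np.ndarray]) -> list[str]:
    """Return the IDs of stored templates that resemble the new applicant's."""
    return [
        record_id
        for record_id, stored_template in database.items()
        if cosine_similarity(new_template, stored_template) >= SIMILARITY_THRESHOLD
    ]
```

The point that matters for the legal analysis below is that these templates are processed to uniquely identify individuals, which is precisely what brings the stored vectors, and not just the original photographs, within Article 9 GDPR.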


But the DPC’s investigation, launched in July 2021, found multiple failings in how the DSP handled this data:


  • No valid legal basis under the GDPR for collecting and processing biometric data

  • Unlawful retention of sensitive data

  • Inadequate transparency about how the data was being used

  • A flawed data protection impact assessment (DPIA) that omitted essential risk and legal assessments


As a result, the DPC issued a fine and gave the department nine months either to establish a valid lawful basis for the processing or to stop collecting biometric data altogether.

“None of the findings relate to technical security failings,” noted Deputy Commissioner Graham Doyle. “This is about whether the DSP has the legal and procedural framework in place to use this type of technology at all.”


What does the GDPR say about biometric data?

Biometric data processed to uniquely identify a person, such as facial templates, is classified under Article 9 GDPR as a special category of personal data. This means it can only be processed under strict conditions, such as:

  • Explicit consent

  • Employment, social security or social protection law, where authorised

  • Substantial public interest, on the basis of Union or Member State law

Even then, organisations must demonstrate that their processing is necessary and proportionate, and that less intrusive alternatives have been considered.


What are the takeaways for EU and UK businesses?

This case sends a clear message to any organisation operating under UK or EU data protection law: using biometric technologies like facial recognition requires a solid legal foundation, not assumptions or convenience.


  1. A valid lawful basis is essential
    Whether under UK GDPR or EU GDPR, processing biometric data (a special category under Article 9) demands more than just a general justification. You must rely on a specific legal basis, such as explicit consent or substantial public interest, and document it clearly.

  2. DPIAs are not optional for high-risk tech
    Facial recognition and biometric matching fall squarely within the scope of processing that requires a DPIA. A superficial assessment that skips over legal risks or fails to evaluate proportionality could itself be a breach.

  3. Transparency must be real, not vague
    You need to inform data subjects, clearly and upfront, what data you’re collecting, how it will be used, for how long, and under which legal basis. Failure to do so is a direct breach of GDPR obligations.

  4. Retention of biometric data must be justifiable
    Holding on to sensitive personal data “just in case” is not defensible. Your retention policy must be purpose-driven and legally grounded; a minimal sketch of what schedule-driven deletion can look like appears after this list.

  5. You’re accountable, even if you’re public sector
    This fine was imposed on a government department, but the same obligations apply to private companies, charities, and any other data controllers. Regulatory scrutiny around biometric and AI technologies is increasing, and enforcement bodies across the EU and UK are watching closely.

  6. Don’t assume GDPR is static, especially in the UK
    The Data (Use and Access) Act 2025 has now passed, introducing changes such as:


  • Easing automated decision-making rules for non‑special-category data

  • Acknowledging “recognised legitimate interests” for certain processing

  • Removing mandatory DPIAs and DPO requirements in some cases


These reforms reflect a shift toward flexibility and innovation, but they also present compliance risks. Organisations must monitor UK law closely to stay aligned with both UK and EU standards, particularly as divergence continues.
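On the retention point above (takeaway 4), enforcement in practice usually means deletion driven by a documented schedule rather than ad-hoc clean-ups. The sketch below is a hypothetical illustration, assuming a simple in-memory record store and made-up retention periods; it is not drawn from any system mentioned in this article.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical purpose-driven retention enforcement. The purposes, periods,
# and record layout are assumptions made for this sketch.

RETENTION_PERIODS = {
    # purpose of processing -> maximum retention, taken from a documented policy
    "identity_verification": timedelta(days=30),
    "fraud_investigation": timedelta(days=365),
}


def purge_expired(records: list[dict]) -> list[dict]:
    """Keep only records still within the retention period for their purpose.

    Each record is assumed to look like:
        {"id": "...", "purpose": "identity_verification",
         "collected_at": datetime(..., tzinfo=timezone.utc)}

    Records whose purpose has no documented retention period are dropped,
    mirroring the principle that data held "just in case" is not defensible.
    """
    now = datetime.now(timezone.utc)
    return [
        record
        for record in records
        if (limit := RETENTION_PERIODS.get(record["purpose"])) is not None
        and now - record["collected_at"] <= limit
    ]
```

The design choice worth noting is that retention is keyed to a documented purpose: if a record cannot be tied to one, there is no basis for keeping it.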


A wider pattern in GDPR enforcement?


This isn’t the first time the Irish DPC has taken issue with the Department of Social Protection. A 2019 investigation also found serious non-compliance in how Public Services Cards were issued. Although the department initially appealed, it later withdrew its appeal and reached a settlement with the regulator.


But this latest fine is part of something bigger. Across Europe, including in the UK, GDPR enforcement is shifting. Regulators are increasingly targeting not just organisations, but also individuals. Accountability is becoming personal, and the use of emerging technologies like facial recognition and AI is under intense scrutiny.


For businesses, the message is clear: using biometric data or AI-driven processing without a watertight legal basis isn’t a grey area; it’s a fast track to enforcement.


How can VinciWorks help?


  • Our GDPR courses include an in-browser editing tool that lets you customise them to reflect your organisation’s information security challenges and best practices.


  • Omnitrack’s GDPR Workflows, developed with top law firms, streamline compliance by automating data collection and management. This ensures completeness, reduces the administrative burden, and makes it easier to produce evidence for regulators.