Only 3.5% of UK compliance professionals say their organisation is ready for AI regulation, VinciWorks survey finds

AI and data protection

New VinciWorks research exposes a significant training and readiness gap across UK-regulated industries, just as GDPR enforcement intensifies, the EU AI Act enters its implementation phase, and the UK regulator signals that data protection obligations already apply to AI.

A new survey by VinciWorks, a leading compliance eLearning provider, has found that just 3.5% of respondents consider their organisation fully prepared for AI’s changing regulatory landscape, while more than a quarter (29%) say they are still working out which rules even apply to them. The findings paint a picture of an industry struggling to keep pace with converging regulatory pressure from the EU AI Act, sustained GDPR enforcement and the UK’s sector-led approach to AI governance.

On preparedness, a further 28% said they were aware of the rules but had no clear plan, and 6% admitted they were simply unsure of their position. Combined, over three-fifths (63%) of respondents could not describe their organisation as prepared for the regulatory environment now taking shape.

Training gap leaves firms exposed

The most significant finding concerns training. Just over a fifth (22%) of respondents said their organisation provides AI awareness training that they consider effective. Nearly half (48%) said their company had no AI training at all but would like to provide it. A further 12% had no plans to offer training, and 12% had something in place that they themselves described as not very effective.

Roughly four in five (78%) organisations lack effective AI training at a time when regulators across the EU and UK expect staff to understand and document how AI systems handle personal data. For organisations in regulated sectors, that exposure is direct and material.

GDPR is already the main enforcement tool

Asked where GDPR causes the greatest practical difficulties for AI use, more than a quarter (27%) of respondents pointed to automated decision-making rules, 23% to data minimisation and retention, and a fifth (21%) to vendor and model provider oversight. The spread across all five options is itself instructive: organisations are not struggling with one isolated requirement. The challenge runs across the entire data protection framework as it applies to AI.

Nick Henderson-Mayo, Head of Compliance at VinciWorks, commented, “GDPR is bundled into AI compliance. Regulators are applying existing data protection laws to AI systems right now, and they expect organisations to be able to explain what their systems are doing, justify their lawful basis and demonstrate that individuals’ rights remain meaningful. If you’re using AI that processes personal data, the ICO expects you to comply with your data protection obligations today.” 

Disruption is building, confidence remains low

While almost two-thirds (64%) of respondents described AI as only slightly or not at all disruptive to their compliance programme so far, 12% said it had already been very or extremely disruptive to their operations. With enforcement actions increasing and implementation deadlines approaching, the balance of experience is likely to shift.

On compliance confidence, only 9% of respondents said they felt very confident that their organisation’s AI use was compliant, while a combined third (33%) said they were not very confident or not confident at all. The largest group, three in ten (30%), described themselves as only somewhat confident.

The survey polled 230 compliance, legal and IT professionals to gauge their awareness of, and preparedness for, AI compliance obligations.

Want to go deeper on the GDPR side of AI compliance? Download our guide, When data thinks: The intersection of GDPR and AI, for practical steps on improving data quality and governance, reducing risk in automated decision-making, and building an approach you can document and evidence.