AI delayed, GDPR dismantled, and Big Tech wins big. Will “Digital Omnibus” spark a European privacy meltdown?

In a political shockwave reverberating across Europe, the European Commission has simultaneously unveiled:

  1. A last-minute “AI Omnibus” proposal delaying the AI Act’s core high-risk obligations
  2. A sweeping GDPR reform package that privacy advocates warn could gut Europe’s digital rights

 

What should have been a routine regulatory update has erupted into what noyb calls “the biggest attack on Europeans’ digital rights in years.” This is not cautious reform. This is a regulatory earthquake.

 

AI Act: Timeline chaos and a regulatory twilight zone

 

The Commission has proposed delaying the core of the AI Act, pushing Annex III high-risk systems to late 2027 and Annex I sector-based AI to August 2028, even though those high-risk obligations are still formally due to apply from August 2026. This creates what experts describe as a legal void: a two-year period in which the high-risk rules technically apply and enforcement could begin, yet the Commission is simultaneously attempting to postpone those very obligations and may even seek a retroactive “legislative pardon,” whose legality before the EU Court of Justice is uncertain.

 

Adding to the instability, the Commission has given itself a “nuclear option” to accelerate implementation if standards and guidance mature quickly, meaning high-risk rules could suddenly activate just six months or a year after such a decision. The result is a deeply unpredictable enforcement timeline.

 

GDPR overhaul: The most controversial rewrite since 2018

 

While the delayed AI Act dominated initial headlines, the real shockwave came from the GDPR amendment package, a set of changes that noyb argues will “massively lower protections for Europeans” and overwhelmingly benefit US Big Tech. Max Schrems called it “the biggest attack on Europeans’ digital rights in years,” warning that the proposal actively undermines the protections the GDPR was built to guarantee. According to noyb, the Commission has abandoned decades of careful, evidence-based policymaking and instead embraced Silicon Valley’s “move fast and break things” mentality. Neither the Member States, nor the Parliament, nor civil society wanted the GDPR reopened; all opposed doing so. Yet the Commission, driven by what noyb describes as internal political pressure and last-minute panic rather than strategy, pushed the amendments forward under the leadership of President Ursula von der Leyen, Vice-President Henna Virkkunen, and Justice Commissioner Michael McGrath.

 

What’s actually changing in the GDPR?

 

The new draft includes sweeping amendments. Some were expected. Others appeared at the last minute. Many, according to noyb, “open the floodgates” to large-scale commercial surveillance.

 

Key changes confirmed in the final GDPR amendment draft

 

Redefining “personal data”
The Commission introduces a more “subjective” definition of personal data, under which information may count as personal only if a company intends or is able to identify an individual. According to noyb, this creates a major loophole that could allow ad-tech firms, data brokers, and Big Tech to sidestep the GDPR entirely. Max Schrems compared it to “a gun law that only applies if the owner confirms he intends to shoot.” The practical effect: two companies could hold identical datasets, yet only one would be bound by the GDPR, depending on their stated intentions.

 

New Article 88a replacing the cookie consent rules
The proposal introduces a stricter, ePrivacy-style consent requirement, but only for personal data. Non-personal data remains under the old ePrivacy framework, creating a fragmented system in which protections differ depending on how data is classified. This split is expected to generate compliance confusion rather than clarity.

 

Automated, machine-readable choice signals
This provision enables browser- or device-based consent signals. In principle, it’s a welcome modernisation, assuming it is implemented cleanly and consistently across the EU.
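The proposal does not specify a technical standard for these signals, so any concrete implementation is speculative. Purely as an illustration, the sketch below models how a service might honour a browser-sent opt-out header in the style of the existing Global Privacy Control signal (Sec-GPC), treating it as overriding any consent previously collected through a banner; the header choice and function names are assumptions, not anything drawn from the draft text.

```python
# Illustrative sketch only: the Digital Omnibus does not define a signal format,
# so this models a GPC-style opt-out header ("Sec-GPC: 1") as one possible
# machine-readable choice signal a server could honour.

from typing import Mapping


def tracking_allowed(headers: Mapping[str, str], banner_consent: bool) -> bool:
    """Decide whether non-essential tracking may run for this request.

    A browser-level opt-out signal overrides the site's own banner choice;
    without a signal, fall back to whatever the user selected in the banner.
    """
    normalised = {k.lower(): v.strip() for k, v in headers.items()}  # HTTP headers are case-insensitive
    if normalised.get("sec-gpc") == "1":  # browser-sent universal opt-out
        return False
    return banner_consent


# A request carrying the opt-out signal blocks tracking even if the user once
# clicked "accept" on a cookie banner; without the signal, the banner choice stands.
print(tracking_allowed({"Sec-GPC": "1"}, banner_consent=True))  # -> False
print(tracking_allowed({}, banner_consent=True))                # -> True
```

Treating the signal as a hard override of banner consent is one design choice; the final rules could just as plausibly treat it as a default that sites may ask users to revisit.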

 

A new legal basis for AI development and operation
The proposal adds a dedicated legal basis to the GDPR for processing personal data in the development and operation of AI systems. noyb warns that this effectively grants AI a special privilege: a way for otherwise unlawful processing to become lawful simply because it is done “for AI.” Critics consider this unprecedented in EU data protection law and highly risky.

 

Expanded powers to classify pseudonymised data as “non-personal”
The Commission would be empowered to issue implementing acts specifying when pseudonymised data may fall outside the GDPR entirely. This opens the door for certain industries to operate with significantly reduced oversight; noyb describes it as “death by 1,000 cuts” to fundamental rights.
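To see why this matters, here is a minimal, purely illustrative sketch of what pseudonymisation typically looks like in practice; it reflects common engineering practice, not the wording of any implementing act, and the key name and sample records are invented for the example. Direct identifiers are replaced with keyed hashes, so records remain linkable, and anyone holding the secret key (or the original data) can re-identify them.

```python
# Minimal illustration of pseudonymisation; this reflects common practice,
# not the text of the proposal or any implementing act.

import hmac
import hashlib

SECRET_KEY = b"held-by-the-controller-only"  # hypothetical key kept by the data controller


def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable, keyed pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]


records = [
    {"email": "anna@example.eu", "purchases": 12},
    {"email": "bram@example.eu", "purchases": 3},
]

# The pseudonymised dataset still supports linkage and profiling; whether it
# counts as "personal data" would, under the proposal, hinge on who can
# reverse the mapping rather than on the data itself.
pseudonymised = [
    {"user": pseudonymise(r["email"]), "purchases": r["purchases"]} for r in records
]
print(pseudonymised)
```

The output looks anonymous to an outsider, yet the controller can trivially reverse the mapping, which is exactly why critics argue that declaring such data “non-personal” by implementing act reduces oversight without reducing risk.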

 

Clearer rules for scientific research
The proposal provides more certainty around when data can be further processed for scientific research purposes, addressing long-standing ambiguity in the GDPR.

 

What didn’t make it into the final proposal

 

  • A narrowing of Article 9’s scope
    The earlier draft would have limited “special-category” data to information that directly reveals a sensitive trait. That clarification was removed, meaning the current broader interpretation, which can also cover inferred traits, remains untouched. 
  • A new duty for DPAs to run “regulatory sandboxes”
    The leaked version would have required Data Protection Authorities to create innovation sandboxes with relaxed GDPR rules for testing new technologies. This mandate was dropped from the final text.

 

Is this a “Privacy Meltdown”?

 

noyb highlights four major threats in the proposal that could seriously weaken GDPR protections:

  • A pseudonymisation loophole: the new subjective definition of personal data lets companies claim they don’t intend to identify anyone, making enforcement nearly impossible.
  • Broader remote access to users’ devices for vague purposes such as “aggregated statistics” or “security,” raising the risk of intrusive scanning.
  • AI training on Europeans’ personal data, including social media posts and messages, with minimal opt-out options. This chiefly benefits large US tech companies, even though only 7% of Germans support such use.
  • Severely restricted access rights: data subject access could be limited to “data protection purposes” only, potentially blocking employees, journalists, and consumers from obtaining critical information, a move noyb says violates CJEU case law and Article 8 of the EU Charter.

 

What does this all mean for businesses?

 

For businesses operating in the EU, the combined impact of the AI Act delays and GDPR overhaul is a mix of uncertainty, opportunity, and risk.

 

Compliance timelines are unpredictable: With the AI Act’s high-risk obligations delayed, accelerated, or potentially retroactively applied, companies cannot rely on a fixed enforcement date. Legal teams now face the challenge of preparing for a moving target, complicating risk assessments and compliance planning.

 

Data handling rules are murkier than ever: The subjective definition of personal data, new AI-specific processing privileges, and ambiguous pseudonymisation rules mean businesses cannot easily determine whether certain datasets fall under GDPR or not. Companies handling user or customer data may find themselves in constant legal gray zones, with enforcement depending on interpretations rather than clear rules.

 

Bigger players may gain advantage: Many of the changes, particularly the AI processing exemptions and regulatory loopholes, disproportionately favor large tech firms with sophisticated legal and technical resources. Smaller businesses and startups may struggle to navigate the complexity, potentially increasing market concentration.

 

Risk of reputational and legal exposure: Even if a company technically complies with the shifting rules, public perception of misusing personal data or AI systems could result in serious reputational damage. At the same time, disputes over whether certain data is “personal” or falls under AI privileges may trigger new waves of litigation.

 

Innovation versus accountability trade-offs: The new AI legal basis, flexible definitions, and lighter-touch treatment of pseudonymised data could create opportunities for innovation and data-driven services. However, businesses must balance this against the risk of regulatory scrutiny, potential fines, and growing public concern over privacy and digital rights.

 

Businesses now face a regulatory landscape of unprecedented complexity. Success will depend on careful monitoring of evolving rules, proactive compliance strategies, and thoughtful consideration of ethical and reputational implications.

 

Is Europe in a digital Wild West?

 

The EU set out to simplify its digital rules, but the result has been anything but simple. Instead, the proposals have sparked parliamentary revolt, drawn sharp condemnation from civil society, created massive uncertainty for businesses, weakened privacy protections, and delayed and destabilized the AI Act. According to noyb, they may even represent the largest rollback of digital rights in four decades.

 

Europe now stands at a crossroads. Will policymakers course-correct, or will fast-track regulatory panic reshape the digital future for 450 million people? It seems that the battle over Europe’s digital future has only just begun.

 

Vinciworks’ new conversational learning course on data protection rights and responsibilities puts you at the heart of data protection, turning policy into practical action. Guided by AI-powered experts, it explores how personal data should be handled, shared and stored through realistic workplace scenarios. Try it here.