The Digital Omnibus: Eroding Worker Protection and Rights

The European Commission's "simplification" package is, in fact, a deregulatory intervention that weakens workers' data rights.

4th February 2026

The Digital Omnibus Package has been consistently framed by the European Commission as a technical simplification exercise. This argument is difficult to sustain: the Omnibus operates in substance as a deregulatory intervention, reshaping EU digital law in ways that weaken existing safeguards.

This has been the focus of the current Commission almost from day one, closely associated with competitiveness and cost-reduction strategies. In her 2025 State of the Union address, President von der Leyen reiterated that “we need to make business in Europe easier” and “the Omnibuses we have put on the table so far will make a real difference. Less paperwork, less overlaps, less complex rules.”

This article demonstrates that the Digital Omnibus applies a deregulatory logic, weakening protection not through the formal repeal of existing rules but through regulatory design choices that redistribute responsibility, discretion and oversight. It first addresses a structural disadvantage for workers and then examines specific amendments that stem directly from the reallocation of political priorities away from social and labour rights or, as Alemanno argues, away from judicial and legislative actors towards administrative and private compliance systems.

Structural disadvantages for workers and their representatives

Formally speaking, the Digital Omnibus does not amend the EU labour acquis. EU labour protection is constitutionally shaped by collective participation and representation. Articles 152–155 of the Treaty on the Functioning of the European Union recognise social partners and institutionalise social dialogue, while Articles 27 and 28 of the Charter of Fundamental Rights protect information, consultation and collective bargaining. Article 31 of the Charter anchors fair and just working conditions.

However, the Digital Omnibus redesigns EU digital governance and reorients it away from collective and institutional processes, accelerating its individualisation. Responsibility shifts towards individual organisations and the broader infrastructure. Such a regulatory design, which treats compliance as primarily internal to organisations, is misaligned with the procedural and collective architecture through which EU labour protection is meant to operate in practice—especially in data-driven workplaces where power is exercised through sociotechnical systems that workers and their representatives do not control.

If adopted, the Omnibus is likely to weaken the enforceability of rights in the employment context. Workers and their representatives will find it harder to obtain sufficient information, to demand reasons for decisions and to challenge them before authorities or courts—and to do so in time to prevent harm. Where information asymmetries are pervasive, as they are in employment, weakening this enforceability infrastructure will disproportionately disadvantage workers and social partners.

Amendments with the most significant implications for labour

These structural effects stem directly from specific Omnibus amendments with far-reaching consequences for worker protection.

The redefinition of personal data (Article 4(1), Recital 34 GDPR)

One of the proposed Omnibus amendments introduces a major change to the definition of personal data. Currently, personal data is defined as “any information relating to an identified or identifiable natural person (‘data subject’)”, while “an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person”.

The amendment introduces the following addition: “Information relating to a natural person is not necessarily personal data for every other person or entity, merely because another entity can identify that natural person. Information shall not be personal for a given entity where that entity cannot identify the natural person to whom the information relates, taking into account the means reasonably likely to be used by that entity. Such information does not become personal for that entity merely because a potential subsequent recipient has means reasonably likely to be used to identify the natural person to whom the information relates.”

This proposed Omnibus amendment fundamentally weakens the concept of personal data and narrows the material scope of data protection. By stating that information is not personal data for an entity that cannot identify the individual, even if another entity can, the proposal makes the applicability of data protection law contingent on the perspective of each individual controller or processor.

This turns the definition of personal data into a subjective and fragmented assessment. In practice, almost any dataset can be characterised as non-personal by structuring processing across multiple processors, vendors or systems. For workers, this is particularly damaging: worker data is routinely processed through complex chains involving employers, software providers and other processors. Under this approach, each involved party may claim that it cannot identify workers on its own, even where identification is possible across the system as a whole. As a result, large categories of worker data risk falling outside the scope of the General Data Protection Regulation, undermining the effective protection of workers’ data rights.

This is the most dangerous element of the Digital Omnibus. By making personal data a relative concept, it risks excluding large parts of worker data from protection by design. This amendment should be rejected outright.

The redefinition of scientific research (Article 4(38) GDPR)

Another concept that the Commission wants to redefine is “scientific research”. The intention is to extend the current understanding of the concept to cover “any research that can support innovation, such as technological development and demonstration”. The amendment also adds that “this does not exclude that the research may also aim to further a commercial interest”.

Although Recitals 29–32 of the Omnibus proposal emphasise that the broadened definition remains subject to GDPR principles and safeguards, they do not specify how those safeguards should operate in power-asymmetric contexts such as employment, nor do they recalibrate the interaction with Article 88 GDPR, which allows member states to adopt rules for data processing in employment. This, as NOYB points out, creates significant legal uncertainty. Equally important, it erases the distinction between knowledge production (research) and power exercise (management), hence enabling workplace data processing to be characterised as research and weakening the essence of purpose limitation in the workplace.

“Residual” sensitive data

The risks raised by the redefinition of “personal data” and “scientific research” are compounded by another Omnibus amendment, which introduces a new legal basis allowing the processing of special categories of personal data for the development and operation of AI systems (Recital 33). In practice, the amendment allows special categories of personal data to remain residually present in training, testing or validation datasets, or to be retained in AI systems or models, even when such data are not necessary for the purpose of the processing.

The amendment also signals that, in order not to disproportionately hinder the development and operation of AI, and taking into account the controller’s capacity to identify and remove special categories of personal data, a derogation under Article 9(2) from the prohibition on processing such data should be allowed.

This is a major shift from the current GDPR logic, where sensitive data is strictly protected from the start. The amendment allows sensitive data to exist inside AI models and systems as long as it is described as “residual”. De facto, this means very sensitive information may be processed without the knowledge or effective control of data subjects.

In addition, the safeguards apply only after sensitive data has already entered the system, and controllers must remove it once identified. This weakens preventive protection—an essential feature of the GDPR—and prioritises system efficiency over workers’ rights. Finally, the notion of “disproportionate effort” is introduced as a ground for not removing data. For data subjects, this means technical inconvenience outweighs their fundamental rights. In practice, the amendment normalises a lower level of protection and the presence of sensitive worker data in AI systems and models used for evaluation, monitoring and decision-making.

Automated decision-making as a contractual necessity (Article 22(1),(2) GDPR)

Under the current interpretation of Article 22, automated decision-making is permitted only where it is necessary to perform a contract and no non-automated alternative is available. An Omnibus amendment proposes to alter this logic and to allow automated decision-making even where a decision could be taken without automated means.

This breaks the link between automation and necessity and allows controllers to use automated decision-making as a matter of operational choice rather than as an indispensable measure. The amendment drops the requirement to assess whether less intrusive alternatives exist. This creates an incentive to expand automation, particularly in contexts such as recruitment, performance evaluation and contract termination. For workers, this may lead to more frequent exposure to automated decisions with limited meaningful human involvement and reduced opportunities to contest outcomes. From a worker-protection perspective, the amendment to Article 22 should not proceed in its current form.

The right to access (Article 15 GDPR)

The Omnibus amendments also introduce limitations on the right of access, allowing controllers to restrict or refuse access requests on grounds such as disproportionate effort or repetitiveness. Although framed as exceptional, these limitations are formulated broadly and risk becoming routine in complex data-processing environments.

The right of access functions as a gateway right, enabling individuals to understand and challenge data processing, including profiling and automated decision-making. As Court of Justice of the European Union case law and European Data Protection Board guidance confirm, it is a precondition for the effective exercise of other data protection rights. In employment contexts, where information asymmetries are structural, access requests are often the only means for workers and their representatives to uncover data-driven practices, making the right of access the functional precondition for almost all meaningful protection against data-driven management.

Regulatory design as a driver of enforcement weakening

The Digital Omnibus is presented as a technical simplification, yet it replaces clear legal thresholds with context-dependent assessments. As NOYB notes, this shifts the burden of interpretation to supervisory authorities, which must assess anonymisation claims, proportionality tests, scientific research quality and AI-specific safeguards without additional resources. The issue is not institutional capacity to engage in complex assessments but the cumulative effect of a regulatory design that multiplies such assessments while lowering the procedural leverage of affected persons.

In employment contexts, where power asymmetries already limit individual complaints, delayed or weakened enforcement undermines the practical effectiveness of rights. The same dynamic affects the interaction with the AI Act, as data protection law remains a key enforcement layer for workplace AI under the AI Act—meaning that formal rights may persist but their enforceability becomes more uncertain and fragmented.

Conclusion

The Digital Omnibus applies a deregulatory logic, weakening protection not through the formal repeal of existing rules but through regulatory design choices that redistribute responsibility, discretion and oversight. By narrowing the scope of personal data, broadening the concept of scientific research and permitting residual processing of sensitive data in AI systems, it erodes enforceability and procedural leverage. Combined with context-dependent assessments that increase interpretative burdens for authorities, these changes disproportionately disadvantage workers—especially in power-asymmetric, data-driven workplaces. In this way, the Omnibus exemplifies deregulation through design: rights remain on paper, but their enforceability becomes more limited, fragmented and uncertain.

AUTHOR PROFILE

Aida Ponce Del Castillo

Aida Ponce Del Castillo is a senior researcher at the European Trade Union Institute.
