For a law of ‘algorithmic justice at work’

Workers must be protected from adverse decisions where responsibility is displaced to apparently anonymous algorithms.

José Varela

More and more companies are delegating their responsibilities as employers to algorithms, separating the human factor from labour management and replacing it with computer programmes.

The recruitment of staff, the organisation of working time, professional promotion and the allocation of bonuses—even the application of a disciplinary regime—are all being handed over to algorithms. This trend poses a severe risk to the rights and freedoms of workers.

Digital platforms already manifest this threat: their algorithms control and monitor workers, evaluate their performance, determine their remuneration and even execute layoffs—all under abstruse, capricious and opaque criteria. Many attribute to these computer programmes characteristics that science rejects, such as infallibility, neutrality and surgical precision.


Not infallible

Algorithms are not infallible. Their decisions can be as biased as those of any human, so their outputs cannot be considered superior or more objective. This is confirmed by studies from Princeton University, which rate algorithms as ‘very unreliable’ when applied to work environments. A resolution of the Council of Europe warns against ‘triggering additional interferences with the exercise of human rights in multiple ways’. The World Economic Forum even speaks of ‘artificial stupidity’.

Algorithms are not empathetic: they do not understand concepts such as humanity or honesty. They have no scale of values, nor do they distinguish cultural or social differences. They neither forgive nor forget, and they are unaware of their own fallibility. They have no common sense.

Even in certain perceptual skills, they show only the capacity of a baby. And they do not correct themselves out of considerations of understanding or justice, balance or diversity, ethics or morality. Today, and for a long time to come, human comprehension is essential for making decisions that are fair and equitable.

Nevertheless, companies continue to implement these tools, so it is necessary to regulate their use and application in labour relations, closely monitoring compliance with human rights and legal obligations to workers. We are not talking about applying a ‘precautionary principle’ to a hypothetical risk: it is much more than a preventive measure. We must prevent repetition of discrimination based on race, gender and ideology—types of discrimination that have already been verified in the operation of many algorithms.

Algorithmic justice

The Spanish trade union UGT is thus advocating for ‘algorithmic justice at work’—a law that regulates the safe use of these computer tools in Europe. Key demands are, in short:

Only through legal initiatives such as these can we ensure safe and fair working environments for 21st-century workers. Digital progress is necessary for our economies and for the competitiveness of the European Union and its member states. But this progress cannot be at the expense of workers’ rights nor bypass union responsibilities.


We want progress and we want digitalisation, but not at any price—always with social balance, under fair criteria and with real labour rights.