Artificial intelligence has taken a firm foothold in workforce management.

Today, many decisions that used to rest with human resources managers are supported by algorithms that assign shifts, assess performance, recommend promotions, or even suggest dismissals. The use of these systems promises efficiency, but it also carries an obvious risk: the rules governing those decisions may be opaque and escape the scrutiny of those affected by them — or of those meant to oversee them.

In Spain, the legislature took a significant step to address this challenge by amending the Workers’ Statute through Royal Decree-Law 9/2021 of 11 May, known as the Rider Law. This reform introduced a specific obligation of algorithmic transparency, set out in Article 64.4(d) of the Statute, which requires employers to inform workers’ representatives about the “parameters, rules and instructions” on which algorithms or artificial intelligence systems are based when they influence working conditions, access to or maintenance of employment, and the creation of profiles.

This provision emerged from the debate around digital platform work, where business decisions were being made through automated systems that were impossible to audit. Yet its reach extends far beyond that sector. Any company using digital tools or algorithmic systems that affect employment-related decisions is bound by this duty. That includes everything from workforce scheduling software and productivity tracking applications to recruitment algorithms and performance scoring systems.

The law does not restrict the use of artificial intelligence, nor does it require union approval to implement it. What it does impose is a duty of clear and understandable disclosure. We now have a Guidance on Algorithmic Information in the Workplace, published by the Spanish Ministry of Labour and Social Economy in 2022, which provides direction for both employers and workers’ representatives on how to comply. The document explains that the information should describe the system’s purpose, the types of data used, the rules or criteria applied, the results generated, and the human oversight mechanisms in place. It is not about revealing source code or mathematical formulas, but about explaining clearly how the system influences decisions affecting employees.

The purpose of this requirement is to ensure that workers’ representatives can properly exercise their oversight and safeguard fundamental rights. If an algorithm rewards constant availability or penalises absences without distinguishing their cause, it may create indirect discrimination based on health, age, or family responsibilities. Access to adequate information allows representatives to identify such biases and push for corrective measures.

The obligation, however, is not absolute. It must be balanced against the protection of trade secrets and personal data legislation. Companies may withhold information that would disclose strategic or sensitive technical details, provided they offer a functional explanation sufficient to understand the algorithm’s impact. Nor should they disclose personal data or individual results. Transparency concerns the mechanism itself, not the people subjected to it.

This Spanish regulation broadly anticipates the direction set by the new European Artificial Intelligence Regulation (AI Act), adopted in 2024, which classifies AI systems used in employment as high-risk and requires them to be documented, supervised and explainable. The Workers’ Statute thereby establishes a collective transparency channel that complements the European framework and will likely gain importance as companies further integrate AI into their management processes.

In practice, implementation is still at an early stage. Many works councils remain unaware of this right, and most companies have yet to design procedures to comply effectively. The law does not specify the format or frequency of disclosures, nor how to proceed when the algorithm is supplied by an external provider. There is also a lack of consolidated case law. Companies that are taking compliance seriously tend to start by compiling an inventory of automated systems affecting staff, conducting impact assessments, and preparing annual reports or summary sheets explaining each system’s general logic.

Some organisations have gone a step further by incorporating algorithmic transparency clauses into collective agreements or internal policies, defining what information will be shared, when, and through which channels. This proactive approach reduces the risk of conflict and demonstrates a genuine commitment to the ethical and responsible use of technology.

More than a technical requirement, algorithmic transparency represents a cultural shift. It compels organisations to look inside the “black box” of their digital tools and recognise that algorithms are not neutral. They reflect human decisions, business priorities, and sometimes hidden biases. Managing them responsibly means acknowledging that human dimension, ensuring oversight, and maintaining dialogue with those who represent the workforce.

With the introduction of Article 64.4(d) of the Workers’ Statute, Spain has added a new dimension to the balance between technology and labour rights.

The rule is not meant to curb innovation but to ensure that the classic principles of labour law — transparency, equality and social control over managerial decisions — remain in force in an increasingly automated environment. In the years ahead, as the European AI Act takes full effect and digitalisation accelerates, algorithmic transparency will become an essential part of labour compliance and of the broader equilibrium between technological efficiency and fundamental rights.

Companies that view this obligation as an opportunity — to audit their processes, explain their decisions, and engage with their people — will be better prepared for the new digital workplace. And perhaps that is the true meaning of this rule: a reminder that behind every algorithm, there are still human beings.

For more information or advice, you can contact us at info@gimenezsalinas.es.
