The parliament has stood firm against mass surveillance but has missed the opportunity to enhance provisions.
Last Wednesday, the European Parliament voted in plenary in favour of strong protections of fundamental rights in its official position on the emerging EU Artificial Intelligence Act. These include sustaining advances on human-rights impact assessments and transparency requirements made in committee.
The vote also upheld red lines against unacceptably harmful uses of AI, including decisively protecting people against live facial recognition and other biometric surveillance in public spaces, as well as emotion recognition in key sectors, biometric categorisation, predictive policing and social scoring.
This is a critical time for AI regulation globally, and the parliament’s final position is in many ways a win for fundamental rights. Work on the AI Act started in 2020 and European Digital Rights (EDRi), its network and partners have been urging EU lawmakers to prioritise fundamental rights and put people before profits from the beginning.
In a historic step, MEPs listened to the evidence, ensuring that all live, and most retrospective, uses of remote-biometric-identification (RBI) systems in public spaces are prohibited in their text. This foregrounds the preservation of free expression, assembly and non-discrimination in public spaces going into the ‘trilogue’ negotiations with the European Commission and the Council of the EU.
The parliament also voted to ban biometric categorisation on the basis of sensitive characteristics, such as perceived sexuality, gender, race or ethnicity, and emotion recognition in education settings, workplaces, by police and at borders. These prohibitions are just as important for preventing discrimination and protecting human rights as bans on RBI.
This was the culmination of years of work by a diverse coalition of 80 civil-society groups in the Reclaim Your Face campaign. They will continue to fight for full protection from all retrospective RBI, all emotion recognition and automated behavioural detection in public spaces.
Despite an aggressive last-minute push from the centre-right European People’s Party to overturn the committee agreement on biometric surveillance, MEPs showed that they had heard the voices of more than 250,000 people across Europe who want to keep public spaces free of facial recognition and other biometric mass-surveillance systems.
The EU’s border panopticon
The parliament however failed to introduce new provisions which would protect the rights of migrants from ever-increasing regimes of discriminatory surveillance. This is despite AI systems increasingly being developed to track, control and monitor migrants in new and harmful ways.
The list of prohibited practices adopted does not include the use of AI to facilitate illegal pushbacks or to profile people on the move in a discriminatory manner. Without these prohibitions, the parliament reinforces the panopticon at the EU’s borders.
At committee level, the parliament took significant steps to empower individuals affected by AI systems, including a requirement to explain AI-based decisions or outcomes to those affected, and a mechanism to complain when AI systems violate their rights.
In last week’s vote, however, the parliament declined to extend these rights and mechanisms. It voted against a right to an explanation covering all AI systems (not just ‘high-risk’ uses) and against a right for public-interest organisations to bring complaints when AI systems do not comply with the regulation.
The three-way negotiations on the final text are now under way. They are expected to be completed by the end of the year, with the aim of passing the law ahead of the European Parliament elections in June 2024. The broad coalition of concerned civil-society organisations will continue to press for human rights during these deliberations.