At the entrance to a supermarket, in the crowd at a festival: millions of Britons now have their faces scanned in real time by facial recognition technology, in the only European country to deploy it on a large scale.
At London's Notting Hill Carnival, where two million people are expected on Sunday and Monday to celebrate Afro-Caribbean culture, cameras using the technology were installed at the entrances and exits of the parade.
The objective, according to the police: to "identify and intercept" people live, by scanning faces and comparing them against the thousands of suspects in its database.
"Real-time facial recognition is an effective tool (…) that has enabled more than 1,000 arrests since early 2024," said Mark Rowley, head of London's police, who plans to "more than double its use" in the future.
The use of these technologies has already risen sharply over the past three years, from about ten operations between 2016 and 2019 to around a hundred since the start of 2025.
In total, the faces of 4.7 million people were scanned in the United Kingdom in 2024, according to the NGO Liberty.
The cameras are mounted on the roof of a van staffed by police officers; when a suspect passes nearby, the AI-based system triggers an alert allowing officers to arrest them immediately.
Its "large-scale" use in the British capital, during the coronation of Charles III in 2023, or in Cardiff this year ahead of the Oasis concerts and the Six Nations tournament matches, is turning the United Kingdom into "a country of suspects", worries the organization Big Brother Watch.
"There is no legislative basis (…) so the police have free rein to write their own rules," Rebecca Vincent, its interim director, told AFP.
The private use of these systems by supermarkets and clothing stores to combat sharply rising shoplifting concerns the organization particularly, with "very little information" available about their data collection.
Most of them use Facewatch, a service provider that maintains a list of suspected shoplifters in the stores it monitors and raises an alert as soon as one of them enters one of these businesses.
"They should clearly inform their customers," said Abigail Bevon, a 26-year-old forensic practitioner interviewed outside a London chain store using Facewatch, who described herself as "very surprised". While she understands the technology's usefulness for the police, she finds its use by a retailer "invasive".
Prohibited in the EU
In the EU, the legislation regulating artificial intelligence since February has banned the use of real-time facial recognition technologies, with exceptions notably for counter-terrorism.
Apart from a few cases in the United States, "there is nothing comparable in European countries or in other democracies; the use of this technology (in the United Kingdom) is closer to that of authoritarian states like China," said Rebecca Vincent.
"This changes the way people live in the city by removing the possibility of anonymity" and may discourage participation in demonstrations in particular, warns Daragh Murray, a lecturer at Queen Mary University of London.
Home Secretary Yvette Cooper recently promised a "legal framework" to delimit its use, emphasizing the fight against "serious crime".
Without waiting for that, the Home Office has just extended the use of the technology to seven new regions of the United Kingdom.
After the vans, permanent cameras are also due to be installed for the first time in September in Croydon, a district of south London considered troubled.
The police insist they have "robust safeguards" in place, promising to delete the biometric data of people who have done nothing wrong.
But the British regulator responsible for human rights said on Wednesday that the London police's use of this technology was "unlawful" because it was "incompatible" with respect for those rights.
Eleven organizations, including Human Rights Watch, had urged the police not to use it during the Notting Hill Carnival, accusing it in a letter of "unfairly targeting" this community and highlighting the racial biases of AI.
They cite the case of Shaun Thompson, a black man arrested after being wrongly identified by one of these cameras, who has brought legal action against the London police.