Facial recognition is gaining ground in Europe. But the technology, deployed in the name of public security, raises questions about human rights. EURACTIV France reports.
See without being seen. As the terrorist threat in Europe has expanded in both time and space, ensuring safety in public places has become a priority for local and national authorities. Railway stations, airports, sports venues… many places have become potential targets and adopted surveillance technologies. Facial recognition borders on science fiction.
Algorithm identifies Welsh fugitive
Cardiff, Wales, last May. During the Champions League final, a man wanted by the police was stopped at the entrance of the Millennium Stadium. His face, filmed by a camera perched on top of a police car, had been recognised by a facial recognition system produced by Japanese company NEC.
In real time, NEC’s algorithm compared people caught on film to a database of 500,000 “suspects” whose biometric parameters had been recorded while in police custody in the UK.
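NEC does not publish NeoFace's internals, but 1:N watchlist matching of this kind typically works by converting each face into a numeric "embedding" and comparing it against every enrolled template. A minimal sketch of the idea, with invented three-dimensional embeddings and a made-up threshold (nothing here reflects NEC's actual system):

```python
from math import sqrt

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.9):
    """Return the best-matching watchlist ID above the threshold, or None.

    probe: embedding of the face seen on camera.
    watchlist: dict mapping person IDs to enrolled embeddings.
    The threshold is illustrative; real systems tune it to trade
    false alarms against missed matches.
    """
    best_id, best_score = None, threshold
    for person_id, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy data: real embeddings have hundreds of dimensions.
watchlist = {
    "suspect-001": [0.9, 0.1, 0.4],
    "suspect-002": [0.1, 0.8, 0.6],
}
probe = [0.88, 0.12, 0.41]  # close to suspect-001's template
print(match_against_watchlist(probe, watchlist))  # → suspect-001
```

At the scale reported in Cardiff, a real deployment would run such a comparison against all 500,000 templates for every face in every frame, which is why dedicated hardware acceleration matters.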
According to a public tender by the Welsh police, the arrest was made as part of a pilot project started last April and running for two years. The cost of the software: about €200,000. A similar project had been running in London, at the Notting Hill Carnival, but led to zero arrests.
In 2013, the British Security Industry Association (BSIA) counted 5.9 million surveillance cameras in the country, making the UK the most surveilled country in the world, with about one camera for every 11 people.
But facial recognition in the UK is regulated only by a “code of conduct” with no legal force, leaving police forces a free hand in using this technology.
The biometric boom
Facial recognition technology is booming: a 2016 report by Frost & Sullivan valued the sector at $1.48 billion in 2012 and estimated it would be worth $6.15 billion by 2019. NEC estimates that in the coming decade it will reach $20 billion.
At a high-tech fair held in Tokyo from 3 to 6 October 2017, Japanese firm NEC unveiled the latest version of its NeoFace accelerator, claiming that the software could recognise up to 5,000 faces in a crowd.
In a stadium or station, the software could identify people in 99.3% of cases according to its developers.
European countries test biometric cameras
Many EU countries are currently testing this technology. On 24 August, Germany installed biometric cameras in Berlin’s Südkreuz station. About 100 volunteers registered their faces in the software to test the technology. “Our public spaces need to be safe,” Thomas de Maizière told the Berliner Zeitung.
“An arrest like the one made in Cardiff could not happen in France,” said colonel Jean-Marc Jaffré, author of a 2016 report on the limits and opportunities of facial recognition technology. A researcher at the national police academy, he says France has a “stronger regulatory framework”. Police forces need an authorisation by decree from the Council of State, and a favourable opinion from the National Commission on Informatics and Liberty (CNIL), before deploying facial recognition technology.
At present, France has only authorised facial recognition at Orly and Roissy airports and at the Eurostar terminal in Gare du Nord station.
“The devices in use at these sites encroach less on privacy,” said the officer.
“The biometric passport picture is compared to the photo taken at the security gate. Data aren’t stored.”
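The gate check described here is 1:1 verification rather than 1:N search: one live capture is compared against one passport photo, and nothing is written to a database. A rough sketch under the same invented-embedding assumptions as above (the similarity measure and threshold are illustrative, not the airports' actual system):

```python
def verify_traveller(passport_embedding, gate_embedding, threshold=0.9):
    """1:1 check: does the live capture match the passport chip photo?

    Both embeddings exist only in memory for this comparison and are
    discarded afterwards -- nothing is enrolled or stored.
    """
    dot = sum(x * y for x, y in zip(passport_embedding, gate_embedding))
    norm_p = sum(x * x for x in passport_embedding) ** 0.5
    norm_g = sum(x * x for x in gate_embedding) ** 0.5
    return dot / (norm_p * norm_g) >= threshold  # True -> gate opens

# Same person, slightly different capture conditions:
print(verify_traveller([0.9, 0.1, 0.4], [0.88, 0.12, 0.41]))  # → True
```

The privacy distinction Jaffré draws maps directly onto this difference: a 1:1 check needs no watchlist and leaves no trace, whereas 1:N identification requires a standing biometric database.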
Except for the boarding-gate devices, the technology tested in France does not amount to facial recognition. Christian Estrosi, former mayor of Nice, wanted to install facial recognition cameras in the city’s “fan zone” during the Euro 2016 football tournament.
Then-justice minister Jean-Jacques Urvoas opposed this: “This proposal risks undermining civil liberties,” he told MPs in May 2016.
Yvelines is the first French department to test smart cameras, installed in 12 buildings, including six schools and a town hall, starting from January 2018.
But the system does not include facial recognition: “It is about detecting abnormal behaviours,” said Laurent Rochette, the department’s digital adviser.
Unusual cries, smashed glass, street fights – all of these could be detected by the cameras thanks to software developed by startups like Evitech or Foxstream.
“Criteria for suspicion are decided by the department’s authorities according to the issues we find on the ground.” Once tested, the project should be extended to 250 sites in the department, including fire stations, for a total cost of €13 million.
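Evitech's and Foxstream's algorithms are proprietary, but "abnormal behaviour" detection of this kind generally means flagging signal patterns that deviate sharply from a recent baseline. A toy illustration using sound levels (the data, window size and sensitivity factor are all invented):

```python
def detect_anomalies(levels, window=5, factor=3.0):
    """Flag indices where a reading jumps well above the recent average.

    levels: sequence of sound-level readings (arbitrary units).
    window: how many previous readings form the baseline.
    factor: invented sensitivity -- real systems tune this per site.
    """
    alerts = []
    for i in range(window, len(levels)):
        baseline = sum(levels[i - window:i]) / window
        if levels[i] > factor * baseline:
            alerts.append(i)  # e.g. a shout or breaking glass
    return alerts

# A quiet street with one sudden loud event:
quiet_street = [1.0, 1.2, 0.9, 1.1, 1.0, 1.1, 9.5, 1.0]
print(detect_anomalies(quiet_street))  # → [6]
```

The "criteria for suspicion" Rochette describes would correspond, in a sketch like this, to which signals are monitored and how the thresholds are tuned for each site.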
“The pilot phase will reveal the soundness of these devices. Are they able to detect all suspicious situations?” asked Rochette. The CNIL has not yet given its green light, but the project is already being copied elsewhere: similar devices will be deployed in Toulouse, in a pilot project led by the local police.
A lot of data raise a lot of questions
A legislative proposal tabled in the French parliament in July 2016 would authorise facial recognition as part of the fight against terrorism.
Only people listed as potential terrorists by France or other EU member states would be targeted, and the algorithm would check their biometric data against the police database of wanted people.
The database includes data on 60 million French citizens. The Constitutional Council declared in 2012 that the breadth of the data collected infringed the right to private life, enshrined in the French constitution.
The National Digital Council (CNNum) also alerted the government last November, demanding that this database be suspended. Its statement claims that such a database could “allow systematic identification of citizens through facial recognition […] for policing aims” and that, as such, it “is particularly prone to serious misuse.”