Live facial recognition open to ‘reckless’ use, ICO warns

The Information Commissioner has published a blog post expressing concern about the potential for misuse of live facial recognition (LFR) in public areas and warning that data protection needs to be considered as its use increases.

LFR tools such as Amazon’s Rekognition allow law enforcement and other organizations to record the faces of passersby in public areas and automatically match them against a database of faces of people of interest, such as missing persons or fugitives.
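To illustrate the matching step in concrete terms, the following is a minimal Python sketch of how such a lookup could be wired up against Rekognition’s face-search API via boto3; the collection name persons_of_interest, the frame file, and the similarity threshold are illustrative assumptions rather than details of any real deployment.

import boto3

rekognition = boto3.client("rekognition")

# Read a single captured frame from disk; a live deployment would
# stream frames from a camera instead.
with open("captured_frame.jpg", "rb") as f:
    frame_bytes = f.read()

# Search a pre-indexed watchlist collection for faces similar to the
# largest face in the frame, keeping only reasonably strong matches.
response = rekognition.search_faces_by_image(
    CollectionId="persons_of_interest",  # hypothetical collection name
    Image={"Bytes": frame_bytes},
    FaceMatchThreshold=90.0,             # discard weak matches
    MaxFaces=5,
)

for match in response["FaceMatches"]:
    face = match["Face"]
    print(f"Possible match: {face.get('ExternalImageId')} "
          f"(similarity {match['Similarity']:.1f}%)")

Note that search_faces_by_image only considers the largest face detected in the submitted image, so a live pipeline would typically detect and crop each face in a frame before running the search.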

British Information Commissioner Elizabeth Denham has warned of the potential misuse of LFR in public places, noting that the technology could be used “inappropriately, excessively or even recklessly”.

In a blog post accompanying her new Commissioner’s Opinion, she set out her concerns about the use of LFR to collect biometric data automatically in public spaces. ICO investigations have uncovered numerous unwarranted uses of LFR, including the generation of biometric profiles to target people with personalized ads, and none of the deployments investigated fully complied with data protection law.

“I am very concerned about the possibility of LFR being used inappropriately, excessively or even recklessly. When sensitive personal data is collected on a large scale without people’s knowledge, choice or control, the consequences can be significant,” Denham wrote.

“We need to be able to take our kids to a recreational complex, visit a shopping center or explore a city to see the sights without having our biometrics collected and analyzed every step of the way.” She suggested that CCTV cameras could in future be combined with LFR and even social media data:

“LFR is supercharged CCTV.”

The Commissioner’s Opinion [PDF], which is rooted in law and based on ICO investigations, explains how data protection and privacy should be at the heart of any decision to deploy LFR.

It says organizations must demonstrate high standards of governance and accountability from the outset, including being able to justify that their use of LFR is “reasonable, necessary and proportionate in any specific context in which it is deployed” and that less intrusive approaches would not suffice in that context.

It also says organizations should assess the risk of using this intrusive technology, including taking into account issues of accuracy and bias. It does not cover the use of LFR by law enforcement agencies.

Speaking to the PA news agency, Denham added: “We are now at a crossroads. We in the UK and other countries around the world are seeing the deployment of LFR, and I think it is still at an early enough stage that it is not too late to put the genie back in the bottle.”

Last year, companies including IBM halted or paused work on LFR, while the EU proposed a five-year moratorium on its use in public spaces, partly in response to outcry over racially charged police brutality. The technology has repeatedly been shown to fail people with darker skin: a 2018 ACLU study found that Amazon’s Rekognition falsely matched 28 members of the U.S. Congress against a database of criminal mugshots, with Black and Latino lawmakers disproportionately affected.

In the United Kingdom, a civil rights activist took South Wales Police to court over its use of automated facial recognition (AFR) in public areas around Cardiff. Judges ruled that the force’s use of the technology in that context was unlawful; although the ruling did not stop South Wales Police from using LFR altogether, it does mean that changes must be made to the way it is deployed.