Police face calls to end use of facial recognition software

Analysts find system often wrongly identifies people and could breach human rights law



Liberty, the civil rights campaign group, has previously called facial recognition “a dangerously intrusive and discriminatory technology that destroys our privacy rights and forces people to change their behaviour”.

Police are facing calls to halt the use of facial recognition software to search for suspected criminals in public after independent analysis found matches were only correct in a fifth of cases and the system was likely to break human rights laws.

Its use by South Wales police is currently under judicial review, while the information commissioner, Elizabeth Denham, has criticised “a lack of transparency about its use” and Tony Porter, the surveillance camera commissioner, last year intervened to stop Greater Manchester police using facial recognition at the Trafford shopping centre.

The court action claims that South Wales police violated privacy and data protection rights by using facial recognition technology on members of the public.

David Davis MP, a former shadow home secretary, said the research by Prof Peter Fussey and Dr Daragh Murray at the University of Essex’s Human Rights Centre showed the technology “could lead to miscarriages of justice and wrongful arrests” and posed “massive issues for democracy”. “All experiments like this should now be suspended until we have a proper chance to debate this and establish some laws and regulations,” he said. “Remember what these rights are: freedom of association and freedom to protest; rights which we have assumed for centuries which shouldn’t be intruded upon without a good reason.”

The research found that police were too hasty to stop people before matches could be properly checked, which led to mistakes; watchlists were sometimes out of date and included people wanted by the courts as well as those considered “at risk or vulnerable”; and officers viewed the technology as a way of detecting and deterring crime, which the report argued could have been achieved without biometric technology.

Similar facial scanning software is being used in shopping centres, where it is embedded in advertising hoardings to track shoppers’ age, gender and even mood, and has been deployed by other police forces in Manchester, Leicester and South Wales – where it will be used this weekend at the Swansea airshow.

The Essex researchers also raised concern about potential bias, citing US research in 2018 into facial recognition software provided by IBM, Microsoft and Face++, a China-based company, which found the programmes were most likely to wrongly identify dark-skinned women and most likely to correctly identify light-skinned men.

The technology is especially inaccurate and prone to bias when used against people of colour: a test of Amazon’s facial recognition software found that it falsely identified 28 members of the US Congress as known criminals, with members of the Congressional Black Caucus disproportionately represented.

The Essex researchers said it was “highly possible” Scotland Yard’s use of the technology would be ruled unlawful if challenged in court. “While we focused on the police, by far the greater use is in the private sphere,” said Fussey. “There’s a lack of any national leadership on this issue of facial recognition.”

Automated facial recognition (AFR) is technology that can identify people by analysing and comparing facial features to those held in a database.
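
In outline, such a system reduces each face to a fixed-length numerical template (an “embedding”) and compares it against the templates of people on a watchlist, flagging any comparison that scores above a similarity threshold. The sketch below illustrates that matching step only; it is a minimal, hypothetical example using NumPy with random stand-in vectors, and the names (`watchlist`, `match_face`) and the 0.9 threshold are illustrative assumptions, not details of the Neoface system.

```python
import numpy as np

# Hypothetical sketch of the matching step in automated facial
# recognition. Each face is represented as a fixed-length embedding
# vector; a probe face is compared against a watchlist database.
# The vectors here are random stand-ins; a real system would derive
# them from face images with a trained neural network.

rng = np.random.default_rng(seed=42)

def normalise(v):
    """Scale a vector to unit length so a dot product gives cosine similarity."""
    return v / np.linalg.norm(v)

# Watchlist database: name -> normalised face template.
watchlist = {
    "person_a": normalise(rng.standard_normal(128)),
    "person_b": normalise(rng.standard_normal(128)),
}

def match_face(probe, database, threshold=0.9):
    """Return the best watchlist match scoring above `threshold`, else None.

    The threshold trades false positives (wrongly flagging passers-by)
    against false negatives (missing people who are on the list).
    """
    probe = normalise(probe)
    best_name, best_score = None, -1.0
    for name, template in database.items():
        score = float(probe @ template)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None

# A slightly perturbed copy of person_a's template should match;
# an unrelated random face should not.
probe = watchlist["person_a"] + 0.02 * rng.standard_normal(128)
print(match_face(probe, watchlist))                      # likely a match
print(match_face(rng.standard_normal(128), watchlist))   # likely None
```

Where that threshold is set matters: a lower threshold flags more “matches” but more of them are wrong, a trade-off relevant to the Essex finding that only a fifth of the matches examined were correct.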

The Neoface system used by the Met and South Wales police is supplied by Japanese company NEC, which markets the same technology to retailers and casinos to spot regular customers, and to stadium and concert operators to scan crowds for “potential troublemakers”.

