LONDON (Thomson Reuters Foundation) - Almost two in three Britons disagree with police using artificial intelligence such as facial recognition technology to identify suspects, according to a survey released on Friday.
Public bodies and employers could face a popular backlash against “tech creep” in sectors from recruitment to policing, found the poll, commissioned by Britain’s Royal Society for the encouragement of Arts, Manufactures and Commerce (RSA).
“An increasing amount of decision making – in our public services, the job market and healthcare – is taking place via ever more opaque processes,” said Asheem Singh, the RSA’s acting head of tech and society.
“We need an open conversation about AI and other forms of decision making, driven by the principles of transparency and accountability.”
There has been growing debate over the use of facial recognition technology by some police forces, with supporters saying it allows smarter policing, while critics say it is intrusive and often inaccurate.
The technology scans passers-by in public spaces using surveillance cameras equipped with facial-recognition software, then uses artificial intelligence to compare the captured faces against watch lists of people sought by police.
If a suspect is identified, they can be stopped on the spot.
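The comparison step described above can be illustrated with a toy sketch. This is a minimal, hypothetical example, not any police force's actual system: it assumes faces have already been converted into numeric embedding vectors by some recognition model, and simply matches a probe embedding against a watch list using cosine similarity with a confidence threshold.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_watchlist(probe, watchlist, threshold=0.8):
    """Return (name, score) of the best watch-list match above the
    threshold, or None if no entry is similar enough.

    `watchlist` maps a name to its stored embedding vector.
    The threshold is illustrative; real systems tune it to trade
    false matches against missed matches.
    """
    best = None
    for name, embedding in watchlist.items():
        score = cosine_similarity(probe, embedding)
        if score >= threshold and (best is None or score > best[1]):
            best = (name, score)
    return best
```

The threshold choice matters: set too low, the system flags innocent passers-by (the inaccuracy critics cite); set too high, it misses genuine matches.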
Pollsters YouGov surveyed more than 2,000 British adults for the RSA, which gathered a “citizen jury” of about 25 people to debate computer decision-making.
The panellists looked at its use by police for automated facial recognition and to help decide when to prosecute people who have been arrested.
They concluded that machines could be more objective than people, but warned that the technology could reflect human biases and called for human oversight of, and accountability for, such decisions.
The RSA called for greater public engagement by tech firms to educate the public about how technology is being developed and to help shape how artificial intelligence is used.
California in September banned police from using body cameras with facial recognition software, while some privacy campaigners have developed make-up or clothing designed to prevent the technology from identifying them.
Reporting by Sonia Elks @soniaelks; Editing by Claire Cozens. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers humanitarian news, women's and LGBT+ rights, human trafficking, property rights, and climate change. Visit http://news.trust.org