Civil liberties groups have criticised a government bill that will allow the police to link facial recognition technology to the driving licence database – fearing it could “make us feel less human”.
The Crime and Policing Bill has passed the committee stage in the House of Commons and could come into effect later this year.
Elena Auer from Amnesty Liverpool and the Collective for Human Rights in the UK said that the increasing use of this technology by the police and private sector “makes us feel less human”.
She said the technology carries a big potential for misuse and risks legitimising data mining and sharing, and that, combined with predictive and racialised policing, it will further marginalise the marginalised instead of dealing with the core reasons behind many crimes.
Human rights group Liberty has also spoken out against the proposed bill, stating there is “always the risk of misidentification”.
Charlie Whelton, campaigns officer at Liberty, said: “This will enable the sharing of biometric information from the driving licence database with the police. Millions of people in the UK who have done nothing wrong are essentially being put in a police line-up without their consent.
“It takes a very powerful tool that should be used in extreme circumstances in a targeted way and turns it into something where it’s being used on everybody.”
Whelton said there is no framework or accountability in place, and uncertainty over how police forces are using, or will use, the technology.
He added that the Home Office should introduce primary legislation to ensure that facial recognition is being used responsibly.
He pointed to the EU's AI Act, which includes safeguards on the use of facial recognition: police forces must obtain permission to use it, in the same way they would need a warrant to search a property.
Jodie Bradshaw from Stopwatch, an organisation that monitors the police, said: “We will see the disproportional criminalisation of racialised people as a result of the use of this technology.”
She added that the use of this technology will cause further mistrust in the police.
Bradshaw acknowledged that the technology's accuracy has improved, but said many remain sceptical.
She also raised economic concerns about these measures. As facial recognition becomes more common and people feel targeted, Bradshaw said, they will be less likely to visit the high street and more likely to switch to online shopping.
Bradshaw believes that the introduction of this database and technology will cost the taxpayer a lot of money – she suggested the £55 million that the government is spending on facial recognition technology could be better spent elsewhere.
She added that the use of this technology by private companies is dangerous, saying that it undermines the relationship between private companies and the public and that people have the right to go about their daily life without feeling like they are under surveillance.
She said: “People can’t do everyday activities without being filled with a certain dread they are being watched by private actors.”
The Metropolitan Police say they use facial recognition “to keep the people we serve safe, prevent and detect crime and find wanted criminals”.
Police bosses have urged forces to fully exploit the technology – which is also increasingly being used in the private sector.
Last year a 19-year-old woman was searched and kicked out of a Home Bargains store, having been wrongly accused of shoplifting due to an error by the technology.
Shawn Thompson, an anti-knife crime campaigner, was returning home from volunteering with the Street Fathers organisation when he was held by the Metropolitan Police and threatened with arrest until he showed his passport to confirm his identity. He has applied for a judicial review of the case.
In 2020 South Wales Police lost a case at the Court of Appeal, which ruled that the force's use of facial recognition breached privacy rights and broke equality laws.
Asda has announced it will be trialling facial recognition in five of its stores in Manchester as a response to an “epidemic of retail crime”.
Spar has also confirmed it uses the technology in some of its stores.
The Met police said: “Facial Recognition technology can be used in a number of ways by the Met, including to prevent and detect crime, find wanted criminals, safeguard vulnerable people, and to protect people from harm – all to keep the people we serve safe.”