Rewiring the thin blue line: can technology help keep us safe?
Police officers, academics and commissioners are at odds on live facial recognition’s potential to help tackle crime
Officers from across Police Scotland gather in quiet anticipation, waiting to hear how technology might finally make their work sharper, safer, and smarter. “It feels like we’ve always been on the back foot,” one female officer tells her colleagues. “But we’re finally getting there.”
Lindsey Chiswick takes the stage at the annual Scottish police superintendents’ conference, where everyone sits with pen and notebook at the ready. Chiswick is the National Police Chiefs’ Council lead for facial recognition – controversial technology that has been trialled south of the border since 2016.
It’s a normal day in Denmark Hill, “not the best area in London”, Chiswick says. A man is hand in hand with a six-year-old girl and “looks no different from any other father, perhaps grandfather”, she continues. He walks past a live facial recognition (LFR) camera, triggering an alert. Police officers decide it’s a match and go to speak to him. The man, who was on a watchlist, had previous convictions for indecent assault and gross indecency and was banned from being with anyone under the age of 14. He had been in a relationship with the girl’s mother for over a year and was arrested before later being jailed for two years.
“That’s an example of what live facial recognition can do, because without that, there’s no way on earth officers would have remembered this guy’s face along with the other potentially 10,000 faces on our watch list that day,” she adds.
Live facial recognition technology is not currently in use north of the border. Police Scotland paused its deployment plans in 2020 following public concern and strong criticism from the Scottish Parliament’s Justice Sub-Committee on Policing. MSPs said deploying the technology would be a “radical departure” from Scotland’s policy of policing by consent.
The system uses cameras mounted on top of vans to scan faces in real time, comparing them against a watchlist. If a potential match is flagged, an officer reviews it before engaging with the individual. If there’s no match, the biometric data is immediately deleted.
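The match-review-delete flow described above can be sketched in a few lines of Python. This is a purely illustrative model, not any vendor’s actual system: real deployments use a neural network to convert each face into an embedding vector, and the threshold, watchlist entries and function names here are all invented for the sketch.

```python
import math

# Assumed value: the similarity above which an alert is raised for review
MATCH_THRESHOLD = 0.8

def cosine_similarity(a, b):
    """Standard cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def process_face(embedding, watchlist):
    """Compare a scanned face against the watchlist.

    Returns the best-matching entry if it crosses the threshold
    (for a human officer to review before anyone is approached),
    otherwise None - the point at which the biometric data
    would be deleted immediately.
    """
    best_name, best_score = None, 0.0
    for name, reference in watchlist.items():
        score = cosine_similarity(embedding, reference)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= MATCH_THRESHOLD:
        return best_name, best_score  # alert: officer reviews the match
    return None  # no match: delete the scan

watchlist = {"suspect_a": [0.9, 0.1, 0.4]}
print(process_face([0.88, 0.12, 0.41], watchlist))  # near-identical face: alert raised
print(process_face([0.1, 0.9, 0.0], watchlist))     # -> None (scan deleted)
```

The key design point the article’s description implies is that the system alone never decides anything: a score above the threshold only queues the match for a human review step.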

However, the Metropolitan Police’s trials of the technology have been far from smooth. As Brian Plastow, Scotland’s biometrics commissioner, puts it to Holyrood, “they let the technology out of the box before it was ready”. In 2019, research commissioned by Scotland Yard found 81 per cent of the people flagged by the system were innocent and not on a wanted watchlist. Six years later, the system still seems to be flawed. Privacy campaign group Big Brother Watch is currently supporting a legal challenge by Londoner Shaun Thompson, who was misidentified by the system. On his way home from a volunteer shift with anti-knife charity Street Fathers, he was wrongly flagged as a suspect and detained by police for around 30 minutes.
“It [LFR] kind of reverses that presumption of innocence. You are basically guilty until you can prove you’re innocent”, says Madeleine Stone, senior advocacy officer at Big Brother Watch.
Earlier this month, growing public mistrust prompted the UK Information Commissioner’s Office (ICO) to step up scrutiny of artificial intelligence (AI) and biometric technologies. In response to research showing that over half of the public fear LFR could infringe on their privacy, the ICO announced new measures to closely monitor police use of the tech.
The technology also raises serious concerns about bias, with several senior officers over the years acknowledging that policing has been institutionally racist. “If the AI already believes for whatever reason [it has to] stop young males, we will stop more young males. We will therefore discover more knives with young males that will be fed into the AI in the next round. It will confirm what it already thinks, and so it becomes self-perpetuating, self-poisoning… you get vicious circles”, Burkhard Schafer, professor of computational legal theory at Edinburgh University, explains.
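The vicious circle Schafer describes can be made concrete with a toy model. In this sketch, two groups carry knives at exactly the same underlying rate, but the system starts with a slight bias toward stopping group A and is “retrained” each round on the finds its own stops produce. Every number is an illustrative assumption, not real policing data.

```python
# Toy model of a self-confirming feedback loop: equal behaviour,
# unequal attention. All figures are invented for illustration.

TRUE_RATE = 0.05                      # both groups carry knives at 5% per stop
stop_share = {"A": 0.55, "B": 0.45}   # initial slight bias toward group A
finds = {"A": 0.0, "B": 0.0}

for _ in range(20):
    for group in ("A", "B"):
        stops = 1000 * stop_share[group]       # stops allocated by current bias
        finds[group] += stops * TRUE_RATE      # expected discoveries fed back in
    total = finds["A"] + finds["B"]
    # "retraining": next round's stops follow the recorded finds
    stop_share = {g: finds[g] / total for g in finds}

# Per-stop hit rates are identical, yet group A accumulates more recorded
# finds simply because it was stopped more - and the skew never corrects.
print(stop_share)               # the initial 55/45 bias persists unchanged
print(finds["A"] > finds["B"])  # -> True: the data "confirms" the bias
```

Even in this simplest deterministic version, the loop never self-corrects: the recorded evidence ends up reflecting where the system looked, not how the groups actually behaved, which is exactly the self-perpetuating effect Schafer warns about.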
Any notion that facial recognition would revolutionise policing is probably a fallacy
But LFR, like much of the AI now creeping into society, has thrown policymakers into a perpetual game of catch-up. And while tech has transformed how police forces operate, it has also reshaped crime itself, enabling more targeted, personalised, and often harder-to-trace attacks. As a result, policing finds itself in a problematic middle ground: under pressure to modernise, but often without the legal or ethical guardrails to match the pace of innovation.
Steve Johnson, assistant chief constable for major crime, public protection and local crime, tells Holyrood: “They [criminals] don’t go to public consultation as to whether they want to use this [technology] to cause people harm, they just do it. They do it very quickly, they’re very tech-savvy and they have access to the best kit that’s available on the market and we’re constantly trying to find ways of catching up with them.”
He adds: “I would love to get to the point where we have, as an industry, our own research and development that’s trying to understand how people might use it and then prevent them from using it that way in the future.”
He admits that LFR is not “infallible” but insists that the technology can be a matter of life and death, citing missing persons cases as a clear example. While Scotland does not currently deploy LFR, it does make use of retrospective facial recognition technology. In other words, officers can input a photograph into a system to analyse existing footage – such as CCTV – to trace a missing person’s movements.
“This is the bizarre thing,” says Johnson. “If an elderly relative fell in the Clyde and died and there was a fatal accident inquiry, we would be able to then go back to the CCTV, put the picture of your relative in and we would be able to show the route that person took before they fell in the river. That doesn’t seem right. You would expect me to try and use that technology to stop them going into the river.”
Plastow also supports the technology but is sceptical of the hype. “Any notion that facial recognition would revolutionise policing is probably a fallacy,” he says. He acknowledges its potential to resolve crime and tackle violence against women, girls and children but insists it will provide “lesser return on investment” than other system-wide technologies, citing body-worn cameras and the government’s landmark DESC (digital evidence sharing capability).
He says it is a “national embarrassment” that it has taken so long for body-worn cameras to be deployed in Scotland. Indeed, the technology has been available in England for decades, and its delay has caused significant rows in the chamber, with shadow justice secretary Liam Kerr describing it as “a damning indictment of the SNP’s neglect of our police”.
Technology like body-worn video in that sense is a bit of a godsend to police officers.
Research shows the technology has helped de-escalate dangerous interactions between police and the public and has proven especially useful in sensitive cases like domestic abuse.
“It can only provide the whole criminal justice process with a much richer picture,” Johnson says. “And some of the harsh realities of public protection very often is [that] many of the victims or survivors of domestic abuse are really challenged about how they present their evidence and continue their family life.
“And that body-worn video can be the best evidence we’re ever going to get of what somebody told us… if you put that in front of a jury or in front of a court it’s quite compelling as evidence.
“It sits on its own. It’s not an officer saying, ‘this is what I saw’ and open to interpretation or values... So, technology like body-worn video in that sense is a bit of a godsend to police officers.”
On the other hand, with DESC, Scotland has been ahead of the curve. The system allows police officers, prosecutors, defence lawyers, court staff and judges to access a unified system to collect, store and manage evidence digitally. One of its goals is to reduce the time officers spend in court – time that, in many cases, is wasted. According to Chief Constable Jo Farrell, around 500 officers are cited to appear in Scottish courts every day, often pulling them away from frontline duties to attend proceedings and “on many occasions not giving evidence.”
Last year, ahead of the phased rollout of the system, justice secretary Angela Constance told Holyrood that the “world-first” initiative was a “crucial reform” that would enable a more “person-centred” approach to policing. The pilot programme in Dundee yielded promising results: the system handled around 19,500 pieces of evidence and freed up approximately 550 hours of police officers’ time.
However, when sensitive data is involved, scrutiny inevitably follows, and DESC has faced its share of challenges. Freedom of Information requests from ComputerWeekly to the Scottish Police Authority revealed Microsoft, which hosts DESC in its cloud platform Azure, cannot guarantee data sovereignty, and as of November 2024, Axon (which is contracted to deliver DESC) still managed the decryption keys for the system.
This puts policing data at risk, as both Axon and Microsoft are subject to the US Cloud Act, which allows US law enforcement to compel US-based service providers to disclose data in their “possession, custody, or control” regardless of where it is located – a power that can potentially conflict with UK GDPR.
However, an Axon spokesperson said it could guarantee the sovereignty of Scottish policing data, adding that if the firm received a US Cloud Act request, it would “scrutinise the request to ensure it meets applicable legal strictures and is appropriately narrow”.
Microsoft said it “regularly” challenges government requests for data “where there is a lawful basis for doing so.”
And Johnson insists he is “confident” that “checks and balances” have been put in place by the government and commissioners so “all of the legislation that we need to comply with is being complied with”. Similarly, a Scottish Government spokesperson confirmed it had “worked closely with criminal justice partners to ensure all required data security, protection controls and governance are in place and legally compliant ahead of the national roll out” of DESC.
They added: “We recognise the public interest in DESC data security controls, which is why there is robust governance in place in parallel to engagement with the Scottish Biometrics Commissioner and the Information Commissioner’s Office as required.”
But, back in our call, Plastow is clear when I ask about the matter. By awarding the contract to a US-based cloud hosting company, “sovereignty of that data cannot be guaranteed. That’s a fact,” he says, adding that it also raises concerns about potential data breaches.
He says: “Those companies have been exposed to some really significant data breaches, including significant breaches of US Government data, and of course these big multinationals become a target-rich environment for hostile foreign states.”
In an “ideal world” it would be better if UK-based law enforcement details were “hosted entirely” in the UK, Plastow says, but adds that the matter lies with the UK Information Commissioner, “who’s indicated that he’s content with cloud hosting, providing there’s appropriate safeguards in place”.
For now, it’s still early days for facial recognition in Scotland. “It’s just a conversation at this stage,” Johnson says. Perhaps the lessons learned from south of the border have led Scottish authorities to take a more cautious approach, carefully weighing whether the technology is truly necessary. But fears remain that this digital divide in policing could leave Scotland more vulnerable to crime, as offenders may believe they are less likely to get caught.
Plastow says: “Criminals are very clever people, and they sometimes have access to far greater funding sources than the police do... they will absolutely drive a coach and horses through any gap that they see in that sort of UK policing architecture. So, it’s really important that Scotland doesn’t go in a completely different direction.”