Scotland’s fortnightly political & current affairs magazine
AI surveillance and privacy: Watching You Watching Me

A man is photographed outside Dublin’s famous Temple Bar, nursing a pint and smiling for the camera. Posted to Instagram, the image is a snapshot in time, just one of the millions of photos uploaded to the social media site every day. 

But why does this one image hold so much significance? 

It features in an art project by Belgian artist Dries Depoorter called The Follower. Well known for creating projects on surveillance, privacy and machine learning, Depoorter used openly accessible cameras that anyone with an internet-connected device can view. 

For The Follower, Depoorter spent two weeks recording free-to-view surveillance streams of well-known tourist spots across the world, accessible via EarthCam.com. He then collated every Instagram post tagged at each location he recorded during that fortnight, and used AI image-recognition software to compare the Instagram images with the footage, pinpointing the moment each photograph was taken. 
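Depoorter has not published his exact method, but the matching step he describes can be imagined as a nearest-frame search. The sketch below is purely hypothetical: it uses a toy "average hash" over small grayscale grids in place of real computer-vision models, and all function names and thresholds are invented for illustration.

```python
# Hypothetical sketch of the matching idea behind a project like The Follower:
# find the webcam frame most similar to a posted photo. Real systems would use
# robust image-recognition models; here, images are toy grayscale pixel grids.

def average_hash(pixels):
    """Hash a grayscale image: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Count differing bits between two hashes (lower = more similar)."""
    return sum(a != b for a, b in zip(h1, h2))

def best_matching_frame(photo, frames, threshold=10):
    """Return (index, distance) of the stream frame closest to the photo,
    or None if no frame is similar enough."""
    target = average_hash(photo)
    scored = [(i, hamming(target, average_hash(f))) for i, f in enumerate(frames)]
    i, d = min(scored, key=lambda t: t[1])
    return (i, d) if d <= threshold else None
```

Run over a few hours of recorded frames, the lowest-distance match gives the approximate moment a tagged photo was taken — which is all the project needed.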

On the face of it, it seems fairly innocuous. Fun even. Depoorter was able to find the moments before and after people created their social media posts. Is that such a big deal? Possibly not – this is just one man’s art project that uses datasets readily available to everyone. 

But it raises the question: what is the capability of well-organised, well-funded groups of people working with AI software within a government, or a powerful organisation? Depoorter, it seems, has only shown us the tip of the iceberg.   

There are obvious concerns that something like The Follower could be carried out on a larger scale by a less-than-scrupulous government or organisation. Fabien Benetou, a member of the innovation lab at the European Parliament, believes there is a good chance that the capability of AI-assisted surveillance could be much further developed than we think.     

“To assume that surveillance companies and institutions don’t have something equivalent run by dedicated teams with experts in computer vision and scalability for years, especially with examples shared with Snowden and others, seems sadly a bit naïve,” he says. 

Albert King, chief data officer at NHS National Services Scotland, and former chief data officer for the Scottish Government, says The Follower is “quite voyeuristic” because it is “making people pretty vulnerable”. He says: “People are quite deliberate in crafting an image and this cuts right through that.

“I have concerns about the approach Depoorter was taking because it does raise some of these issues itself. Nevertheless, I think it was a neat wake-up call for us individually and collectively and is a good illustration of the sorts of things that are possible.” 

Alistair Duff, emeritus professor of information policy at Edinburgh Napier University, agrees that Depoorter’s project shows that the implementation of AI software in the gathering and understanding of data posted online by members of the public could indirectly infringe on their privacy if they don’t understand the potential consequences.  

“It is very interesting combining two public datasets: Instagram and CCTV,” Duff says. 
“CCTV stands for closed circuit television, now most of these cameras are not closed circuit; they are linked to the internet, so they really should be called video surveillance cameras. And they are so hackable, and so useable because it is open circuit, it is linked to the internet. It opens a whole Pandora’s box [of privacy concerns].  

“People have consented to it in a way, but they maybe haven’t consented to the consequences of it down the line, particularly with the combining of materials.” 

Duff points to digitalisation and the internet as reasons why people are losing more of their privacy. Videos posted to social media can be “a real issue” as “there is a recording forever”. Before this, members of the public enjoyed “practical obscurity” – but now recordings and posts are permanent, in some form, online.  

As the ways we can connect data with the assistance of AI become more sophisticated, people’s actions in private will be revealed in greater detail. King explains that AI’s introduction is a “step change” and it is a “question of scale”.   

“If we are putting all this information about ourselves in the public domain, with AI, things like image recognition do make it possible to identify individuals from social media posts, and that is whether you post it yourself or not,” he says. 

“Using AI to link up those data points creates a connected web of information that can be analysed. Then we can extract more meaningful data from that rapidly. That is starting to build up a detailed picture of you, who you are, who you are interacting with, and what you are doing.   

“But the important thing to remember, whilst AI does create risks, which absolutely must be addressed, it is a technology as well that can bring benefits, help save lives, improve services, and grow our economy. The crucial thing is to work together to address these risks.”  

Speaking about the permanent record users leave on social media, Duff believes: “Privacy in public is pretty much dead in the water. There must be a reframing.  

“People have been defenestrated, marginalised, and worse. I think there needs to be tolerance because a lot of that private stuff wouldn’t have come into the public domain [before social media]. I think it is very intolerant and frankly Pharisaical to dig that stuff up.”

Although there are privacy concerns, there are also clear potential benefits for people if AI can be implemented in the right way, King says. It needs to be done in a way that is “transparent and protects people’s rights and privacy,” while ensuring we understand the limitations of the data and don’t overreach.  

He points to the work done at the beginning of the pandemic with Google mobility data, showing how the virus affected the movement of people around the world. Linking that with economic activity gave an “important picture and insight into the pandemic”. 

The Scottish Government put people’s data to good use during the pandemic when it introduced NHS Scotland’s Protect Scotland test-and-trace app. The app used phones’ Bluetooth signals to measure contact between users, combined with data inputted by users, to better track the spread of Covid-19. Scotland also proved it could develop effective software cost-effectively: the app cost the Scottish Government just £300,000, whereas England’s equivalent cost an estimated £36m.  
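The privacy-preserving design behind such apps is worth sketching, because it shows data being used for good without central surveillance. The toy code below is not the Protect Scotland implementation; it is a loose illustration of the decentralised pattern popularised by the Apple/Google Exposure Notification framework, with invented class and function names and simplified key handling.

```python
# Hypothetical illustration of decentralised Bluetooth contact tracing:
# phones broadcast random identifiers and matching happens on-device,
# so no central server ever learns who met whom. Real apps rotate
# identifiers frequently and weigh signal strength and duration.

import secrets

def new_daily_key():
    """Each phone generates a fresh random key; broadcast IDs derive from it."""
    return secrets.token_hex(16)

class Phone:
    def __init__(self):
        self.daily_key = new_daily_key()
        self.heard = set()      # identifiers received from nearby phones

    def broadcast_id(self):
        # Simplification: real apps derive short-lived rotating IDs from the key.
        return self.daily_key

    def record_contact(self, other_id):
        """Store an identifier heard over Bluetooth; stays on this device."""
        self.heard.add(other_id)

    def check_exposure(self, published_keys):
        """Compare locally stored contacts against keys of positive cases,
        which are the only data ever published."""
        return any(k in self.heard for k in published_keys)
```

Only the keys of users who test positive (and consent) are ever uploaded; every other comparison happens on the handset, which is what makes the approach compatible with privacy.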

But that is not the complete picture. Although Scotland has performed well in areas of public data-gathering projects, a recent freedom of information request by the Scottish Liberal Democrats revealed that at least 13 Scottish councils are currently using surveillance cameras made by Hikvision, a company allegedly linked to the repression of Uighurs in China. It has been reported that Hikvision’s facial recognition cameras are used to distinguish entire ethnic minority populations, placing Tibetans and Uighurs at serious risk.   

The cameras continued to be installed even after a report by the UK parliament’s Foreign Affairs Committee connected Hikvision to human rights abuses, concluding they should not receive government contracts. Hikvision has said it respects human rights. 

The leader of the Scottish Liberal Democrats, Alex Cole-Hamilton, says: “It’s extremely disappointing that not only are at least a dozen councils using these cameras, but they have carried on installing them after the Foreign Affairs Committee connected Hikvision to human rights abuses.   

“There have been numerous warnings that Hikvision was providing surveillance tools for the Chinese government. The Scottish and UK governments need to come off the fence and introduce more robust rules about partnering with Chinese firms.” 

Looking at countries like China that are further ahead in implementing AI and other technologies in everyday life, it is apparent that their citizens have been losing aspects of their privacy. 

Duff says: “I interviewed a great anthropologist out in Silicon Valley a few years ago, and I asked, don’t you think it is terrible the way that privacy is disappearing?    

“She said that is a very culturally parochial view you are taking – the Chinese are happy with it, they want communality, she told me.” 

China’s population is by far the most watched on the planet, with facial recognition used in a multitude of areas, even refuse collection and the dispensing of toilet rolls. But attitudes to privacy appear to be changing in China. 

In 2021, a Beijing thinktank asked 1,515 anonymous Chinese residents if they thought the technology should be used in commercial areas, and almost 90 per cent of people did not want it. Seventy per cent believed it should not be used in residential areas either.  

King is critical of the way that China has been using AI in the last few years: “There is little compunction about how they use these technologies in ways that, frankly, we wouldn’t accept in the UK and Europe.  

“I do think that is partly about protecting people’s rights and privacy, but I also worry that the Chinese state is overreaching the capability of the technology. There are concerns about bias in algorithms and the failure of image algorithms.  

“There are quite a lot of stories coming from China that the facial recognition algorithms are just making bad decisions. Not only are the uses of those technologies threatening other people’s rights, but they are fundamentally inappropriate, and they are flawed.  

“I think we are very fortunate that we live in a country that is taking a much more thoughtful approach to the adoption of these technologies.” 

This raises the question of regulation at government level to protect people’s right to privacy. Duff believes that “the panopticon [the surveillance state] is inevitable” but that it must be met with “good information policies, and more regulations of the right kind without a nanny state”. He does not believe there can be one “great privacy law”; rather, people should be “influenced” before looking at “specific areas and see where regulation might be needed”. 

He adds: “But even with all of that, there is no question that exposure of the individual is going to increase, and we need to live with that because that is the modern world.” 

Duff thinks that the argument could be made that “the panopticon is about a new form of collective life, and we should embrace and be more connected with each other”.  

“Maybe privacy will die out. There wasn’t a lot of it in the medieval era or the ancient world. Privacy was an invention of the Victorian era, that’s when people started getting separate rooms and getting a bit of anonymity.  

“Perhaps privacy is just a blip, and we go back to the global village. Many people on the left would welcome a more communal society rather than an individualist one.”

But King disagrees. “I think privacy is a matter of personal choice,” he says. “How does one choose to engage with technology, what information does one choose to put in the public domain, and to what extent does one have control over it? 

“I think we as individuals have a lot of agency. And we will continue to have a lot of agency around this. There is no doubt that regulators will affect the circumstances in which we do that. 

“I think a lot of it will be a question of personal choice.”
