Human rights at risk in AI revolution, experts warn
Human rights are at risk if technology companies fail to include vulnerable people in the “AI life cycle”, experts have warned at Holyrood’s policing, justice and law enforcement in a digital age conference.
Senior lecturer in law at the University of Strathclyde, Birgit Schippers, highlighted that the impact of smart technologies extends far beyond just data privacy, often in ways the public isn’t fully aware of.
She said: “There is a significant focus on privacy. But I think it's worth perhaps bearing in mind that any use of smart technologies impacts not just on privacy, but on a wide range of rights, from freedom of expression to freedom of assembly, fair trial, etc... But in fairness, how can we all be aware of what's happening? I don't think it's a realistic expectation that every single citizen is fully aware of what's actually happening.
“And that's where I think it's essential not just to talk about data use and data processing in relation to ethics, but really focusing on human rights.”
Similarly, Anabelle Turner, director of Cybersafe Scotland, said tech companies are failing to protect vulnerable people because this responsibility “is at odds with profits”.
“We need to have a radically different approach to the protection of all the vulnerable [people]. Not just children, we're talking about vulnerable adults,” she said.
“For me, the core principle of safer design comes through understanding your product through the eyes of the vulnerable user.”
Turner called for stronger collaboration across the tech industry to improve transparency around data usage and to streamline the process for reporting harm online.
She continued: “If you look at any of the products that young people are using, the processes for reporting harm are incredibly complex compared to the way those products are made.
“In Roblox, there are four different ways to report harm. There's a report function in the online safety app, but you have to go to a totally different section to report harm.
“If you look at TikTok, how long it takes to report harm compared to how fast you're making videos and so on and so forth... I do think radically changing and industry collaborating with child protection agencies to change reporting, both from the law enforcement side and the product side, is critical in this space. The number of reports that we're dealing with, the scale of harm, is just so enormous.”
James Stevenson, technology-facilitated CSEA (child sexual exploitation and abuse) data specialist at the Global Child Safety Institute Childlight, echoed the concern, telling delegates that the constantly changing nature of digital platforms makes reporting even harder.
He said: “When I was working at a centre for child protection, we would update guidance every three months on how children could report.
“We saw significant changes across all platforms in every three-month period, making it even more difficult. So if a child has reported once four months ago, the procedures may have changed within that short time period.”
Earlier in the morning, Aoife Houghton, a student at the University of Strathclyde, had told delegates about her experience as a victim of sextortion on the social media app Yubo.
As a young teenager, she was tricked into sending photographs of herself to a man she believed to be her age. After he revealed his real age, he threatened to send the pictures to her friends and family if she didn't comply with his demands.
In her address, Houghton spoke about the fear she felt and the importance of reporting these crimes, discussing the critical role Police Scotland and the National Crime Agency had played in helping her deal with the incident.
Schippers also highlighted the importance of international collaboration on digital regulation, but acknowledged political differences often make this difficult.
“It would be desirable to have cross-border regulation, international regulation, but that, of course, is incredibly difficult because it requires buy-in from states which regard themselves as sovereign.”
She added: “Ideally, in an ideal world, yes, I'd like to see international regulation, but I think it would take significant effort to get there.
“And there may be a risk that the regulatory instruments become so diluted in order to get buy-in from as many states as possible that perhaps they're not particularly effective.”