Nesta proposes public sector code of conduct for AI decision making
Nesta has created a list of 10 principles it believes should define how government uses artificial intelligence and algorithms
Government has to be as open as possible about the way that algorithms are created and used to inform decision making, says innovation charity Nesta.
Nesta has published a draft code of conduct for how algorithms are used to automate decision making or assessments.
The code, which contains 10 core principles, was written by the organisation’s director of government innovation Eddie Copeland.
In a blog post, he wrote: “The application of AI that seems likely to cause citizens most concern is where machine learning is used to create algorithms that automate or assist with decision making and assessments by public sector staff.
“While some such decisions and assessments are minor in their impact, such as whether to issue a parking fine, others have potentially life-changing consequences, like whether to offer an individual council housing or give them probation.
“The logic that sits behind those decisions is therefore of serious consequence.”
Copeland said a “considerable amount of work has already been done to encourage or require good practice in the use of data and the analytics techniques applied to it”.
He singled out the government’s Data Science Ethical Framework as an example of this work.
But he suggested greater efforts are needed, particularly on the part of governments and the wider public sector.
“After all, an individual can opt out of using a corporate service whose approach to data they do not trust,” he said.
“They do not have that same luxury with services and functions where the state is the monopoly provider.”
The 10 principles are:
1. Every algorithm used by a public-sector organisation should be accompanied by a description of its function, objectives and intended impact, made available to those who use it
2. Public sector organisations should publish details describing the data on which an algorithm was (or is continuously) trained, and the assumptions used in its creation, together with a risk assessment for mitigating potential biases
3. Algorithms should be categorised on an algorithmic risk scale of 1-5, with 5 referring to those whose impact on an individual could be very high, and 1 being very minor
4. A list of all the inputs used by an algorithm to make a decision should be published
5. Citizens must be informed when their treatment has been informed wholly or in part by an algorithm
6. Every algorithm should have an identical sandbox version for auditors to test the impact of different input conditions
7. When using third parties to create or run algorithms on their behalf, public sector organisations should only procure from organisations able to meet principles 1-6
8. A named member of senior staff (or their job role) should be held formally responsible for any actions taken as a result of an algorithmic decision
9. Public sector organisations wishing to adopt algorithmic decision making in high-risk areas should sign up to a dedicated insurance scheme that provides compensation to individuals negatively impacted by a mistaken decision made by an algorithm
10. Public sector organisations should commit to evaluating the impact of the algorithms they use in decision making, and publishing the results
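As an illustration only, the disclosure items in principles 1–4 and 8 could be captured as a single register record per algorithm. The class, field names and example values below are hypothetical sketches, not part of Nesta's draft code:

```python
from dataclasses import dataclass, field

@dataclass
class AlgorithmRegisterEntry:
    """Hypothetical entry in a public register of algorithms,
    covering the disclosure items in principles 1-4 and 8."""
    name: str
    function_description: str       # principle 1: function, objectives, impact
    objectives: str                 # principle 1
    training_data_description: str  # principle 2: data the algorithm was trained on
    bias_risk_assessment: str       # principle 2: mitigation of potential biases
    risk_level: int                 # principle 3: 1 (very minor) to 5 (very high impact)
    inputs: list[str] = field(default_factory=list)  # principle 4: published inputs
    responsible_officer: str = ""   # principle 8: named senior staff member or role

    def __post_init__(self):
        # Enforce the 1-5 algorithmic risk scale from principle 3.
        if not 1 <= self.risk_level <= 5:
            raise ValueError("risk_level must be on the 1-5 scale (principle 3)")

# Example record for a low-impact decision of the kind the article mentions.
entry = AlgorithmRegisterEntry(
    name="Parking fine triage",
    function_description="Flags contested parking fines for manual review",
    objectives="Reduce time taken to resolve appeals",
    training_data_description="Three years of anonymised appeal outcomes",
    bias_risk_assessment="Checked for postcode-correlated bias",
    risk_level=1,
    inputs=["appeal text", "fine category"],
    responsible_officer="Head of Parking Services",
)
```

A structured record like this would also make principle 6's sandbox auditing easier, since the published inputs define exactly what an auditor can vary.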