Experts warn House of Lords that we don’t know which public bodies used automated systems to make decisions, how they use them or what the impacts on us all are
A committee of the House of Lords has been warned by legal experts that we don’t know enough about the way government and public bodies use automated decision-making systems – or how these systems impact people’s lives.
The Lords’ Constitution Committee is undertaking an inquiry into the “rule of law” in the UK, which has tended to focus on the way legislation is made, scrutinised and applied, and on people’s attitudes to the legal system more generally. A range of expert witnesses have given evidence and raised a number of concerns.
This has included tech-related matters. With widespread, lengthy backlogs in the legal system, it is hoped that technology can ease administrative burdens, enhance efficiency and speed up the process. There’s already evidence that people seek legal advice from automated chatbots and other AI systems – though there is some concern about the quality of resulting answers.
Then, right at the end of yesterday’s evidence session, the Liberal Democrat peer Baroness Hamwee asked about the legal ramifications of government and public bodies using automated decision-making (ADM) systems reliant on pre-programmed algorithms rather than human involvement.
Shameem Ahmad, CEO of the charity Public Law Project (PLP), responded that this is now ‘one of the most significant’ issues affecting the rule of law in this country, and that there is, ‘nowhere near the level of transparency’ needed.
She said: ‘We do not know which authorities are using this – which departments or public bodies up and down the country are using it. We do not know for what types of decisions they are using it or what criteria they are applying, which is hugely problematic, and we do not know what impact they have personally assessed it will have on individuals, such as in equality impact assessments.
‘It is so difficult to establish where these tools are and how they impact people, so we cannot test the lawfulness of them.’
PLP’s Tracking Automated Government register aims to increase transparency and accountability by listing government bodies known to be using ADMs. But knowing what is being used is just part of the concern. These systems have the potential for huge and devastating impacts on people’s lives.
Ahmad warned that there have already been examples of ADM ‘going horribly wrong’, citing the automated Robodebt assessment and recovery scheme in Australia. This issued some 470,000 incorrectly calculated debt notices, causing widespread harm and a public outcry. That has come at significant financial cost to the government. After a lengthy legal process, in 2021, the Federal Court approved a A$1.8bn settlement to cover repayment of debts wrongly paid, write off unpaid debts and meet legal costs.
In a similar case, Dutch tax authorities were fined €3.7 million for use of an ADM system aimed at spotting benefit fraud. In this case, as well as identifying innocent people as fraudsters – and so cutting off payments to which they were entitled – the system was also found to have breached laws on personal data.
Stephanie Needleman, Legal Director of the charity and campaign group Justice, added that such systems are often developed by private companies that provide little transparency about the data used to train them or the means by which they make decisions. ‘More needs to be done to ensure that the government, buying in these tools, can understand exactly what is going on and that they are accurate,’ she said.
ADM and other technologies are seen as a means to improve efficiency and save money across the public sector. The question being raised is how much they really cost.
To see the question about ADM and the answers given, visit Parliament.tv.
In related news:
High adoption of AI produces little economic gain – new study
Needin’ you: ’25 top tech minds’ wanted for government roles