English councils’ AI tool treats male and female patients differently

New research suggests artificial intelligence downplays the severity of women’s physical and mental health issues.

Conducted by the London School of Economics and Political Science (LSE), the study focused on Google’s ‘Gemma’ system, which produces summaries of case notes and is used by more than half of English councils.

Analysis showed that words such as ‘disabled’, ‘unable’ and ‘complex’ were more likely to appear in descriptions of male patients than of female patients. The team behind the analysis, which examined case notes on 617 adults and compared 29,616 pairs of summaries generated by a range of large language models, believes there is a risk of gender bias becoming commonplace in care decisions, with women’s specific needs going unmentioned or given less severe labels.

‘We know these models are being used very widely and what’s concerning is that we found very meaningful differences between measures of bias in different models,’ said Dr Sam Rickman, lead author of the report and a researcher at LSE’s Care Policy and Evaluation Centre.

‘Google’s model, in particular, downplays women’s physical and mental health needs in comparison to men’s,’ he continued. ‘And because the amount of care you get is determined on the basis of perceived need, this could result in women receiving less care if biased models are used in practice. But we don’t actually know which models are being used at the moment.’

The research has been published as staff at the Alan Turing Institute have warned that the charity is at risk of collapse if Technology Secretary Peter Kyle acts on his recent threat to cut funding. A complaint issued to the Charity Commission has accused the institute’s leadership of creating a ‘toxic internal culture’, while the government has urged a shift in focus towards defence applications, a major pivot from the institute’s current areas of research.

