DWP's 'automation' of universal credit discriminates against single mums, researchers say

Questions are being raised about the fairness of using automated tools to distribute benefits

Problems with the Department for Work and Pensions’ (DWP) digital universal credit (UC) system are disproportionately impacting working single mothers, raising questions about the fairness of using automated tools to distribute benefits. 

Initially rolled out in 2013, UC is a digital-by-default benefits system that uses various automated processes, including machine learning, to determine people’s eligibility for welfare, calculate benefits payments and detect fraud.

Responding to the findings of a recent Freedom of Information (FOI) request – which revealed that nearly half of in-work UC claimants are single parents, the vast majority (just under 90%) of whom are women – researchers say they are concerned that automation within the UC system is entrenching discrimination and surveillance in people’s lives.

“A certain subsection of the population – those on low incomes and with disabilities – are being put under surveillance by systems that simply don’t affect the rest of the population,” explained Morgan Currie, a senior lecturer in data and society at the University of Edinburgh who submitted the FOI request. 

She added that the three most common problems with universal credit’s automated processes – mistakes caused by flawed information about earnings, hardship as a result of delayed childcare reimbursement, and a mismatch between UC calculation dates and paydays – therefore overwhelmingly affect working single mothers.

A DWP spokesperson told the Big Issue: “We are committed to reviewing universal credit so it makes work pay and tackles poverty.”

Risks of discrimination against universal credit claimants 

In the DWP’s annual report for 2022-23, the UK’s Comptroller and Auditor General said that when using machine learning to prioritise cases for fraud review, “there is an inherent risk that the algorithms are biased towards selecting claims for review from certain vulnerable people or groups with protected characteristics.”

According to the researchers Big Issue spoke with, the FOI disclosure points to the clear risks of discrimination within UC given the way that algorithms rely on historical information to make predictions about future events, as well as the system’s track record of flawed decision-making. 

“Existing biases can be encoded in automated decision making through the data that systems are trained on,” explained Anna Dent, head of research at Careful Trouble. 

“If, say, a system to spot fraud is trained on data about historical fraud cases, there is a risk that institutional biases will be baked in – if disabled people, people from certain ethnic backgrounds or single parents, for example, have been disproportionately targeted in the past, there will be a higher percentage of fraud cases involving these groups, which will then encode the same bias into any system trained on that data.”

On universal credit’s track record of flawed decision-making, the researchers cited numerous reports of how the DWP’s UC algorithms fail to factor in how often people are paid – a design flaw that frequently leads to over- or underestimation of people’s earnings, which in turn fuels financial insecurity via unpredictable changes in the amount of benefits people receive from month to month. 

In 2019, for example, four single mothers represented by Leigh Day and the Child Poverty Action Group won a High Court case against the DWP after it found the women were struggling financially because the automated system meant there was no consistency in their monthly payments. 

The court specifically noted that if a claimant was paid a day early due to a weekend or bank holiday, the system would detect them as being paid twice in a month and drastically reduce their UC payment – a problem the women’s lawyers said was likely to affect tens of thousands of people.
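The mismatch the court described can be sketched in a few lines. This is purely illustrative and not DWP code: the taper rate, standard allowance and wage figures below are invented for the example, which only shows how a fixed monthly assessment period can "see" two monthly wages when a payday shifts early.

```python
from datetime import date

TAPER = 0.55                # hypothetical taper: award falls 55p per £1 earned
STANDARD_ALLOWANCE = 800.0  # hypothetical maximum monthly award, in pounds

def uc_award(earnings: float) -> float:
    """Very simplified means test: allowance minus tapered earnings."""
    return max(0.0, STANDARD_ALLOWANCE - TAPER * earnings)

def earnings_in_period(paydays, wage, start, end):
    """Sum the wages whose payday falls inside [start, end)."""
    return wage * sum(1 for day in paydays if start <= day < end)

wage = 1000.0
# Assessment period: 1 March to 1 April 2024.
start, end = date(2024, 3, 1), date(2024, 4, 1)

# Normal month: the claimant is paid once, on the 1st.
normal = earnings_in_period([date(2024, 3, 1)], wage, start, end)

# 1 April 2024 is a bank holiday (Easter Monday), so April's wage lands
# on the last working day of March -- two paydays now fall in one period.
early = earnings_in_period(
    [date(2024, 3, 1), date(2024, 3, 28)], wage, start, end
)

print(uc_award(normal))  # one wage counted: 250.0
print(uc_award(early))   # two wages counted: 0.0 -- the award is wiped out
```

The point is that nothing about the claimant's actual income changed; only the calendar position of a payday did, yet the tapered award for that period collapses to zero.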

For those on lower incomes, this inconsistency in payments can mean the difference between being able to afford their living costs and falling into arrears or debt. Research from early 2024 found that for more than half of households, universal credit payments varied by £400 or more from one month to the next at least once during 2022-23. 

“[Because] single mums are disproportionately represented among working claimants (around 40%), the flaws with the payment algorithm are more likely to affect this group. What this means is that single working mums are especially likely to encounter income volatility and financial instability due to the means-testing calculation,” Dent said.

Human Rights Watch also reported in 2020 on how the faulty automated design of universal credit processes led the system to over-predict the earnings of a single mother, resulting in her support being cut by £1,000.

Currie says the frequent cases of flawed automation in government decision-making raise questions about the effectiveness of these tools and who they are designed for. 

“Universal credit has replaced one type of volatility with another. In our Automating Universal Credit study, some claimants encountering flaws and fluctuations in the calculation had to borrow from their employers and visit food banks to get by,” she said. “So even people following UC’s work requirements sometimes found themselves unable to cover household bills and living expenses.” 

These risks of discrimination are also mirrored across countries and historically marginalised communities, which are often more vulnerable to harm from algorithmic decision-making processes.

In the UK, for example, disabled benefit claimants have previously been unfairly investigated, while in Serbia, Roma and disabled people were left unable to afford food and living costs after being removed from social assistance support following the introduction of automated welfare eligibility assessments. 

“Especially with the risk profiling for benefit fraud, a certain subsection of the population – those on low incomes and with disabilities – are being put under surveillance by systems that simply don’t affect the rest of the population,” said Dent. 

A lack of transparency

Researchers said the issues around algorithmic discrimination are compounded by a lack of transparency, which makes it near-impossible to challenge wrongful or unfair decisions. 

They noted, for example, that while various pieces of legislation – including the UK General Data Protection Regulation (GDPR), the Data Protection Act 2018, the Equality Act and the Human Rights Act – contain mechanisms for safeguarding people from algorithmic discrimination, it is hard to challenge instances of automated discrimination without knowledge of when and how the algorithms are being used. 

While the automation of government decision-making is a growing trend, public information on how algorithms are being used in decision making is scarce. 

In 2021, Big Brother Watch found that 540,000 benefits applicants were secretly being assigned fraud risk scores by councils’ algorithms before they could access housing benefit or council tax support.

While only eight algorithms are recorded and explained under the government’s Algorithmic Transparency Recording Standard, the Public Law Project’s (PLP) TAG Register database was set up in 2023 to give the public details of the secretive algorithms used by multiple government departments and local authorities.

As of October 2023, the TAG Register recorded 55 automated tools used by the government to make or inform decisions. 

However, according to PLP senior research fellow Caroline Selman, the database has been created through “resource intensive FOI requests, rather than proactive transparency on the part of the DWP”.

She added that “contesting decisions made or supported by AI or ADM systems is one of the most uncertain and untested areas of public law. The current opacity in how and where algorithmic decision making systems are being used is one of the biggest barriers to ensuring accountability in practice.”
