Opinion

Universal Credit: How algorithms pose a threat to accessing benefits

They’re hiding in plain sight and could be making the process of claiming benefits inherently unfair. Lee Coppack and Joanna Bates are getting claimants the protection they deserve

Algorithms could make the process of claiming Universal Credit inherently unfair. Image credit: Geralt / Pixabay

During the Covid-19 pandemic, we clapped for our heroes, our frontline workers, and hoped for a better, more empathetic future for everyone. We witnessed drastic changes in our social interactions: we could not shake hands, we could not share our space with others. And silently, another revolution was fomenting change, with algorithms at its forefront.

It was all too natural to turn to machines when human contact was a potential risk to our lives. Telemedicine, for example, cut out the need for face-to-face consultations. With staff furloughed, working at home or laid off, companies accelerated their plans for automation and artificial intelligence.

As advances in technology disrupt more sectors, it is clear that many jobs will disappear. There will be more new jobs – but they will be different, as the World Economic Forum states: “They will demand the skills needed for this Fourth Industrial Revolution and there is likely to be less direct employment and more contract and short-term work.”

Low-waged and vulnerable people will be hardest hit. As the WEF said in its Future of Jobs Report last year: “In the absence of proactive efforts, inequality is likely to be exacerbated by the dual impact of technology and the pandemic recession.”

Many will need to retrain in radically different work environments. Given the speed of these changes, which we are already seeing, we must provide time and resources for those who need them most.

Universal Credit algorithm illustration by Laurie Avon

This brings us to Universal Credit, which should provide support during this time of transition but clearly fails to do so in many cases. One reason for these failures is poorly designed algorithms: the coded instructions that tell a computer program what sequence of steps to follow to make a decision.

Algorithms are hiding in plain sight in many instances, not just in Universal Credit. Many are valuable and work smoothly, but they are only as good as the minds creating them. Currently, there is no regulatory framework that sets standards for algorithms, or for whether the data they use are fair or representative.
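
To make that concrete, here is a deliberately simple, purely hypothetical sketch in Python of the kind of decision rule an automated benefits check might encode. The function name, inputs and thresholds are all invented for illustration; this is not the DWP's actual code, only an example of what "coded instructions that make a decision" can look like.

```python
# Hypothetical and illustrative only: NOT the DWP's real system.
# A tiny decision rule of the kind an automated benefits check might encode.

def flag_claim_for_review(monthly_income: float, missed_appointments: int) -> bool:
    """Return True if a claim should be held back for manual review."""
    # These thresholds are invented for illustration. In a real system,
    # choices like these decide who is paid on time and who is not.
    INCOME_SPIKE_LIMIT = 800.0
    MISSED_APPOINTMENT_LIMIT = 1

    if monthly_income > INCOME_SPIKE_LIMIT:
        return True
    if missed_appointments > MISSED_APPOINTMENT_LIMIT:
        return True
    return False

# A claimant whose pay date shifted, so two wage packets landed in one
# assessment period, is flagged even though their circumstances are unchanged.
print(flag_claim_for_review(monthly_income=950.0, missed_appointments=0))  # True
```

Every choice in a rule like this, the inputs, the thresholds and the data used to set them, is made by people. If those choices are wrong, or the data behind them is skewed, the unfairness is automated along with the decision.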

These are not just issues for Universal Credit and public benefits, but for fairness in the new world of work. Organisations are looking to use algorithm-based systems as part of hiring and promotion. If we don't take more responsibility for these systems, we will fail to protect people. Without transparency and accountability at every level of the process, from conception to implementation and beyond, we run a high risk that they will cause harm.

The Just Algorithms Action Group (JAAG), like unions and advocacy groups, is working to counter the injustice and lack of consideration shown by many modern computer-based systems in their real-world use, and the malign impact they can have on human beings.

JAAG wants to investigate further to identify specific harms caused by Universal Credit, the government's flagship automated system. We are looking for people who have suffered from the inadequacies and failures of the Universal Credit system to help build a legal case for change.

If you think you may have been affected by an unfair UC decision contact Jo at UC@jaag.info or call 07305 159700.   jaag.org.uk
