
Algorithms in social security: cause for concern?

Published on: 28 August 2020
Written by: Sophie Howes, Senior policy and research officer

Two weeks ago pupils, parents and schools were up in arms when the news broke that 40 per cent of teacher-assessed A level results had been downgraded by at least one grade. The culprit? A computer, or to be precise, an algorithm (described as a ‘mutant algorithm’ by the Prime Minister on a visit to a Leicestershire school on 26 August). The government has since done a U-turn and allowed teacher-assessed grades to take precedence where the algorithm-generated grades are lower. But this episode clearly shows the chaos that can be caused in people’s lives when the technology that so many parts of public services now rely on goes wrong.

Last year CPAG published our Computer Says No! report series. These reports focus on access to justice in universal credit (UC), and the problems claimants experience understanding their UC award and exercising their appeal rights. Many of these problems are caused or exacerbated by the computer system used by the Department for Work and Pensions (DWP) to process and administer UC claims and awards.

The first report highlights the lack of information provided to claimants about their UC award, and how to challenge a decision if they believe a mistake has been made. The second report focuses on the difficulties claimants experience challenging decisions made in relation to their UC award, with various roadblocks placed in their way.

While some of the issues raised in the reports have been rectified by the DWP (for example, the information provided to claimants in the UC payment statement has improved somewhat), many of the issues remain and are still causing problems for claimants, as cases submitted to CPAG’s early warning system illustrate.

A man made a claim for UC in March. He was used to receiving his benefits weekly and having his housing benefit paid directly to his landlord, so he assumed that his monthly UC award (as shown on his online UC payment statement) covered everything he was entitled to and that his rent was still being paid directly to his landlord. It was not until August that he realised he had rent arrears because a housing costs element had not been included in his UC award. If his statement had listed the housing costs element and indicated that he wasn’t receiving it, he would have realised immediately that his rent was not being paid.

 

As this example highlights, many of the issues raised in the reports point to a central problem: a lack of transparency and accountability about how the UC computer system actually works. CPAG, alongside many others, is still unclear how and when algorithms and automated decision making are used within UC. The reports highlight examples of claimants who have been given very little information about how their UC award has been calculated, only to be told, when they contact the UC helpline, that DWP officials don’t have access to the calculations either. This is worrying for a number of reasons.

If claimants and the officials responsible for administering UC do not understand how awards have been calculated, mistakes are difficult to spot. And for claimants who believe a mistake has been made and want to challenge a decision about their award, doing so becomes extremely difficult if the rationale for the decision is not readily available.

In such a system, the computer becomes king, never questioned and always trusted to produce the right results. While technology has unquestionably improved many aspects of public services, including some aspects of UC (for example, enabling millions of claims to be processed and paid on time during the coronavirus crisis), the exam results fiasco has shown that technology can make serious mistakes with severe consequences. Where public services rely on technology to assist them in decision making (or to make decisions for them), robust systems need to be put in place to make this technology accountable to the public – and improving transparency is the first step towards making this happen.

The Computer Says No! reports only scratch the surface of the potential implications of a digital-first benefits system. Over the next two years, CPAG is conducting an in-depth investigation of the UC computer system to unearth more information about how it operates and, crucially, whether it upholds rule of law principles. While this might at first glance seem like a project that would only interest the computer geeks among us, we believe it is both relevant to millions of citizens and very timely.

With the full scale of the economic crisis caused by coronavirus still unfolding, more and more people are turning to the social security system for financial support to get by. The effective administration of this system is therefore vital, and when mistakes are made it is important that these can be rectified quickly. People’s livelihoods quite literally depend on it.

We look forward to sharing our findings as soon as we are able, but until then we will continue to urge the DWP to take further steps to increase the transparency of the UC computer system. Implementing the recommendations in this report from technology experts Pt2, who have analysed the UC digital system and how it could work better for claimants, would go a long way towards achieving this.