By Sophie Squire
Issue 2720

Algorithms – why assumptions made by computers aren’t neutral

Students unmasked the algorithm’s role (Pic: Guy Smallman)

The A-Levels scandal had an algorithm at the heart of it—and algorithms are all about us.

At its simplest an algorithm is a set of rules or instructions designed to solve a problem. They are essential to the way computers process information.
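To make that concrete, here is one such set of rules written out as code. This is a minimal illustration of my own, not an example from the article: a fixed sequence of steps that finds the largest number in a list.

```python
# A minimal example of an algorithm: a fixed set of rules that
# finds the largest number in a list, one step at a time.
def largest(numbers):
    biggest = numbers[0]      # rule 1: start with the first number
    for n in numbers[1:]:     # rule 2: look at each remaining number
        if n > biggest:       # rule 3: if it beats the current best,
            biggest = n       #         remember it instead
    return biggest            # rule 4: report the result

print(largest([3, 41, 7, 19]))  # prints 41
```

However simple or complex, every algorithm boils down to rules like these, followed mechanically.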

They can also be used to simulate sequences and trends. By using data that already exists, they can try to predict data that doesn’t.

This is what was so pernicious about the exam results scandal. It was assumed that schools’ past performance would necessarily be replicated by individuals in those schools again.

If it was the sort of school that “didn’t get A grades”, then it wouldn’t this time either.
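The shape of that logic can be sketched in a few lines of code. This is a deliberately simplified illustration of the conservative assumption the article describes, not the actual exam algorithm; the function, grade scale, and data are invented for the example.

```python
# A simplified sketch of the logic described above: predicting an
# individual's result from their school's past results.
# This is an invented illustration, NOT the real exam algorithm.

def predict_grade(school_past_grades, teacher_assessment):
    """Assign a grade based mainly on the school's historical record,
    capping what the individual student was assessed as achieving."""
    # Take the school's typical (median) past outcome
    ranked = sorted(school_past_grades)
    typical = ranked[len(ranked) // 2]
    # The student's own assessment is capped by the school's history:
    # a school that "didn't get A grades" won't produce one now either.
    return min(teacher_assessment, typical)

# A strong student (teacher-assessed at grade A = 5) at a school whose
# past results were mostly Cs (3) is marked down to a C.
grades = {"A": 5, "B": 4, "C": 3, "D": 2}
print(predict_grade([3, 3, 2, 3, 4], grades["A"]))  # prints 3, i.e. a C
```

Whatever the real model's extra detail, the effect criticised in the article is the same: the school's past puts a ceiling on the individual's future.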

But this is just one example of a highly conservative method. People don’t always behave in the way they have before.

They can change and grow and learn—or become embittered and regress. And behind algorithms there are always political concepts and political decisions.

They aren’t neutral or simply technical. They are based on assumptions and views about people as individuals and as classes.


If you work for Uber or Deliveroo, your boss is—supposedly—an algorithm. The police use algorithms to try to predict where crimes will be committed and who the criminals will be.

But the data fed in relies on how police have operated before—with all of the inevitable bias.

Hannah Couchman of the human rights group Liberty said that arrests made on the basis of such data were “already imbued with discrimination, entrenched by algorithms”.

This month, after an outcry, the government was forced to scrap an algorithm processing visa applications. It played a role in deciding whether people had a right to stay in Britain.

But Chai Patel of the migrant rights group JCWI explained that the “streaming tool took decades of institutionally racist practices” and “turned them into software.”

Another group described it as “speedy boarding for white people”.

Such systems are highly attractive for many banks and huge corporations, such as Amazon.


And now some free marketeers hope that algorithms could mean a stripped-back capitalist state.

Algorithms offer the possibility of replacing all those costly civil service workers with lines of code that have no discretion—and code doesn’t join unions.


Far better to have a soulless program delivering sanctions and cutting off benefits than to rely on a person. It hides a process based on conservative assumptions behind a veneer of “computer says no”.

It’s not Tory ministers who are wrecking your future, it’s the algorithm.

Revealing the assumptions and the politics behind algorithms is important.

That’s what happened as the brilliant demonstrations led by students over A-levels started to shake the system.

Many carried signs saying they weren’t simply statistics or a grade. Once conscious human intervention takes hold, the algorithm becomes powerless.

There isn’t a limit on our capacity to change, or our ability to reorder the world. And however tightly and carefully they target the adverts, Facebook can’t guarantee that you will buy what you are told is good for you—or make you vote Tory.
