
Coded Bias shows how deeply embedded racism is in our society

May 11, 2021 by Alastair J R Ball in Film

The film of A Time To Kill opens with a scene of horrific racism. I’m not going to describe it here because it’s awful. If you’ve seen the film (or read the book it’s based on) you’ll know what I mean. If not, then take my word that it is both horrific and racist.

Events like this scene have been terrifyingly common throughout American history, and have also occurred in countries like Britain - where we like to pat ourselves on the back for being less racist than America, whilst celebrating our own racist colonial history. In addition to being a shocking depiction of how horrifying racism can be, this scene underlines some of the common misconceptions about racism.

I’m aware that I’m a white person writing about racism and that I don’t fully appreciate what it’s like to experience it as an everyday occurrence. I can’t speak for all white people, but in my experience there is an incorrect assumption among many white people that racism has two criteria. Firstly, that it’s an action that someone made a conscious decision to take, i.e. someone chose to be racist. Secondly, that racism is always clearly racism, i.e. something that, if seen by a neutral white person, would be understood to be horrific. Events such as murder, rape, arson and beatings fit this definition.

The subtle nature of racism

The opening of A Time To Kill meets these two criteria. An awful act - so bad that it moves an all-white jury - and one that the perpetrators consciously chose to commit, even if as a spur-of-the-moment decision. Burning down an African American church or lynching someone also fits these criteria. The misconception that racism must meet these two criteria obscures how subtle a lot of racism is.

A new documentary on Netflix presents a different picture of racism. It shows that racism might not appear horrific at first and might not arise through conscious decisions. This is a more nuanced exploration of racism than the very violent opening of A Time To Kill. It’s more nuanced than the films that focus on the sort of instances of racism that even a white person, unaware of their white privilege, would consider racist.

This film is called Coded Bias. It begins by exploring the inbuilt biases in facial recognition technology. The film follows Joy Buolamwini, a grad student at MIT who was working on a project involving facial recognition and discovered that the software being used to scan people’s faces struggled to read the faces of women and people of colour. Buolamwini realised that she had to wear a white mask for the software to recognise her face.

Failing to recognise faces

Buolamwini discovered that because the software struggled to recognise the faces of women and people of colour, it frequently failed to find a match, or returned the wrong match, for people who are not white men.

Coded Bias goes on to look at a group called Big Brother Watch, who are organising against the use of facial recognition software by the Metropolitan Police in London. They follow police surveillance vans using facial recognition software, which flags people for the police to question who aren’t the suspects being looked for. The film shows the example of a schoolboy, a person of colour, who is questioned by the police but is not someone they are looking for. He was flagged for questioning by the software.

The reason for these mistakes is that the software is not familiar enough with the faces of women and people of colour. The AIs that match police camera footage to databases of suspects need to be trained to analyse human faces. To do this, the AIs are fed millions of photographs to scan for faces. However, not enough women or people of colour were included in the training data fed to the AIs, so the AIs couldn’t reliably tell the difference between different women or different people of colour when they encountered them in the real world.
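
To make the mechanism concrete, here is a minimal, purely illustrative sketch in Python (assuming numpy is available). It is not based on anything shown in the film, and it is nothing like a real facial recognition system: faces are reduced to random vectors and the “model” is a simple nearest-template lookup. The only difference between the two groups is how many enrolment photos each person contributes, yet the under-represented group ends up with noisier templates and a higher misidentification rate.

```python
# Toy sketch (assumes numpy is installed) of how under-representation in
# training data produces higher error rates for one group. This is NOT how
# any real facial recognition system works - faces are random vectors and
# the "model" is a nearest-template lookup - but the mechanism is the same:
# fewer examples of a group means noisier templates and more wrong matches.
import numpy as np

rng = np.random.default_rng(0)
DIM = 32                  # dimensionality of our fake "face embedding"
IDENTITIES_PER_GROUP = 50
PHOTO_NOISE = 0.9         # how much two photos of the same person differ

def make_group(n_ids, photos_per_id):
    """Create true identity vectors and averaged 'enrolled photo' templates."""
    true_faces = rng.normal(size=(n_ids, DIM))
    templates = []
    for face in true_faces:
        photos = face + rng.normal(scale=PHOTO_NOISE, size=(photos_per_id, DIM))
        templates.append(photos.mean(axis=0))   # template = average of photos
    return true_faces, np.array(templates)

# Group A is well represented in the "training" data, group B is not.
faces_a, templates_a = make_group(IDENTITIES_PER_GROUP, photos_per_id=20)
faces_b, templates_b = make_group(IDENTITIES_PER_GROUP, photos_per_id=2)
all_templates = np.vstack([templates_a, templates_b])

def error_rate(true_faces, id_offset, trials=200):
    """How often is a fresh photo matched to the wrong identity?"""
    wrong = 0
    for _ in range(trials):
        person = rng.integers(len(true_faces))
        photo = true_faces[person] + rng.normal(scale=PHOTO_NOISE, size=DIM)
        match = np.argmin(np.linalg.norm(all_templates - photo, axis=1))
        wrong += int(match != person + id_offset)
    return wrong / trials

print(f"Group A (20 photos each) error rate: {error_rate(faces_a, 0):.0%}")
print(f"Group B (2 photos each) error rate: {error_rate(faces_b, IDENTITIES_PER_GROUP):.0%}")
```

The invented numbers don’t matter; the shape of the problem does. The system’s rules treat everyone identically, but its errors fall disproportionately on the group it has seen the least of.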

The taint of discrimination

The reason women and people of colour were under-represented in the training databases is that these databases were initially built from the photos that were to hand, i.e. pictures of the staff at the elite universities and computer science labs that pioneered AI and facial recognition research. I’m sure I don’t have to explain to you why women and people of colour have been historically underrepresented at elite universities and research institutions.

This failure of facial recognition software to recognise the faces of people of colour - which has real-world consequences when this software is being used by police - shows how subtle racism can be and how the taint of discrimination creeps into everything that our society produces. Machines may not have bias in their hearts, but the society that produced them did.

The companies and institutions that use this technology are not transparent about how it is used. When the police question someone, that person has no way of knowing that the reason they are being questioned is that an AI that can’t tell the difference between different people of colour flagged them as a police suspect. If we don’t know what means are being used to investigate crime, then the concept of due process and fair legal proceedings goes out the window.

Assumptions about racism

Some believe that using machines to select candidates to interview for a job, or to identify police suspects in a crowd, is a way to strip out the human biases from these processes. There is no denying that humans bring their gender and racial biases into their decisions (consciously or unconsciously). However, machines only follow the programs of their designers, and unconscious bias can creep into their design. The machines our society makes reflect the prejudices of our society, just as the makeup of the faculty and student body of our elite universities reflects the prejudices of our society.

The problems that Buolamwini found when she tried to design a piece of university coursework that could recognise her face are connected to the opening of A Time To Kill. They both show how racism is a part of our society, but their differences expose an assumption that many people make: that racism is a conscious decision by bad people, and not something that can come about through unconscious biases so deeply ingrained in society that they are invisible. Like the air around us, racism can be invisible, and we don’t think about it, but it’s always there.

No one set out, with deliberate malice in their heart, to create a facial recognition AI that would misread the faces of women and people of colour so that police in London can harass an innocent schoolboy. However, that is exactly what happened.

Racism doesn't have to announce itself

Many people (most of them white) like to tell themselves that if they don’t have hate in their heart and they don’t use the N-word they’re not racist. If you’re thinking that, then you can give yourself a pat on the back for being a better person than a far-right thug with swastika tattoos, throwing bricks through the windows of a mosque. You are objectively better than that piece of scum.

However, you can still be doing something racist without deliberately choosing to be racist and without being motivated by hate. You could be using software that incorrectly flags people of colour for police questioning, and never stop to ask what’s happening or what effect it has on the people being wrongly questioned by the police.

This idea that racism is only done by people like the vile dirt-bags depicted in the opening of A Time To Kill, people who have hate in their hearts and the N-word on their lips, leads many white people to either ignore or become angry at the people of colour who try to explain how racism is often subtler than the clear morality of a Hollywood film. Racism doesn't have to announce itself with Sieg Heils and street fights. It can be brought quietly into the world by people with the best of intentions, who are still subject to the deep-rooted biases in society.

Deeply embedded racism

Coded Bias shows how deeply racism is embedded into our society and how it manifests in surprising ways that have profound effects on people’s lives. The film ends with Buolamwini testifying before Congress about the serious impact that biases in facial recognition software can have.

We need to be aware of how our technology can replicate the deep-rooted injustices in our society. We also need to be aware that just because something isn’t obviously racist that doesn’t mean it’s not racist.
