Film review: How Coded Bias reveals the racism within algorithms

When Joy Buolamwini, a PhD student at MIT, was conducting facial recognition experiments using artificial intelligence, she ran into one key setback: the technology couldn’t accurately process her face. Investigating further, she found that these programs struggle to recognize women’s faces more often than men’s, and have particular difficulty identifying black or brown faces.

When Buolamwini placed a white mask over her face and again used the facial recognition software, the AI immediately identified what was in front of the camera as a face.

She realized the AI she was using had a race problem. Machine learning is only as robust as the dataset fed into its system: a model recognizes objects or people based on the photos already in its “brain,” so if the information it’s given contains only white faces, it won’t recognize black faces. And because AI tech is predominantly created by white male scientists and engineers, the photo collections used as the foundation of AI datasets include very few diverse faces.

Buolamwini’s story is the main current running through the new documentary Coded Bias, debuting in Canada at the online Hot Docs Film Festival. Directed by Shalini Kantayya, who previously quarterbacked a doc on clean energy, Coded Bias uncovers the dirty secret behind AI, but not just what Buolamwini discovered about facial recognition’s bias.

A slew of other algorithm-based technologies push out diverse populations: trained automated risk-profiling systems disproportionately identify Latinx people as illegal immigrants, and credit scoring algorithms disproportionately select black people as risks and prevent them from buying homes, getting loans, or finding jobs.

Buolamwini told Frontline in 2019 (a quote which isn’t part of the doc): “When these systems fail, they fail most the people who are already marginalized, the people who are already vulnerable. And so when we think about algorithmic bias, we really have to be thinking about algorithmic harm. That’s not to say we don’t also have the risk of mass surveillance, which impacts everybody. We also have to think about who’s going to be encountering the criminal justice system more often because of racial policing practices and injustices.”

That’s why she and other like-minded AI analysts formed the Algorithmic Justice League in order to publicize the racial and gender bias embedded within AI systems. This is the kind of civic action that can be encouraging to those who think certain technologies will always hide in the shadows, their inner workings shrouded in mystery, only to spit out results that the public accepts without question.

The film also goes across the pond to London to profile Silkie Carlo, the director of Big Brother Watch, an organization that monitors the use of facial recognition AI by British law enforcement. Carlo explains how this technology violates civil liberties, and points out the growing number of citizens being misidentified. For example, Big Brother Watch found that the Metropolitan police force’s use of photo biometrics produced only 2% identification accuracy, while South Wales police fared little better at 8%.

Coded Bias does a fantastic job of warning us about the sly racism found in technologies that will only become more pervasive in the coming decade. After all, AI is embedded in Siri and Alexa, camera phones, chatbots, Google Images and more, and if we want to level the playing field and ensure racism doesn’t creep further into this sector, we can’t just stand still.

What I would have liked to see more of in this film, though, is the perspective of the white male scientists creating AI tech for major firms such as Amazon and Google. Are they going to bring more diversity to their datasets? How do they respond to Buolamwini’s discovery? If there is going to be change in this field, the major companies have to own up to their own biases, but we never get to hear them explain their position on camera.

This doc is inspiring for those of us who have long been interested in AI and its future. But it’s also frightening to recognize how biased this innovation can be, and how much we rely on the determination of researchers such as Buolamwini and her Algorithmic Justice League, who are focused on tipping the scales in favour of racialized voices that have long been discriminated against offline and are now discriminated against online, within AI systems, too.

You can still catch Coded Bias on the Hot Docs website by purchasing tickets to stream the film here.
