The Racial Bias Being Baked Into Our Algorithms

22 Aug 2016
My fellow Presidential Innovation Fellow Mollie Ruskin (@mollieruskin) was doing some work with veterans recently and stumbled across a pretty disturbing example of how racial bias is being baked into the algorithms that are driving our online, and increasingly offline, worlds.
This morning I was searching #Google for images of #Veterans for a project. I stumbled upon a photographer who had taken hundreds of beautiful photographs of Veterans all in the same style.
I clicked on a few striking portraits... and quickly noticed something very troubling.

When doing image searches, I often use the 'related images' feature to uncover more pictures relevant to what I'm hunting for, as was the case this time around. While most of the photos returned related images of other veterans, one photo of a smiling black male vet in his uniform fatigues garnered a series of related images that were all mugshots of CRIMINALS.
The tools we use to fuel our 21st century lives are not the seemingly neutral blank slates we imagine them to be. They are architected and shaped by people, informed by our conscious and unconscious biases. Whether this is reflecting back a dark mirror on what people click on, or surfacing a careless design choice in an algorithm, this random search result shines a little more light on the subtle and insidious ways racism is baked into our modern lives.
For my friends who work at the big tech giants, which are increasingly the infrastructure of our lives, please help make sure your institutions are addressing this stuff. (And thanks to those of you who already have been.)
PS: Recently saw a great talk about this idea of 'oppression' in our algorithms. Def worth a watch: https://www.youtube.com/watch?v=iRVZozEEWlE
This isn't some weird edge case; this is what happens when we craft algorithms using development teams that lack diversity. Racial bias continues to get baked into our algorithms because we refuse to admit we have a problem, or are unwilling to actually do anything about it. Sadly, this is just one of the many layers in which bias is being built into our algorithms, which increasingly decide everything from what shows up on your Facebook wall to predicting who will commit crimes in the future.
You will hear more stories like this on API Evangelist as I push forward my APIs and algorithms research, working to identify ways we can use open source and open APIs to make these often black box algorithms more transparent, so we can potentially identify the bias inside. Even with this type of effort, we are still left with the hard work of changing the culture that perpetuates this--I am just focusing on how we crack things open, and more easily identify the illness inside.