Can technology be racist?

April 27, 2021
Technology isn't neutral. Take a look under the hood of an algorithm to find out how easily racial biases can be amplified, with harmful effects.

Yes, technology can be racist.

"But wait, technology is a 'thing'. It's neutral. Only people can be racist."

Let's unpack this statement a bit. Yes, technology in and of itself is not human or conscious, although it's quite good at emulating our behaviors:

An illustration of a set of lips surrounded by loudspeakers on a restaurant table, with a "reserved" card placed on the table.
The artificial intelligence assistant, Google Duplex, books your reservations for you with a chilling, human-like voice (Credit: New York Times)

But technology doesn't emerge out of nothing. It's imagined, shaped and structured by humans.

Humans with bias. Humans with implicit bias. Humans with implicit racial bias. And with technology being increasingly powered by algorithms, we’re seeing bias being amplified in ways that easily escape our control.

It shows up in facial recognition systems used by police, credit scoring algorithms used by banks, healthcare algorithms used by hospitals, and social networking algorithms that have been repeatedly found to promote racial profiling, discrimination and inequity. We know this thanks to investigative research and deep analysis by independent researchers, scholars and journalists.

The consequences? Disproportionate harm and adverse effects for underrepresented groups, including lowered self-esteem, detrimental effects to mental health, diminished access to housing and home ownership, predatory interest rates, and higher probability of arrests, incarceration and deportations.

In other words, real damage caused to real people.

Is algorithmic bias intentional?

Are engineers at Big Tech companies and government institutions writing algorithms to discriminate against and exclude specific groups? Yes and no; the answer falls into a bit of a gray zone.

On one hand, it could be just a blind spot: a lack of awareness of the potential harm that technology can wreak when biases are unconstrained. In this case, the data samples on which algorithms are trained are (unknowingly) biased toward majority groups because no one ever bothered to ask, "Who may be underrepresented in our data?"
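That question can be asked in code before a model is ever trained. Here's a minimal sketch of such an audit; the dataset, the skin_tone field and the population shares are all hypothetical, and a real audit would use your own demographic categories and census or user-base figures.

```python
# A minimal sketch (hypothetical data and field names) of the audit step
# described above: compare each group's share of the training data to
# its share of the population the system will actually serve.
from collections import Counter

def representation_report(records, group_key, population_shares):
    """Flag groups whose share of the data falls well below their real-world share."""
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    for group, expected in population_shares.items():
        observed = counts.get(group, 0) / total
        flag = "  <-- underrepresented" if observed < 0.5 * expected else ""
        print(f"{group}: {observed:.1%} of data vs {expected:.1%} of population{flag}")

# Hypothetical example: a face dataset heavily skewed toward one group.
records = [{"skin_tone": "lighter"}] * 900 + [{"skin_tone": "darker"}] * 100
representation_report(records, "skin_tone", {"lighter": 0.6, "darker": 0.4})
```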

But it can also be the deliberate denial of equal access to solutions and opportunities for minority groups by political actors and financial institutions; such are the historical underpinnings of Western capitalist societies where white supremacy reigns.

Regardless of the reasons, it's short-sighted to dissociate intentions from impact. Even if there is no "racist" intention to harm minority groups, algorithmic bias still leads to racist outcomes, and the teams behind this technology should not be shielded from responsibility for the harms they produce. They need to take ownership and be accountable for doing better.

In other words, ethics in AI matters.

Take a look under the hood of an algorithm.

So what's actually happening under the hood that leads to such amplification of bias?
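Before diving into the resources below, here is a self-contained toy model of the core mechanism (entirely synthetic data, not any real company's system): when one group dominates the training data, a model tuned for overall accuracy can fail the minority group completely while still looking excellent on paper.

```python
# Toy illustration with synthetic data: a "model" chosen purely for
# overall accuracy quietly sacrifices the minority group.
import random
random.seed(0)

# Group A (90% of data): the label equals the feature.
# Group B (10% of data): the label is the OPPOSITE of the feature.
data = [("A", f, f) for f in (random.randint(0, 1) for _ in range(900))]
data += [("B", f, 1 - f) for f in (random.randint(0, 1) for _ in range(100))]

# "Training": pick whichever rule has the best overall accuracy.
rules = {"predict feature": lambda f: f, "predict opposite": lambda f: 1 - f}
best = max(rules, key=lambda r: sum(rules[r](f) == y for _, f, y in data))

for group in ("A", "B"):
    rows = [(f, y) for g, f, y in data if g == group]
    acc = sum(rules[best](f) == y for f, y in rows) / len(rows)
    print(f"group {group}: accuracy {acc:.0%}")
# Overall accuracy is 90%, yet every single prediction for group B is wrong.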

If you only have a few minutes, start with a short article from The Guardian on the racial bias in Twitter's image cropping algorithm.

If you have 20 minutes, watch the fantastic video below from the media journalism team at Vox where they dig into the black box of that same Twitter algorithm in an approachable "explain-it-like-I’m-five" sort of way.

And if you have a few hours, read the book Race After Technology by Ruha Benjamin, sociologist and Professor of African American Studies at Princeton University (who is also featured in the aforementioned Vox video). It dives much deeper into the numerous ways technology amplifies social hierarchies, and into possible solutions.

The black-and-white book cover for Race After Technology, featuring a bald Black woman with sunglasses set against a background of circuits

Is there a way out of this?

Yes (sort of).

There's the diversification of teams to incorporate unique perspectives into decision-making behind the scenes.

Credit: Women of Color in Tech Chat

There's the diversification of data sets to more equitably represent the wide spectrum of human diversity (one simple version of this is sketched in code after this list).

There’s the reworking of processes and workflows to lead to more inclusive, ethical and equitable solutions for all sorts of people.

There’s the acknowledgement of implicit, institutional and systemic bias and how it’s intrinsically linked to systems of power and privilege that surround us.

And then there’s the dismantling of such systems of oppression.

But the first step is just to recognize that there is even a problem.

And that whether you are an end user, a content creator, an engineer, a designer, a marketer, or a business leader, you have a role to play in it.

Whether it’s choosing to remain in ignorance or inaction.

Whether it's acknowledging your complicity and starting to adopt more inclusive practices, to call out bias when you see it, and to make adjustments to your workflows to address your blind spots.

Or whether it's actively working to dismantle the whole system through ongoing advocacy and activism.

What's next?

Credit: James Eades

I have one ask for you if you wish to go beyond inaction: spread the message. Share a link to this article or to the resources shared here with your network. Or have a chat with your friends and colleagues about the topic of bias in technology and ways that you could raise awareness around it.

Want to join me in becoming a product inclusion advocate? Get in touch and let’s chat further.
