Is Google racist? 

The Google Images results for "three black teenagers" are strikingly different from those for "three white teenagers". But what does that actually mean? 

By Caroline O'Donoghue

Is Google racist? Well, yes and no. In much the same way that a calculator can spell "BOOBIES" if you tell it to, the algorithm behind Google search runs on what humans tell it. Google results are based on relevance – and only humans can determine what is or isn't relevant to them. 

An example: say you're writing an article about peacocks. But you don't want a shy-looking peacock ambling about with its feathers tucked in, you want a glorious purple-and-royal-blue spread of feathers. So you find the right image on Google Images, put it in your article, and title it "12 Things You Didn't Know About Peacocks."
Now imagine two million people do the same thing. Pretty soon, that's what Google thinks peacocks look like all the time, so it becomes the top search result. Before long, enough people have Googled it that we forget peacocks can tuck their feathers in. 

All of this would be pretty much fine if not for one thing: the media. 

In a Twitter video posted by @iBeKabir on June 7, a group of friends gather around an iPhone and Google Image search "three black teenagers" to find an array of police mugshots. They change it to "three white teenagers" and immediately crack up at the results: stock photos of smiling, wholesome white kids with sports equipment. 

It's hilarious, upsetting and a startling look at a huge problem – one that's bigger than Google, if anything can be. It's a culture where the term "black teenager" isn't a descriptor, it's a death sentence. Where news stories about black children being shot by white adults are littered with references to the victim carrying "what looked like a gun" but wasn't, what "looked like a threat" but wasn't. Where pictures of mothers and fathers protesting the deaths of their children are labelled as antisocial rioters. And, as in the case of Brock Turner, where white criminals are protected and praised as promising athletes, while black ones are held up as horror stories. 

And so, Google learns from our behaviour. It learns, in its robotic and systematic way, that the humans are calling black teenagers criminals, so that is what they must be. The humans are employing only white, male, able-bodied CEOs, so that's what they must be. There's no space for the brilliant black teenagers or the female CEOs, because Google deems them too rare to exist.

No, I don't think Google is racist. I think people are racist. I think people are sexist, and I think they make grossly unfair judgements and use the media to sensationalise those judgements. "Fixing" Google would only be a temporary stopgap – we need to fix ourselves first. 
