Is everything, and everyone, crazy biased?
How to fix our dirty rotten biases when we are part of the problem.
Image: Arthur Edelmans, Unsplash
I’m assuming you have a LinkedIn profile, or use it in some way. Are you biased?
Let’s say you are asked to screen candidates for a job. You pop into LinkedIn and check the profiles of 200 people, shortlist 25 candidates and schedule them for interviews. What could be wrong with that? You’ve done your job, right? Or have you outsourced your search to an algorithm?
Well, for a start, 57.2% of LinkedIn users are men. It’s not the fault of the network. Like it or not, the dataset is going to reflect that imbalance. A bias, in other words. Besides, a large percentage of LinkedIn users (60%) are between the ages of 25 and 34. Only 2.9% of them are above 55 years. So yes, sexism and ageism would be inherent to the database, wouldn’t they? ‘Baked in,’ as they would say. Is the network to blame, or are you exploiting its bias? I picked LinkedIn, but I could have picked any other network - Snap, Instagram, etc.
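To see how that plays out, here is a rough sketch - not from the article, and purely illustrative. It builds a hypothetical pool of 200 profiles using the percentages quoted above, then shortlists 25 at random. Even a screen with no preference of its own inherits the pool’s skew:

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

# Hypothetical candidate pool mirroring the stats quoted above:
# ~57.2% men, ~60% aged 25-34, ~2.9% over 55 (remainder 35-54).
pool = []
for _ in range(200):
    gender = "man" if random.random() < 0.572 else "woman"
    r = random.random()
    if r < 0.60:
        age_band = "25-34"
    elif r < 0.971:
        age_band = "35-54"
    else:
        age_band = "55+"
    pool.append({"gender": gender, "age_band": age_band})

# A "neutral" screen: shortlist 25 people uniformly at random.
shortlist = random.sample(pool, 25)
men = sum(1 for c in shortlist if c["gender"] == "man")
over_55 = sum(1 for c in shortlist if c["age_band"] == "55+")
print(f"Shortlist: {men}/25 men, {over_55}/25 over 55")
```

Run it a few times with different seeds and the shortlist hovers around the pool’s proportions: mostly men, almost nobody over 55. The bias isn’t in the sampling step; it was in the pool before you arrived.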
I was working on my recent podcast about bias in AI, and my guest said that yes, all software, and not just AI, is unapologetically biased. Don Wilde, a self-taught software programmer, explained why software biases are baked in, so to speak. On further digging around, I found something interesting not just about software, but also about the websites whose data could be hugely biased.
Imagine what biases we are reinforcing in ourselves each time we look up something - what we flippantly call ‘research.’
I’ve been testing some AI apps for the past two months. I’m just about to test Bard, the imperfect AI app from Google (which required me to join a waitlist, but I’m now cleared to go).
The reality is that software merely reflects the neuroses and biases of us humans. We could throw up our hands and learn to live with it. Or, we could scrutinize what it regurgitates. Some of us just rage against the machine.
_______________________________
Speaking of AI, I’m reading “The Big Nine” by Amy Webb. It’s very, very disturbing. The blurb describes it as a call to arms about the broken nature of artificial intelligence, and the powerful corporations that are turning the human-machine relationship on its head. The nine? They include those you may have forgotten about: Alibaba, Tencent, Baidu and IBM (plus the usual suspects: Amazon, Apple, Facebook, Google and Microsoft). The deep strategic planning going into AI in China is enough to give you or any foreign policy wonk nightmares. Which makes me wonder: have they been asleep at the wheel? Or are we simply amusing ourselves to death with ChatGPT, and missing the wood for the trees?
What’s more disturbing is that this book came out in 2019. Which means Amy Webb was, like a few others, firing warning shots pre-pandemic - shouting into the void.
______________________________
In related news, Microsoft (which has vested interests in ChatGPT) recently laid off its entire ethics and society team. That team was part of its AI division. If you read its PR, it’s very impressive, but it strikes me as a load of spin. “Right now, we do live in a world that is unfair and biased in many different ways,” says a spokesperson in a video on its website that outlines its six principles of responsible AI.
______________________________
It’s Holy Week. I love this week of the year, and I am struck by how discrimination and bias were endemic to life in Israel in the time of Christ. There was a throwaway line in the (very long) reading on Palm Sunday, in which Peter the apostle is put on the spot. Jesus is being questioned in the palace of the high priest, and Peter tries to watch at a distance. One of the maids accuses Peter, and says something along the lines of “The way you speak tells me you are one of his people.” Sort of like saying, “I can tell by your accent that you are not from here.”
Before that, the Pharisees hatched several conspiracy theories about Jesus because they could not stand someone from a poor town, Nazareth, coming in and destabilizing their power base.
____________________________
Are we hardwired to be biased? It’s a hard question. Harder to answer than trying to get to the bottom of whether the software we use is biased.