Embedded Biases
I vividly remember a scene from a movie I saw as a teenager. In a class full of students, a girl defines the word history as, “His or her story.” That was the first time I realized the words we use implicitly carry gender. “If the story is about both her and him,” I thought, “why the need to represent one by the other?” That idea took root in me, though my childhood mind did not comprehend the effects it could have. While I tried my best to be word-conscious in my life, it was not until I was in university that I understood these effects extend into the realm of machines.
Artificial Intelligence is a hot topic nowadays. Being interested in the technology, I follow news on the matter closely, and so I stumbled upon a TED Talk given by Dr. Timnit Gebru, a Stanford graduate in electrical engineering. In the video, she uses a simple analogy to demonstrate the implicit bias we carry as a society. The analogy is not uncommon: people immediately associate women with nurses and men with doctors. I would know, because I am one of those people. Though it pains me to admit it, I am not immune to making and receiving such biased judgments. Bias does not necessarily mean calling a woman doctor a nurse; it is reflected in one’s surprise at finding a woman in a top position at a company.
What does all this have to do with computers?
Computers are trained on existing data. We, humans, create this data, which means it includes all the biases we have placed in our language. What makes it more difficult is that when a human makes a mistake, you can point it out to them and make them aware of it. That is how I had my “awakening” moment when it comes to language. But when you try to correct a mistake a machine has made, the task is not so simple. For one thing, if you had the time to sit down and analyze the data yourself, you would not need the machine in the first place; it would have defeated its purpose.
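You can actually watch a machine inherit our associations. Here is a minimal sketch that probes a popular set of word vectors; it assumes gensim and its downloadable “word2vec-google-news-300” model (learned from news articles written by people), and whatever it prints depends entirely on that human-written corpus:

```python
# A minimal sketch: probing the gender associations a model absorbs
# from human-written text. Assumes gensim is installed; the pretrained
# vectors (~1.6 GB) are downloaded on first use.
import gensim.downloader as api

# Word vectors learned purely from news text written by humans.
model = api.load("word2vec-google-news-300")

# The classic analogy question: "man is to doctor as woman is to ...?"
# The model answers with whatever associations the corpus taught it.
for word, score in model.most_similar(positive=["woman", "doctor"],
                                      negative=["man"], topn=5):
    print(f"{word}: {score:.3f}")

# On these particular vectors, "nurse" typically ranks near the top.
# Nobody programmed that in; the model learned it from our data.
```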
In the book Weapons of Math Destruction, the author, Cathy O’Neil, explores the dangerous and often overlooked harms of mathematical models. The most interesting to me is the effect of Artificial Intelligence on the job market. Due to the biased training of computers, women found themselves on the rejected side of AI-evaluated job applications. Machines are not perfect. But when they make mistakes, they make them with perfect consistency, at a scale where the pattern becomes impossible to miss. If a human were undertaking the evaluation, gender would not be a sole determining factor; even a sexist person is bound by the policies of the company and the laws of the country. But when you give the same data to a sexist machine, rest assured it will throw the application right back at you.
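To make that concrete, here is an illustrative sketch with entirely synthetic data (not the actual system the book describes): a classifier trained on historical hiring decisions that were skewed against women faithfully learns gender as a predictor, even when qualifications are identical.

```python
# An illustrative sketch with synthetic data (not a real hiring system):
# if historical hiring decisions were biased against women, a model
# trained on them learns to reproduce that bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Features: years of experience (same distribution for everyone)
# and gender (0 = man, 1 = woman).
experience = rng.normal(5, 2, n)
gender = rng.integers(0, 2, n)

# Biased historical labels: equally qualified women were hired less often.
hired = (experience + rng.normal(0, 1, n) - 1.5 * gender) > 5

X = np.column_stack([experience, gender])
model = LogisticRegression().fit(X, hired)

# The model assigns a large negative weight to the gender feature:
# it has faithfully learned the discrimination baked into the data.
print("coefficients [experience, gender]:", model.coef_[0])

# Two identical candidates, differing only in gender:
print("man:  ", model.predict_proba([[6.0, 0]])[0, 1])
print("woman:", model.predict_proba([[6.0, 1]])[0, 1])
```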
What do we do now?
Women programmers all over the world are calling for more women in the field of Artificial Intelligence. The hope is that having more women involved in building these machines will result in less gender-based discrimination. Work is also being done on techniques to de-bias a model after training (one such technique is sketched below). At this point you might be asking: why not just feed it unbiased data in the first place? The problem is that there is not enough of that data out there to train computers at scale. The go-to training data, for example Wikipedia, is written by people from all over the world, each contributing their own biases. In one study, an AI was trained to read 3.5 million books and analyze the words associated with women and men. Beautiful and sexy were the two adjectives most frequently used to describe women, while men were portrayed as righteous and brave.
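One well-known post-training fix, proposed for word embeddings by Bolukbasi and colleagues, is to project the learned “gender direction” out of each word vector. Here is a minimal numpy sketch of that projection; the four-dimensional vectors are made up purely for illustration, since real embeddings have hundreds of dimensions:

```python
# A minimal sketch of one post-training de-biasing idea: remove the
# gender component from word vectors. Toy vectors, for illustration only.
import numpy as np

def debias(v, gender_direction):
    """Remove the component of v that lies along the gender direction."""
    g = gender_direction / np.linalg.norm(gender_direction)
    return v - np.dot(v, g) * g

# Estimate a gender direction from a definitional pair of words.
he = np.array([0.8, 0.1, 0.3, 0.4])
she = np.array([0.2, 0.7, 0.3, 0.4])
gender_direction = he - she

# A profession word that should be gender-neutral but is not.
doctor = np.array([0.7, 0.2, 0.5, 0.1])
doctor_debiased = debias(doctor, gender_direction)

# After the projection, the vector's component along the gender
# direction is zero: this prints (numerically) 0.0.
g_unit = gender_direction / np.linalg.norm(gender_direction)
print(np.dot(doctor_debiased, g_unit))
```

The projection is a patch, not a cure: it scrubs one measured direction out of the geometry, but the biased text the model learned from remains, which is why the paragraph above argues for better data in the first place.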
Of course we need more women in AI! But the problem is not just in the people coding the machines. It is in the literature and the language we use. We need more Maya Angelous, Chimamanda Ngozi Adichies, Oprah Winfreys and Michelle Obamas providing us with enough literature to train machines that are not sexist and not racist. There are male writers such as Dr. Sharon Moalem, who wrote The Better Half: On the Genetic Superiority of Women, and Yuval Noah Harari, who places women in esteemed positions in many of his examples. We have Trevor Noah painting a beautiful picture of his strong mother in his book Born a Crime. We need more of them too!
What about other languages?
Currently, most of the work being done on AI, especially with regard to language processing, revolves around the English language. That is not necessarily a bad thing. It means we don’t have biased AI spitting out our Amharic CVs, at least not yet. But it also means we need more women like Azeb Worku breaking boundaries and gifting the world with powerful women characters. It means we need more poets like Tigist Mamo coloring the world with stories of Tayitu. We most definitely need more herstory books and blogs on the unsung heroines: the brave women who fought wars and brought the enemy to its knees. We need more literature on the female farmers working tirelessly to provide us with food. We need all those and more, for if the time comes and we train our machines in our language, we are going to need a large enough stockpile of words that portray both women and men without bias.
Written by: Hellina Hailu