
Why are algorithms sexist?



Nothing can describe my state of mind when, for the fiftieth time, my beloved translation software insisted on rendering in the masculine the beautiful feminine phrasing I had asked it to translate into English.


My interlocutor, the program, made himself comfortable, using all the unisex grammatical latitude that Shakespeare's language affords him to surreptitiously dissolve my "elle" or "sa" into "he" or "his".


So, I wanted to know more: why is it that, even in a machine, the masculine gender spontaneously predominates? It's the algorithms, I thought. Yet deep down I suspected that these algorithms, which I pictured as an unintelligible mathematical sequence, couldn't have a gender preference. So, like any self-respecting curious person, I went to find out.


Point 1: The algorithm for dummies.


An algorithm is a kind of step-by-step instruction manual for reaching a given goal. The comparison most often suggested is the recipe. For example, if your goal is to make a pizza, then your algorithmic recipe will be: take flour, then water, then olive oil, then tomato sauce, and so on, through all the steps leading up to putting the shaped pizza in the oven at 180 degrees.
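To make the comparison concrete, here is a minimal sketch of that "recipe as algorithm" idea in Python. The step names and the function name are my own illustrative choices, not something from an actual recipe program:

```python
# A recipe expressed as an algorithm: an ordered list of steps,
# executed one after the other until the goal (a baked pizza) is reached.

STEPS = [
    "take flour",
    "add water",
    "add olive oil",
    "spread tomato sauce",
    "shape the pizza",
    "bake at 180 degrees",
]

def make_pizza(steps):
    """Carry out each step in order and return the log of what was done."""
    log = []
    for step in steps:
        log.append(step)  # in a real kitchen, this is where the work happens
    return log

print(make_pizza(STEPS))
```

The point is simply that the machine does exactly what the list says, in the order the list says it, and nothing more.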


Things become more complicated when there are several possible options. If, at the moment of topping the pizza, several choices are possible, the algorithm can represent the variables as a decision tree, branching on what you have in the fridge. For example, at the "choice of topping" step, the algorithm could specify: if there are mushrooms, put them on; if there are no mushrooms but there is ham, put ham on; if there are neither mushrooms nor ham, put nothing but tomato sauce. And if there are both mushrooms and ham, the algorithm can look at your past choices and decide by itself what to offer you. That, in essence, is how you get from this recipe to the self-driving car, to medical diagnosis by artificial intelligence, or to real-time analysis of the financial markets.
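The branching rules above can be sketched in a few lines of Python. The function name and the "past choices" rule (pick whichever topping you chose more often before) are my own simplifications for illustration:

```python
def choose_topping(fridge, past_choices=None):
    """Decide the topping by following the branching rules described above."""
    has_mushrooms = "mushrooms" in fridge
    has_ham = "ham" in fridge
    if has_mushrooms and has_ham:
        # Both available: fall back on past preferences,
        # a very crude stand-in for "learning" from your history.
        past_choices = past_choices or []
        if past_choices.count("mushrooms") >= past_choices.count("ham"):
            return "mushrooms"
        return "ham"
    if has_mushrooms:
        return "mushrooms"
    if has_ham:
        return "ham"
    return "tomato sauce only"

print(choose_topping({"mushrooms", "cheese"}))                     # mushrooms
print(choose_topping({"ham"}))                                     # ham
print(choose_topping(set()))                                       # tomato sauce only
print(choose_topping({"mushrooms", "ham"}, past_choices=["ham"]))  # ham
```

Each `if` is one branch of the tree; the last case shows where the machine's "own" decision sneaks in, and that decision is entirely shaped by the history it was given.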


Of course, behind this process, there is a human being who translates this recipe into terms that a machine can understand, and that's where the problem lies.


Flora Vincent and Aude Bernheim, in "Artificial Intelligence, not without them!", published in 2019, point out that AI reflects the society in which we live and its biases. Aude Bernheim takes precisely the example of translation software.


Hallelujah! I wasn't completely wrong to wonder. Thus, "the doctor" is always "le docteur" and "the nurse" is always "l'infirmière".
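A toy illustration may help show why, assuming a crude model that is nothing like a real translation system: if the software simply picks the gendered form it has seen most often in its training data, the majority form wins every time and the minority form is never produced at all. The corpus counts below are invented for the example:

```python
from collections import Counter

# Hypothetical training-data counts: how often each French form
# appeared alongside the English word in the corpus.
corpus = {
    "doctor": Counter({"le docteur": 980, "la docteure": 20}),
    "nurse": Counter({"l'infirmière": 950, "l'infirmier": 50}),
}

def translate(word):
    # Most-frequent-form rule: the majority gender always wins,
    # so the imbalance in the data becomes an absolute rule in the output.
    return corpus[word].most_common(1)[0][0]

print(translate("doctor"))  # le docteur
print(translate("nurse"))   # l'infirmière
```

Real systems are far more sophisticated, but the mechanism is the same in spirit: a statistical tendency in the data hardens into a systematic choice in the output.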


You'll tell me it's rather anecdotal, but it is much less so when facial recognition software is trained mostly on images of white men, or when the algorithms of some AIs have learned to associate the female gender with the kitchen of a house from thousands of images of women in that type of room. Existing biases are not only replicated but also, and above all, amplified, as Hannah Kuchler notes in a Financial Times article of March 9, 2018, "Tech's sexist algorithms and how to fix them".


Even more troubling is the algorithm that determines a lower salary range for women, based on statistics about what a woman in this or that position has earned in the past.