If you thought that biases and divisions were a purely human thing, something only humans construct, you were a little wrong. In many cases, it has been discovered that the algorithms and interfaces our search engines rely on can be biased too.
Research conducted by students from the University of Maryland and the University of Washington exposes how gender bias works its way into search engines when we look up pictures representing careers and jobs.
To begin with, they ran a quantitative analysis to check whether the proportion of men and women in image search results for professions matches their representation in the actual workforce. The researchers did this by comparing the number of women who appeared in the top 100 Google image search results for 45 different occupations, ranging from bartender to chemist to welder. They then conducted a qualitative analysis to see how men and women are depicted in those results. (The study covered the U.S. population and is used here only as an example.)
The results were striking. For example, according to their analysis, the majority of U.S. authors are women (56%), yet the image search shows only about 25% of authors as women.
In another example, image results for "doctors" retrieved significantly more pictures of men than the share of men who actually work in the field. The opposite was true for "nurses." In other words, the researchers showed that image search engines misrepresent gender.
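The study's quantitative step can be sketched as a simple comparison between two proportions per occupation. The sketch below is illustrative only: the "author" figures are the ones quoted above, while the other occupation numbers are hypothetical placeholders, and the function name is my own.

```python
def representation_gap(share_in_results, share_in_workforce):
    """Percentage-point gap between women's share in image results
    and their share in the actual workforce (negative = under-represented)."""
    return share_in_results - share_in_workforce

occupations = {
    # occupation: (women's share of top-100 image results, share of actual workforce)
    "author": (0.25, 0.56),     # figures quoted in the study
    "bartender": (0.60, 0.55),  # hypothetical numbers for illustration
    "welder": (0.10, 0.05),     # hypothetical numbers for illustration
}

for job, (results, workforce) in occupations.items():
    gap = representation_gap(results, workforce)
    direction = "under" if gap < 0 else "over"
    print(f"{job}: women {direction}-represented by {abs(gap) * 100:.0f} points")
```

For "author" this prints an under-representation of 31 percentage points, which is the kind of per-occupation gap the researchers aggregated across all 45 jobs.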
Is it just Gender?
Surprisingly, this bias is not restricted to the professional work environment, or to women; it even comes up in your most unexpected searches. For example, when "unprofessional hair colour styles" was googled, the majority of the image results showed women of colour with funky hairstyles. On the other hand, when "professional hair" was googled, the majority of pictures were of white women.
Or, keeping women aside for a while and talking specifically about colour: a search for "black teenager" brought back a disturbing number of mug shots, whereas a search for "white teenagers" returned what looked normal, i.e. actual teenage kids having fun. And yes, surprisingly, all of this is served up by your search engine.
Your everyday life
Moving on to the platforms we use most, or in layman's terms social media websites: surprisingly, even with so many young and empowered thinkers behind them, they end up with these biases in their search engines too. A prominent example is Pinterest, a visual discovery engine for finding thoughts and ideas, and at times even those ideas end up taking sides when searched. Something similar goes for the graphic design website Canva, which lets users create visual content with far more ease than its more complicated rivals. When images of people in leadership roles were searched for there, women were heavily underrepresented in the results.
The Filter Bubble
Now the question that arises is: where do these search engines get the biases in their algorithms? The best way to understand this is at a micro, or individual, level. The key term here is the Filter Bubble: a situation in which an Internet user encounters only information and opinions that conform to and reinforce their existing beliefs, brought about by algorithms that personalise a person's online experience. It's as if they're reading our minds. You see it there because you looked it up earlier, because you think about it, at times maybe more than you should.
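The feedback loop behind a filter bubble can be shown with a toy simulation. This is a minimal sketch under an assumed ranking rule (items you clicked before score higher), not how any real search engine works; all names here are hypothetical.

```python
# Toy filter-bubble loop: past clicks boost an item's rank, the user clicks
# the top-ranked item, which boosts it further, so one viewpoint dominates.

def rank(items, click_history):
    # Assumed rule: score = 1 + number of past clicks on that item.
    # Python's sort is stable, so ties keep their original order.
    return sorted(items, key=lambda item: -(1 + click_history.get(item, 0)))

items = ["viewpoint_A", "viewpoint_B", "viewpoint_C"]
clicks = {}

for step in range(5):
    top = rank(items, clicks)[0]          # the user mostly clicks the top result
    clicks[top] = clicks.get(top, 0) + 1  # ...which feeds back into the ranking

print(clicks)  # → {'viewpoint_A': 5}: after five rounds, one viewpoint has all the clicks
```

Whichever item wins the first tie-break captures every subsequent click, which is exactly the narrowing the filter-bubble idea describes.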
And this again brings in people's ideological leanings. They don't search for something because they don't want to see it. They are happy in their little bubble and in fact at times get defensive if someone with a different ideology tries to intrude on their space; one study revealed that an attack on these people's ideologies feels to them like an attack on themselves.
One of the respondents, when asked about this, said,
“I always thought about it this way- if women in my house never get the chance to study and eventually to take up jobs for whatever reason they might not search about it and then Google will show them that only i.e. their search might never want to explore the opposite side since they have never seen someone or even themselves experienced how it feels like being in a position of power.”
Which means that if there are no examples around, people might not be prompted to change their thinking patterns and, subsequently, their search behaviours. But countries like New Zealand pose an altogether different example: their prime minister Jacinda Ardern, a woman leader, was all over the news for her phenomenal policies in curbing the COVID-19 pandemic across the nation. So at times your geographical context might also make your algorithms work differently.
So the only thing that can improve the situation is how we look these things up, and how open we are to accepting new ideas.