‘Bitches Near Me’ And The Sexist AI: A Mirror To Our Misogynist Society

The ‘bitches near me’ search results are a testimony to how we use the internet and are reflective of our patriarchal and sexist cultures.

Typing ‘bitches near me’ into Google’s search generates results containing the addresses of girls’ schools, women’s accommodation, and women’s stores. The news broke when a Twitter user tweeted about it after coming across it on a Facebook page, prompting multiple other users to try the search themselves, all of whom were met with the same results.

Google’s algorithm viewed the word ‘bitches’ as a synonym for women. A Google search for the phrase ‘women near me’ returns similar results as well. Speaking to The News Minute, Pranesh Prakash, a fellow at the Centre for Internet and Society, said, “Google knows that ‘bitches’ can be slang for women. For many years now, Google has been trying to understand and search for what you meant than what you might have typed. However, if you enclose this search phrase in quotes, you will not get the same results, because then it will look for the phrase.” He also added that although he doesn’t think this gives rise to a safety concern, it is certainly a reflection of the misogynistic language in use online.

However, Nayantara R, who works for the Internet Democracy Project, told The News Minute that given the clout and monopoly that Google enjoys, it should be held accountable. She also said, “the Google page ranking is one of the most protected patents. No one knows how or why certain pages come before the others. In this case, certain search results coming first does not mean that Google is misogynistic, but the algorithmic decision that is taken by someone at Google to rank certain pages which promote certain values before others does have a social impact. All technology is a result of how it is used. Accountability is required there.”

This isn’t a one-off incident, though. Last year, Shashi Tharoor tweeted to Google with screenshots of the vastly different results generated when ‘South Indian Masala’ and ‘North Indian Masala’ were typed into its search. While the latter search returned pictures of Indian spices, the former returned pictures of hyper-sexualised women. Even though this was a year ago, Google has done nothing to fix this sexist and problematic search result.

Google Translate has also been accused of being sexist in the past. It began to associate certain professions with men, based on gender stereotypes. Turkish has gender-neutral pronouns, but when Turkish sentences were translated to English using Google Translate, the gender-neutral pronouns were replaced with gendered English ones, chosen on the basis of sexist stereotypes about the intersection of gender and ‘competency’ in certain professions.

Also read: The ‘Hidden Truth’ Of The Women In Science

‘O bir mühendis’ becomes ‘he is an engineer’, while ‘o bir hemşire’ translates to ‘she is a nurse’. The same issue was found with translations from Malay to English. A research paper posted on Cornell University’s arXiv found that word-embedding models trained on Google News articles displayed gender stereotypes to a disturbing extent.
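To make this concrete, here is a minimal, hypothetical Python sketch of how such bias can be probed, assuming the gensim library and the publicly released Google News word2vec vectors (the file name below is an assumption); it illustrates the general technique, not the code used in the paper.

from gensim.models import KeyedVectors

# Load the pretrained Google News word2vec vectors
# (assumed to have been downloaded separately under this file name).
model = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# Analogy probe: "man is to doctor as woman is to ___?"
# Embeddings trained on biased text tend to return stereotyped completions such as "nurse".
print(model.most_similar(positive=["woman", "doctor"], negative=["man"], topn=3))

# Compare how close profession words sit to "he" versus "she" in the vector space.
for job in ["engineer", "nurse", "programmer", "homemaker"]:
    print(job, model.similarity("he", job), model.similarity("she", job))

Running such probes on embeddings trained on large news corpora tends to surface exactly the gendered associations the paper describes.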

Speaking to Forbes about this, Kate McCurdy, a senior computational linguist, said that there isn’t anything wrong with the word-embedding model itself, but that it needs human guidance and oversight, and emphasised that fires will have to be put out as they come up. As a long-term solution, McCurdy suggested that the developers of these applications integrate a more critical perspective and hire a more diverse range of developers, who might be able to catch these issues earlier in the process.

In line with its previous record, Google hasn’t fixed this issue either.

Google did, however, remove gendered pronouns from Gmail’s Smart Compose feature today, after the AI began making sexist predictions of gendered pronouns. The issue came to light when one of the company’s research scientists typed, “I am meeting an investor next week,” and Smart Compose followed that up with, “Do you want to meet him?”

Google has also had numerous issues with its autocomplete feature, wherein it often suggests racist and bigoted completions. In 2016, the autocomplete suggestion ‘Jews are evil’, which appeared when ‘Jews are’ was typed in, made news; Google soon rectified it, but that didn’t mean the end of such troubling suggestions. Autocomplete continued to suggest things like ‘blacks are not oppressed,’ ‘Hitler is my hero,’ and ‘feminists are sexist.’ Some of these have been fixed, but the problem is far from gone.

Although all these incidents are a reflection of the sexist language we use, Google cannot be absolved of responsibility because, in most of these instances, it never rectified the error.

Recently, when Amazon’s AI tool created to recruit software developers was discovered to show bias against women, Amazon scrapped the recruiting tool promptly – a practice that Google often seems unable to follow. Algorithms will probably never be perfect and free of bias, but corporations like Google are responsible for putting out these fires as and when they come up, immediately and effectively.

These sexist AIs are a result of their algorithms picking up on our sexism and discriminatory practices. The ‘bitches near me’ search results are a testimony to how we use the internet – in a misogynistic, sexist manner that reflects our patriarchal and sexist cultures, where women are still constantly deprived of the opportunities presented to men and the space men are allowed to occupy.

Also read: Facebook’s Community Standards Suppress A Marginalized Voice Again 

The AI universe, with its sexism and misogyny resulting in lost opportunities and long-term repercussions for women, is now a microcosm of our deeply patriarchal societies. With AIs turning sexist daily, maybe it is time we deeply reflect on our sexist societies and what they cost women, be it lost opportunities, violence, or everyday sexism.


Featured Image Source: The Quint
