Are AI Algorithms Misogynist? Will Tech Go Against Women?


How can AI algorithms be racist and sexist? Do they discriminate against women even in the digital world?

Even though the first person to write an algorithm was a woman in the 19th century, artificial intelligence may now be discriminating against women. AI systems can inherit, or even amplify, the biases of their creators, and most AI hasn't heard about the global feminist movement. Researchers in artificial intelligence have issued a stark warning against the use of race- and gender-biased algorithms for making critical decisions.

AI bias occurs when results cannot be generalized widely. We often think of bias as resulting from preferences or exclusions in training data, but bias can also be introduced by how data is obtained, how algorithms are designed, and how AI outputs are interpreted. In 2018, researchers discovered that popular facial recognition services from Microsoft, IBM, and Face++ can discriminate based on gender and race. Microsoft's service failed to correctly classify darker-skinned women 21% of the time, while IBM's and Face++'s failed on darker-skinned women in roughly 35% of cases. Many such AI gender biases are occurring right now, and they affect the daily lives of all women: from job searches to security checkpoints at airports.
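Disparities like the ones reported in that 2018 audit only become visible when a model's error rate is broken down by subgroup rather than averaged over everyone. A minimal sketch of such a disaggregated audit in Python (the numbers below are illustrative, not the study's actual data):

```python
def error_rate_by_group(records):
    """records: list of (group, was_prediction_correct) tuples
    from auditing a model. Returns the error rate per group."""
    totals, errors = {}, {}
    for group, correct in records:
        totals[group] = totals.get(group, 0) + 1
        if not correct:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / totals[g] for g in totals}

# Hypothetical audit of a face-analysis model: near-perfect on one
# subgroup, failing roughly a third of the time on another.
audit = (
    [("lighter-skinned men", True)] * 99 + [("lighter-skinned men", False)] * 1
    + [("darker-skinned women", True)] * 65 + [("darker-skinned women", False)] * 35
)

rates = error_rate_by_group(audit)
print(rates)  # {'lighter-skinned men': 0.01, 'darker-skinned women': 0.35}
```

An aggregate accuracy of 82% over this data would look acceptable; only the per-group breakdown reveals the 35x gap in error rates.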

How do AI algorithms discriminate against women?
  • An employer advertising a job opening in a male-dominated industry via a social media platform found that the platform's ad algorithm pushed the job only to men, in order to maximize returns on the number and quality of applicants.
  • A white man passing through an airport will clear security quickly, while a woman with dark skin may be left waiting in a long line.
  • In job recruitment, because the industry is male-dominated, the majority of the résumés used to train the AI were from men, which ultimately led the AI to discriminate against recommending women.
  • Facebook posted ads for better-paid jobs to white men, while women and people of color were shown ads for less well-paid jobs.
  • Face-analysis AI programs display gender and racial bias: they work worse for women with dark skin, showing low error rates when determining the gender of lighter-skinned men but high error rates when determining the gender of darker-skinned women.
  • Voice-activated systems in cars are tone-deaf to women's voices.
  • Google searches for 'black girls' produced sexist and pornographic results.
Amazon's AI recruiting tool that showed bias against women:

One of the best-known cases of discrimination based on the use of artificial intelligence was Amazon's attempt at automating its recruitment system: the new recruiting engine did not like women. Amazon's experimental recruiting engine followed the same pattern as top U.S. tech companies, which have yet to close the gender gap in hiring. The algorithm penalized résumés that included words related to the female gender, even in candidates' hobbies. In 2018, it came to light that the American multinational had discarded the AI tool, which it had used to screen job candidates for four years, because it was sexist.

That is because Amazon's computer models had been trained to vet applicants by observing patterns in résumés submitted to the company over a 10-year period. Most of these résumés came from men, a reflection of male dominance across the tech industry, creating a machine-learned bias that favored male applicants.
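The mechanism described above, in which a model learns to penalize female-gendered words purely because they are rare in historically "successful" résumés, can be shown with a toy word-scoring sketch. Everything here is made up for illustration (this is not Amazon's system or data):

```python
from collections import Counter

# Hypothetical historical outcomes: (résumé keywords, was hired).
# Most hires are male-coded résumés, mirroring a skewed industry.
history = [
    (["software", "captain", "chess", "club"], True),
    (["software", "engineering", "lead"], True),
    (["backend", "systems", "architect"], True),
    (["software", "women's", "chess", "club"], False),
    (["engineering", "women's", "college"], False),
]

def train_word_scores(examples):
    """Score each word by how much more often it appears in hired
    vs. rejected résumés (a crude odds-ratio-style signal)."""
    hired, rejected = Counter(), Counter()
    for words, was_hired in examples:
        (hired if was_hired else rejected).update(words)
    vocab = set(hired) | set(rejected)
    # +1 smoothing so unseen counts don't divide by zero
    return {w: (hired[w] + 1) / (rejected[w] + 1) for w in vocab}

scores = train_word_scores(history)
# "software" is correlated with past hires, so it scores high;
# "women's" appears only in rejections, so it scores low -- the
# model has learned the bias baked into the historical data.
print(scores["software"] > 1.0)   # True
print(scores["women's"] < 1.0)    # True
```

The model never sees a "gender" field; it penalizes the word "women's" (as in "women's chess club captain") simply because the skewed history made that word a proxy for rejection.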



Analytics Insight
www.analyticsinsight.net