Google Denies Use of AI Algorithms for Laying Off Employees


Here is what Google has said about AI algorithms and layoffs, and why workers remain skeptical.

According to The Washington Post (WaPo), Google has denied employing AI algorithms in its layoff decisions, stating that "no algorithm is involved" in determining which employees were let go.

True or not, workers have reasons to be suspicious. The newspaper cited a recent survey in which 98% of human resources leaders at American businesses said they will use software and algorithms to "reduce labor costs" this year, yet only half of those leaders were confident the technology would make objective recommendations.

It is the darker side of a long-standing practice. Joseph Fuller, a professor of management practice at Harvard Business School, told WaPo that algorithms in large companies' HR departments have long been used to find the "right person" for "the right project."

The technology feeds a database known as a "skills inventory," which compiles a comprehensive list of each employee's skills and experience and helps businesses judge whether those are sufficient to accomplish their objectives.
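As a rough illustration of how a skills inventory supports this kind of matching, here is a minimal sketch in Python. The record fields, names, and scoring rule are hypothetical, not the schema of Google's or any vendor's actual system.

```python
from dataclasses import dataclass, field

# Hypothetical "skills inventory" entry; fields are illustrative only.
@dataclass
class EmployeeRecord:
    name: str
    skills: set = field(default_factory=set)

def match_for_project(required, roster):
    """Rank employees by how many of the required skills they cover."""
    scored = [(len(required & e.skills), e.name) for e in roster]
    # Highest overlap first; drop employees with no matching skills.
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

roster = [
    EmployeeRecord("Ana", {"python", "sql", "ml"}),
    EmployeeRecord("Ben", {"java", "sql"}),
    EmployeeRecord("Cal", {"design"}),
]
print(match_for_project({"python", "sql"}, roster))  # ['Ana', 'Ben']
```

The same inventory that matches people to projects can, as Fuller notes below, just as easily be queried in reverse to decide who a company thinks it can do without.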

Fuller stated, "Suddenly, they are just being used differently because that's where people have a real inventory of skills."

Take, for instance, the company Gloat, an "Artificial Intelligence Talent Marketplace" that uses AI to match employees with relevant projects and vice versa. While acknowledging the need for transparency from HR leaders, Gloat vice president Jeff Schwartz told WaPo that he is unaware of any clients using the product to lay off employees.

Employee performance may be the most visible input to these tools, but they also track murkier metrics, such as "flight risk," which predicts how likely someone is to leave the company.

According to Brian Westfall, an analyst at the software review site Capterra, AI software could inadvertently identify non-white workers as a "flight risk" and recommend firing them at a higher rate if, for instance, a company has a discrimination problem that causes non-white workers to leave at a higher rate on average.


