Beyond Racial Biases, Can AI Be Made Ethical?

The racial profiling and police brutality exposed by the George Floyd incident, and the #BlackLivesMatter protests and unrest that followed, have opened debate on many levels. One of them concerns flaws in Artificial Intelligence that end up creating racial bias in a technology regarded as an instrumental force of the digital age. However, all hope is not lost, especially in the Australian start-up sector.

Enter Akin and Unleash Live, AI-driven companies founded by Liesl Yearsley and Hanno Blankenstein respectively. Akin uses AI to build bots that can converse with humans in a lifelike way, while Unleash Live employs AI for real-time analysis of video footage from security cameras and drones. Both companies were founded with a common mission: to build an excellent ethical AI culture alongside sound business practices. Accordingly, neither company uses personal information to manipulate or 'keep an eye on' the public.

Yearsley had previously sold her AI company, Cognea Artificial Intelligence, to IBM in 2014. Later, after a brief hiatus, she came out of retirement to establish Akin. While building customer service bots at Cognea, she learned about AI's ability to manipulate human behavior towards questionable or unsustainable ends. This happens because, unlike humans, AI bots are programmed to optimize continuously towards a goal.

"AI, it turns out, has a frightening ability to bring about change in human behavior," she says.

"Those customer service bots are used for tasks such as encouraging consumers to take on more credit card debt, were already capable of altering people's behavior by anywhere between 30 percent and 200 percent," she adds.

Meanwhile, Sydney-based Unleash Live has refused to build AI that relies on personal information. Instead, its AI analyzes video feeds from security cameras to help a city decide whether footpaths should be widened, or to detect that crowds of people are fleeing an incident that law enforcement or emergency services should be notified about. The company's selling point is that it never collects or analyzes personal information, so its technology cannot be used by governments to identify individuals.
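The article does not describe how Unleash Live's analytics work internally, but the idea of video analysis that retains no personal data can be illustrated with a short sketch: a stock pedestrian detector produces a head count per frame, only those aggregate counts are kept, and a sudden spike triggers an alert. The OpenCV HOG detector, the spike_ratio threshold, and the sample file name below are all illustrative assumptions, not the company's implementation.

```python
# A minimal illustrative sketch (not Unleash Live's actual product) of
# privacy-preserving crowd analytics: detect pedestrians with a stock
# OpenCV detector, keep only per-frame head counts, and raise an alert
# when the count spikes. No frames or identity data are retained.
import cv2


def monitor_feed(source, spike_ratio=2.0, window=30):
    """Count pedestrians per frame and flag abrupt crowd changes.

    source      -- camera index or video file readable by OpenCV (assumed)
    spike_ratio -- alert when a count exceeds the rolling mean by this factor
    window      -- number of recent counts kept for the rolling mean
    """
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture(source)
    recent_counts = []

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        # Detect people; only the *number* of boxes is kept, never the pixels.
        boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        count = len(boxes)

        if recent_counts:
            mean = sum(recent_counts) / len(recent_counts)
            if mean > 0 and count > spike_ratio * mean:
                print(f"ALERT: crowd size jumped from ~{mean:.0f} to {count}")

        recent_counts.append(count)
        del recent_counts[:-window]  # keep only recent counts; the frame is dropped each loop

    cap.release()


if __name__ == "__main__":
    monitor_feed("street_footage.mp4")  # hypothetical sample clip
```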

In an interview with the Australian Financial Review, Blankenstein said, "Not only were the ethical implications of mass, computer-based surveillance too troubling but, in any event, that market was already saturated with powerful companies that were all too willing to provide the technology to governments and police forces around the world."

Such ethical use of AI is more crucial now than ever. After the outrage over George Floyd's murder in Minneapolis, Minnesota, leaders of technology corporations are reconsidering whether to continue providing services like facial recognition to law enforcement. After such eye-opening incidents, including the recent blunder by Microsoft's editing AI, tech majors like IBM, Amazon, and Microsoft withdrew their AI services for mass surveillance last week. The call for a humanistic, neutral, or at least less biased AI is louder and clearer than ever, and pressure is mounting on governments across the US and other regions to come up with better regulation for AI.

According to Toby Walsh, a professor at the University of NSW, such AI-based surveillance systems are now at risk of becoming 'toxic assets' for the companies that develop and sell them, to the point where many companies will be forced to abandon the technology altogether. He further points out that facial recognition, along with other misuses of surveillance, is going to be a topic that troubles us increasingly.

Australia's Human Rights Commissioner, Edward Santow, firmly believes that Australia can position itself as a supplier of ethically safe technology, especially artificial intelligence, just as it has positioned itself as a supplier of safe food to the world. And with companies like Akin and Unleash Live, that future is not far from becoming a reality.
