In 2018, Argentina's Salta province approved the development, by Microsoft, of creepy algorithms meant to predict which low-income teenagers were likely to become pregnant.
The system, one of the pioneering cases of applying artificial intelligence to personal data, used demographic data including age, ethnicity, disability, country of origin, and whether or not the subject's home had hot water in the bathroom.
At the same time that Argentina's Congress was debating whether to decriminalize abortion, Microsoft offered the province the program, which was celebrated on national television.
The systems generate predictions after analyzing up to 80 variables about each person, grouped into categories such as personal, education, health, employment, housing, and family. Such variables have been found capable of producing undue discrimination, and the systems have also been critiqued for their opacity and their error rates.
While the models use a type of machine learning algorithm called a "two-class boosted decision tree" (which is more interpretable and explainable than many alternatives), both systems still operate as black boxes. That lack of transparency makes it impossible for a citizen or researcher to trace the algorithm's logic and understand why some subjects receive the 'high-risk' label.
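For readers curious what a "two-class boosted decision tree" looks like in practice, the sketch below trains a gradient-boosted ensemble of shallow decision trees on fabricated data. It is a minimal illustration only: the actual Salta system, its training data, and its feature encoding have never been published, and every column name and value here is a hypothetical stand-in.

```python
# Minimal, hypothetical sketch of a two-class (binary) boosted decision tree classifier.
# The real Salta system and its data are not public; everything below is invented.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier  # boosted decision trees
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Fabricated stand-ins for the kinds of demographic variables described above.
X = np.column_stack([
    rng.integers(13, 20, n),   # age
    rng.integers(0, 2, n),     # disability flag (0/1)
    rng.integers(0, 2, n),     # hot water in the home (0/1)
    rng.integers(1, 6, n),     # household size bucket
])
y = rng.integers(0, 2, n)      # random placeholder "high-risk" labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An ensemble of many shallow trees, combined by boosting, for a two-class output.
model = GradientBoostingClassifier(n_estimators=100, max_depth=3)
model.fit(X_train, y_train)

# Any single tree is readable, but the combined ensemble of 100 trees is far
# harder for a citizen or researcher to audit, which is the black-box critique above.
print(model.predict_proba(X_test[:5]))
```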
The systems also have significant error rates: the teenage-pregnancy algorithm has a reported 15% false-positive rate (incorrectly labeling roughly 15% of the people it evaluates as high-risk when they should not receive that label), and the school-dropout algorithm has a 20% false-positive rate. Despite these criticisms, other provinces in Argentina (Tierra del Fuego, La Rioja, Chaco, and Tucumán), as well as a province in Colombia (Guajira), are negotiating with the Salta government to adopt the systems.
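As a point of reference for those figures, the snippet below shows how a false-positive rate is conventionally computed from prediction counts. The numbers are invented, since the evaluation data behind the reported 15% and 20% rates have not been released.

```python
# Illustration of how a false-positive rate is conventionally computed.
# The counts below are invented; the real evaluation data are not public.
false_positives = 150  # people flagged as high-risk who were not actually at risk
true_negatives = 850   # people correctly left unflagged

# FPR = FP / (FP + TN): the share of not-at-risk people who are incorrectly flagged.
fpr = false_positives / (false_positives + true_negatives)
print(f"false-positive rate: {fpr:.0%}")  # prints: false-positive rate: 15%
```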
It's not clear what the provincial or national governments did with the data, or whether it played any role in the abortion debate.
Argentina voted to decriminalize abortion in 2020, but the program's existence should be a cause for concern.
The report should serve as a warning of the potentially dangerous intersection between American tech and authoritarianism and offer a reminder that, for the time being, we have less to fear from the technology than from the humans behind it.